Highlights:

  • Google says the new capabilities will help businesses improve the cost-effectiveness of their big data initiatives, embrace open data ecosystems, and apply AI and machine learning to their existing data, all while enhancing security.

During its third annual Google Data Cloud & AI Summit, Google Cloud revealed several new product developments and partner solutions to accelerate the adoption of its big-data ecosystem.

According to Google, the new capabilities are intended to help businesses optimize the cost-effectiveness of their big data initiatives, embrace open data ecosystems, and bring artificial intelligence and machine learning to their existing data, all while enhancing security.

Most of the updates affect BigQuery, Google’s fully managed, serverless data warehouse. With new pricing editions and features such as autoscaling and compressed storage billing, Google is aiming to help users of the service lower their running costs.

Google executives Gerrit Kazmaier and Andi Gutmans highlighted that, with BigQuery editions, users now have greater freedom to choose the precise feature set their workloads require. Customers can mix and match Standard, Enterprise and Enterprise Plus editions to get the right price/performance for predictable workloads, with single- or multi-year contracts available at discounted rates.

For unpredictable workloads, customers can pay based on the amount of compute they consume. “This is a unique new model… for spiky workloads,” Kazmaier, vice president and general manager of data analytics, said during a press briefing.

Google said users can provision additional capacity in precise increments, so they avoid paying for capacity that sits idle.
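To make the autoscaling model concrete, here is a minimal sketch of how an administrator might create an Enterprise edition reservation with an autoscaling ceiling using the Python BigQuery client. The project and reservation names and the exact DDL option names (edition, slot_capacity, autoscale_max_slots) are assumptions based on BigQuery’s reservation DDL, not details from Google’s announcement.

```python
# Sketch: create a BigQuery Enterprise reservation that autoscales
# between a 100-slot baseline and a 400-slot ceiling.
# Assumes the google-cloud-bigquery client library and reservation DDL;
# project and reservation names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-admin-project")  # hypothetical admin project

ddl = """
CREATE RESERVATION `my-admin-project.region-us.analytics_pool`
OPTIONS (
  edition = 'ENTERPRISE',        -- Standard, Enterprise or Enterprise Plus
  slot_capacity = 100,           -- baseline slots billed continuously
  autoscale_max_slots = 400      -- extra slots added only while needed
);
"""

client.query(ddl, location="US").result()  # run in the admin project's region
print("Reservation created")
```

Because the baseline can be kept low and the ceiling set high, a spiky workload only pays for the additional slots while they are actually in use.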

BigQuery is not the only way for businesses to cut their big data spending, however. Google also argues that companies can improve cost-effectiveness by migrating from expensive legacy databases to its cloud-hosted AlloyDB service. Many organizations, though, cannot move data to the cloud because of regulatory or data sovereignty concerns, and Google says it will soon be able to address those cases as well.

The company unveiled a technical preview of AlloyDB Omni, a downloadable edition of AlloyDB that can run on-premises, at the network edge, on other cloud platforms and even on developer laptops. It offers the standard AlloyDB features, including high performance, PostgreSQL compatibility and Google Cloud support, at a far lower cost than legacy databases. According to Google’s internal tests, AlloyDB is twice as fast as standard PostgreSQL for transactional workloads and up to 100 times faster for analytical queries.
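Because AlloyDB Omni is PostgreSQL-compatible, existing PostgreSQL drivers and tools should work against a local instance unchanged. The sketch below assumes a locally running AlloyDB Omni instance listening on the default PostgreSQL port, with hypothetical credentials, and uses the standard psycopg2 driver to illustrate the idea.

```python
# Sketch: connect to a local AlloyDB Omni instance with an ordinary
# PostgreSQL driver. Host, port, database and credentials are
# hypothetical placeholders for a developer-laptop deployment.
import psycopg2

conn = psycopg2.connect(
    host="localhost",       # AlloyDB Omni running locally, e.g. in a container
    port=5432,              # default PostgreSQL port
    dbname="postgres",
    user="postgres",
    password="example-password",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")  # returns a PostgreSQL-compatible version string
    print(cur.fetchone()[0])

conn.close()
```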

Google also announced a new Database Migration Assessment tool that helps users determine how much work will be required to migrate their workloads to AlloyDB.

Andi Gutmans, vice president and general manager of databases at Google Cloud, said, “We want to meet customers where they need us. We will be the single throat to choke for customers.”

Enhancing data standards

When it comes to business intelligence, companies need to be confident in the reliability of their data. Google introduced a new tool called Looker Modeler, which lets customers define their business metrics using Looker’s semantic modeling layer. The idea is for Looker Modeler to become the authoritative source for those metrics, which can then be shared with any BI tool.
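Looker Modeler’s own interfaces were not detailed at the summit, but the workflow it describes, defining a metric once and consuming it from any tool, can be sketched with the existing Looker Python SDK. The model, explore and field names below are hypothetical, and the call is only meant to illustrate pulling a centrally governed metric into another application.

```python
# Sketch: query a centrally defined metric through the Looker API so every
# downstream tool reports the same number. Model, explore and field names
# are hypothetical; credentials come from looker.ini or environment variables.
import looker_sdk
from looker_sdk import models40 as models

sdk = looker_sdk.init40()  # API 4.0 client configured via looker.ini / env vars

query = models.WriteQuery(
    model="company_metrics",   # hypothetical LookML model
    view="orders",             # hypothetical explore
    fields=["orders.order_count", "orders.total_revenue"],
)

# The semantic layer, not the caller, decides how total_revenue is computed.
result = sdk.run_inline_query(result_format="json", body=query)
print(result)
```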

On the data privacy front, Google unveiled BigQuery data clean rooms, a new capability designed to let organizations securely share and match data with one another. Starting in the third quarter, users will be able to contribute first-party data and combine it with third-party data for marketing purposes.
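Google did not publish the clean room query interface at the summit, but the underlying idea can be illustrated with a plain aggregate-only join in BigQuery: two parties’ tables are matched on a shared key and only aggregates above a minimum group size are returned, so no individual row is exposed. The dataset and table names below are hypothetical, and this is a conceptual sketch rather than the data clean rooms product itself.

```python
# Conceptual sketch of clean-room-style analysis: join two parties' data on a
# shared key and release only sufficiently aggregated results. Dataset and
# table names are hypothetical; this is not the data clean rooms API.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

sql = """
SELECT
  a.campaign_id,
  COUNT(DISTINCT a.user_id) AS matched_users,
  SUM(b.purchase_amount)    AS total_purchases
FROM `advertiser_share.exposures` AS a         -- first-party data
JOIN `retailer_share.transactions` AS b        -- partner's shared data
  ON a.user_id = b.user_id
GROUP BY a.campaign_id
HAVING COUNT(DISTINCT a.user_id) >= 50         -- suppress small, identifying groups
"""

for row in client.query(sql).result():
    print(row.campaign_id, row.matched_users, row.total_purchases)
```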

Google is also expanding its vision for data clean rooms through partnerships with companies such as Habu Inc., which will integrate with BigQuery to support privacy-safe data orchestration, and LiveRamp Holdings Inc., which will enable privacy-centric data collaboration and identity resolution in BigQuery.

Expanding machine learning operations

The final set of updates concerns BigQuery ML, a service that lets users apply machine learning with their existing SQL tools and skills. To bring machine learning even closer to customers’ data, Google is adding features to BigQuery that allow importing models built in frameworks such as PyTorch, hosting remote models on Google’s Vertex AI service, and running pre-trained Vertex AI models.
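As a rough illustration of the remote-model pattern, the sketch below registers a Vertex AI endpoint as a remote model in BigQuery ML and scores a table with ML.PREDICT. The project, dataset, connection and endpoint names are hypothetical, and the exact DDL options may differ from what Google ships.

```python
# Sketch: expose a Vertex AI-hosted model to SQL users as a BigQuery ML
# remote model, then run predictions with ML.PREDICT. All resource names
# (project, dataset, connection, endpoint) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

create_model = """
CREATE OR REPLACE MODEL `my-analytics-project.ml_models.churn_remote`
REMOTE WITH CONNECTION `my-analytics-project.us.vertex_connection`
OPTIONS (
  endpoint = 'https://us-central1-aiplatform.googleapis.com/v1/projects/my-analytics-project/locations/us-central1/endpoints/1234567890'
);
"""
client.query(create_model).result()

predict = """
SELECT *
FROM ML.PREDICT(
  MODEL `my-analytics-project.ml_models.churn_remote`,
  TABLE `my-analytics-project.crm.customers`
)
"""
for row in client.query(predict).result():
    print(row)
```

The benefit of this arrangement is that analysts never leave SQL: the model runs wherever Vertex AI hosts it, while BigQuery handles moving the rows to and from the endpoint.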

Google also introduced several machine learning-focused BigQuery integrations. For example, BigQuery now integrates with DataRobot Inc.’s machine learning operations platform, allowing customers to build ML models in notebook environments and publish their predictions back into BigQuery. Google said the goal is to help developers experiment with machine learning more quickly.

A second partnership involves Neo4j Inc., developer of the Neo4j graph database management system; Google said customers can now augment SQL analysis with graph-native data science and machine learning. Google is also integrating BigQuery, LookML and Google Sheets with ThoughtSpot Inc.’s BI analytics platform to enable AI-powered natural language search and make it easier for users to extract insights from corporate data.