Vertex AI, the Google Cloud development platform that enables companies to build services using machine learning and Google’s large language models, is getting new features to help prevent apps and services from serving up inaccurate information. After making its Grounding with Google Search feature – which allows models to retrieve live information from the internet – generally available in May, Google has now announced that customers will also have the option to improve their services’ AI results with specialized third-party data sets.
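For context, here is a minimal sketch of how grounding with Google Search is typically switched on through the Vertex AI Python SDK; the exact module paths and model name can vary between SDK versions, and the project and location values are placeholders rather than anything from Google’s announcement.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholder project and region -- substitute your own values.
vertexai.init(project="my-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")

# Attach the Google Search grounding tool so responses can draw on live web results.
search_tool = Tool.from_google_search_retrieval(grounding.GoogleSearchRetrieval())

response = model.generate_content(
    "What did Google announce about Vertex AI grounding this quarter?",
    tools=[search_tool],
)
print(response.text)
```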
Google says the service will use data from providers like Moody’s, MSCI, Thomson Reuters, and ZoomInfo, and that grounding with third-party datasets will be available in the “third quarter of this year.” This is one of several new features Google is developing to encourage organizations to adopt its “enterprise-ready” generative AI experiences by reducing the frequency with which models produce misleading or inaccurate information.
Another is “high fidelity mode,” which lets organizations derive insights from results grounded solely in their own enterprise data sets rather than Gemini’s broader knowledge bank. High fidelity mode is powered by a specialized version of Gemini 1.5 Flash and is now available in preview via Vertex AI’s experimental tools.
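Google hasn’t detailed high fidelity mode’s API here, but the underlying idea – answering only from supplied enterprise context rather than the model’s general knowledge – can be approximated with an ordinary grounded prompt. The sketch below is illustrative only; the document text, instruction wording, and model name are assumptions, not the actual high fidelity mode interface.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # placeholders

# Hypothetical enterprise context -- in practice this would come from your own data store.
enterprise_docs = """
Q2 revenue for the Acme widgets division was $12.4M, up 8% year over year.
"""

# Instruct the model to answer strictly from the supplied context,
# approximating the context-only behavior high fidelity mode is described as providing.
model = GenerativeModel(
    "gemini-1.5-flash",
    system_instruction=[
        "Answer using only the provided documents.",
        "If the answer is not in the documents, say you don't know.",
    ],
)

response = model.generate_content(
    f"Documents:\n{enterprise_docs}\n\nQuestion: How did the widgets division perform in Q2?"
)
print(response.text)
```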
Vector Search, which allows users to find images by referencing visually similar graphics, is also being expanded to support hybrid search. The update, available in public preview, allows vector-based semantic searches to be combined with text-based keyword searches to improve accuracy; a generic sketch of the technique follows below. Grounding with Google Search will also soon offer a “dynamic retrieval” feature that automatically decides whether a prompt should be answered from Gemini’s established training data or grounded with Google Search, for requests that may require frequently updated information.
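To make the hybrid approach concrete, here is a small, self-contained illustration of blending a vector-similarity score with a keyword-match score. It is a generic sketch of hybrid ranking, not the Vertex AI Vector Search API; the toy embeddings, the weighting parameter `alpha`, and the scoring functions are all assumptions for illustration.

```python
import math

def cosine(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def keyword_score(query, text):
    # Fraction of query terms present in the document (a crude stand-in for BM25).
    terms = query.lower().split()
    return sum(term in text.lower() for term in terms) / len(terms)

def hybrid_score(query, query_vec, doc, alpha=0.5):
    # Blend semantic (vector) similarity with keyword relevance.
    return alpha * cosine(query_vec, doc["embedding"]) + (1 - alpha) * keyword_score(query, doc["text"])

# Toy corpus with made-up 3-dimensional embeddings.
docs = [
    {"text": "Quarterly revenue report for the widgets division", "embedding": [0.9, 0.1, 0.0]},
    {"text": "Employee handbook and vacation policy", "embedding": [0.1, 0.8, 0.2]},
]

query = "widgets revenue"
query_vec = [0.85, 0.15, 0.05]  # Assumed to come from the same embedding model as the docs.

ranked = sorted(docs, key=lambda d: hybrid_score(query, query_vec, d, alpha=0.6), reverse=True)
print(ranked[0]["text"])
```

The design point is simply that keyword matching catches exact terms (product names, IDs) that pure embedding similarity can miss, which is the accuracy gain Google is pitching for the hybrid mode.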