UPDATED 09:00 EDT / MARCH 26 2024


Vectara adds anti-hallucination features to its generative AI service

Vectara Inc., the developer of a platform that organizations can use to train generative artificial intelligence models on their own data, today said it has added a factual consistency score to all responses generated by its service to assess their accuracy.

The company offers retrieval-augmented generation, or RAG, as a service. RAG is a method used in natural language processing that enhances the quality, relevance and accuracy of generated text by first retrieving relevant information from a large dataset or knowledge base and using it to inform the generation process. Organizations can use RAG to refine a model to deliver responses based on their own documents and databases rather than public information.
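The retrieve-then-generate pattern described above can be sketched in a few lines. This is a minimal illustration, not Vectara's implementation: the keyword-overlap ranking stands in for a real vector index, and the prompt template is hypothetical.

```python
# Minimal sketch of the RAG pattern: retrieve relevant passages first,
# then ground the model's prompt in them. Purely illustrative; a real
# deployment would use embeddings, a vector index and an actual LLM call.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, context: list[str]) -> str:
    """Constrain generation to the retrieved passages."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

# Hypothetical private documents an organization might index.
docs = [
    "Vectara offers retrieval-augmented generation as a service.",
    "HHEM scores factual consistency between 0 and 1.",
    "Unrelated note about office snacks.",
]
query = "What does HHEM score?"
prompt = build_prompt(query, retrieve(query, docs))
```

Because the prompt is built only from the top-ranked passages, the model answers from the organization's own data rather than whatever it memorized during pretraining.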

The company is addressing one of the biggest gating factors to broader enterprise adoption of generative AI. Numerous studies have shown that fear of bias, inaccuracy and contextually irrelevant responses, known as hallucinations, is the biggest reason companies haven't deployed externally facing generative AI models more broadly. Vectara's published calculations estimate hallucination rates of 3% to 16.2%, depending on the large language model used.

Vectara mitigates this risk for enterprises by providing a score that grades the likelihood that a generated response is a hallucination. The score is based on an evolved version of the Hughes Hallucination Evaluation Model, which the company developed.

HHEM compares the original source document and the summary generated by the LLM and assigns a score between 0 and 1, where 0 indicates a complete hallucination and 1 represents perfect factual consistency.

The score is calibrated as a direct probability: a value of 0.98, for example, indicates a 98% probability of factual consistency. The company said most other classifiers disregard calibration. Users can set acceptance thresholds based on the score, giving them flexibility in deciding when to act on a response.
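The thresholding workflow the article describes might look like the following sketch. The threshold values and routing labels are illustrative assumptions, not part of Vectara's API; the point is that a calibrated 0-to-1 score lets each deployment pick its own cutoffs.

```python
# Hedged sketch of gating generated answers on a factual consistency
# score (FCS) between 0 and 1, where 1 means fully consistent with the
# source. The 0.9 / 0.5 thresholds below are hypothetical defaults.

def gate_response(fcs: float, accept_at: float = 0.9, review_at: float = 0.5) -> str:
    """Route a generated answer based on its factual consistency score."""
    if fcs >= accept_at:
        return "accept"           # high probability the answer is grounded
    if fcs >= review_at:
        return "flag_for_review"  # borderline: surface with a warning
    return "reject"               # likely hallucination: suppress the answer

print(gate_response(0.98))  # accept
print(gate_response(0.70))  # flag_for_review
print(gate_response(0.20))  # reject
```

Because the score is calibrated, a stricter application (say, legal or medical search) can simply raise `accept_at` without retraining anything.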

Vectara said its HHEM is the No. 1 hallucination detection model on Hugging Face, with more than 100,000 downloads in five months.

