DataRobot announces new applied generative AI offering for building trustworthy AI apps
Artificial intelligence startup DataRobot Inc. today announced a new generative AI offering designed to allow businesses and developers to integrate their data into AI models and gain confidence in the answers that they receive.
DataRobot provides a full-lifecycle AI platform for generative and predictive AI models and a broad ecosystem of tools for applied AI experts to experiment, build, deploy, monitor and govern enterprise-grade applications that use artificial intelligence.
This new offering extends the company’s DataRobot AI Platform to provide a host of new capabilities for enterprise customers who are looking to take advantage of the explosive popularity of generative AI large language models such as OpenAI’s GPT-4. These enterprise-grade LLMs allow businesses to do research using their own internal data and records, quickly summarize large documents and receive answers using natural conversational language.
When businesses use AI models, they want to be able to draw on their own knowledge bases at scale, and DataRobot says it provides an easy way to do that. At the same time, they want to be sure the answers they receive are correct, so DataRobot provides increased observability and monitoring for end users.
“For our customers, a major pain point of getting these offerings into the market and using them was that everyone rushed to get them out there,” Jay Schuren, chief customer officer at DataRobot, told SiliconANGLE in an interview. “So, people start building prototypes and then immediately they get to a point of asking: ‘Is that good?’”
That can be a real challenge. Anyone who has used OpenAI LP’s AI chatbot ChatGPT has probably experienced its tendency to “hallucinate,” or produce entirely inaccurate and counterfactual results with extreme confidence. This is unlikely to happen on a trivial question such as “How do you make apple pie?” but it can happen with a complex question or one about a recent event, because the model doesn’t have the answer.
Schuren said that DataRobot added a confidence value to every reply that the model produces so that customers can have an idea about the answer that they’re receiving. Aside from potentially providing references, DataRobot can also give a percentage score showing a confidence value to the end user. The end users can also vote on how accurate they believe the answer was to refine the probability.
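DataRobot hasn’t published how its scoring works, but the idea of blending a model’s own confidence with accumulated “useful: yes/no” votes can be sketched with a simple Beta-distribution update. Everything here, including the function name and the `prior_weight` parameter, is an illustrative assumption, not DataRobot’s implementation:

```python
# Illustrative sketch only (not DataRobot's actual method): refine a
# model-reported confidence score with user up/down votes using a
# Beta-Bernoulli update, so feedback gradually shifts the displayed score.

def refined_confidence(base_confidence, upvotes, downvotes, prior_weight=10):
    """Blend the model's own confidence with end-user feedback.

    base_confidence: model-reported probability in [0, 1]
    upvotes/downvotes: counts of 'useful: yes/no' votes from users
    prior_weight: how many votes the model's own estimate counts as
    """
    # Treat the base confidence as a Beta prior worth `prior_weight`
    # pseudo-votes, then add the observed votes as evidence.
    alpha = base_confidence * prior_weight + upvotes
    beta = (1 - base_confidence) * prior_weight + downvotes
    return alpha / (alpha + beta)
```

With no votes the score stays at the model’s estimate; as votes accumulate, they dominate the prior, which matches the described goal of letting end users refine the probability over time.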
“Imagine we have a bunch of bankers who have to read federal filings like 10-Ks,” Schuren said. “As they’re chatting and asking questions, we can take those 10-Ks, read them in and create one of these solutions. Now there’s a chatbot to extract information about 10-Ks and then we can capture, ‘Is it useful: yes or no.’ As they get their questions, this assigns a probability score. If it’s good, that gives people confidence to start to go from ‘only the experts are using it’ to now the whole analyst team can have expert quality guidance on whether an answer is sound.”
DataRobot also provides generative AI-specific management and governance capabilities such as guardrails for preventing toxicity, maintaining tone, ensuring the model stays “on-topic” and more based on customer needs.
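A guardrail layer of this kind typically sits between the model and the user, vetoing responses that fail policy checks. The sketch below is hypothetical: the term list, topic list, and function names are invented for illustration (production systems would use classifiers rather than keyword matching), and none of it reflects DataRobot’s internals:

```python
# Hypothetical guardrail sketch: each check inspects a model response
# before it reaches the user. Lists and rules are stand-ins for real
# toxicity and topic classifiers.

BLOCKED_TERMS = {"offensive_word"}                 # stand-in toxicity list
ALLOWED_TOPICS = {"banking", "10-k", "filings"}    # stand-in topic whitelist

def passes_toxicity(text: str) -> bool:
    # Real systems would score toxicity with a model; substring matching
    # keeps this example self-contained.
    return not any(term in text.lower() for term in BLOCKED_TERMS)

def stays_on_topic(text: str) -> bool:
    return any(topic in text.lower() for topic in ALLOWED_TOPICS)

def apply_guardrails(response: str) -> str:
    # Checks run in order; the first failure withholds the response.
    if not passes_toxicity(response):
        return "Response withheld: content policy."
    if not stays_on_topic(response):
        return "Response withheld: out of scope for this assistant."
    return response
```

Ordering the checks this way means a toxic off-topic reply is reported as a policy violation first, which is one reasonable design choice among several.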
Customers in highly regulated industries, such as healthcare, are also concerned that any data sent to an AI model might become incorporated into its training data. One of the services DataRobot provides is a framework that allows businesses to use their proprietary data as a source for summarization and answers without violating laws or regulations such as the Health Insurance Portability and Accountability Act.
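The standard pattern for this is retrieval-augmented generation: proprietary documents are searched at query time and injected into the prompt, so they inform answers without ever entering the model’s weights. The sketch below is a toy version under that assumption; the keyword-overlap scoring stands in for the vector search a real deployment would use, and the function names are invented:

```python
# Illustrative retrieval-augmented generation (RAG) sketch: documents
# stay in the business's own store and are only pasted into the prompt
# at query time, never used for training. Keyword overlap is a toy
# stand-in for embedding-based vector search.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by how many query words they share, keep the top k.
    q_terms = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_terms & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    # Constrain the model to the retrieved context to reduce hallucination.
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The resulting prompt would then be sent to an LLM such as GPT-4; because the documents travel only in the prompt, they can be governed like any other API payload.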
Baptist Health South Florida has been using DataRobot’s offering to give healthcare workers summaries of and answers about patient records through GPT-4 without integrating the regulated information into the AI’s training data. With the platform, healthcare workers can ask questions such as what meal is safe to provide a patient given their medications and medical history.
“The generative AI space is changing quickly, and the flexibility, safety and security of DataRobot helps us stay on the cutting edge with a HIPAA-compliant environment we trust to uphold critical health data protection standards,” said Rosalia Tungaraza, assistant vice president of artificial intelligence at Baptist Health South Florida. “We’re harnessing innovation for real-world applications, giving us the ability to transform patient care and improve operations and efficiency with confidence.”
Schuren said that having the ability to use enterprise data with these tools is extremely important, but what’s been holding business teams back is confidence in the models themselves.
“People start building and then they finally hit the question of ‘Is this good enough for my business, and can I make a decision based on this?’ And that’s where teams really slow down,” said Schuren. “They’re excited to adopt; they love the prototype and then they hit the quality part. And that’s where the observability and being able to capture the feedback becomes important.”
Image: DataRobot