Arthur helps tackle the pain point of data scientists babysitting AI models
Breakthroughs in artificial intelligence, such as ChatGPT, are proving to be major inflection points for the field.
Using automation, real-time optimization and metrics, ArthurAI Inc. streamlines the monitoring of artificial intelligence models so that practitioners such as data scientists aren't bogged down by manual checks, according to Adam Wenchel (pictured), chief executive officer of Arthur.
“Traditionally, data scientists would spend 25, 30% of their time just manually checking in on their model, day-to-day babysitting as we call it, just to make sure that the data hasn’t drifted, the model performance hasn’t degraded, that a programmer didn’t make a change in an upstream data system,” Wenchel stated. “What we do is bring the same kind of automation that you have for other kinds of, let’s say, infrastructure monitoring, application monitoring, we bring that to your AI systems.”
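The kind of check Wenchel describes automating is easy to picture. Here is a minimal sketch, assuming a generic two-sample drift test in Python with SciPy and synthetic stand-in data, rather than Arthur's own tooling:

```python
import numpy as np
from scipy.stats import ks_2samp

def check_drift(reference: np.ndarray, live: np.ndarray, alpha: float = 0.05) -> bool:
    """Flag drift when the live feature distribution diverges from the reference."""
    statistic, p_value = ks_2samp(reference, live)
    return p_value < alpha

# Stand-in data: training-time feature values vs. recent production values.
reference = np.random.normal(0.0, 1.0, 10_000)
live = np.random.normal(0.4, 1.0, 1_000)

if check_drift(reference, live):
    print("Feature drift detected -- alert the owning team")
```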
Wenchel spoke with theCUBE industry analyst John Furrier for a CUBE conversation ahead of the AWS Startup Showcase: “Top Startups Building Generative AI on AWS” event. They discussed how Arthur is making the artificial intelligence space seamless. (* Disclosure below.)
The AI-native way will be the new norm
Going AI-native will become the norm in the near future, according to Wenchel, because artificial intelligence will underpin the key areas that collectively drive profit and loss in enterprises.
“The thing I’m really excited about for the next couple years now … is a sort of convergence of AI and AI systems … turning into AI-native businesses,” he stated. “One of the things that we work a lot with our customers is to just understand, you know, take these really esoteric data science notions and performance and tie them to all their business KPIs … it’s kind of like the operating system for running your AI-native business.”
Through performance analysis, Arthur plays an instrumental role in showing what is actually happening inside AI models, which helps teams avoid common pitfalls, Wenchel believes.
“I mean, there’s all sorts of ways to access LLMs, either via API access or downloadable in some cases,” he said. “Then our secret sauce really is the way that we provide that performance analysis of what’s going on. We can tell you in a very actionable way, like, ‘Hey, here’s where your model is doing good things; here’s where it’s doing bad things.’”
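That kind of slice-level feedback can be approximated with everyday tooling. Here is a minimal sketch, assuming a hypothetical evaluation log in pandas rather than Arthur's actual analysis:

```python
import pandas as pd

# Hypothetical evaluation log: one row per prediction, tagged with a business segment.
results = pd.DataFrame({
    "segment": ["billing", "billing", "support", "support", "support"],
    "correct": [1, 0, 1, 1, 0],
})

# Accuracy per segment highlights where the model does well and where it struggles.
by_segment = results.groupby("segment")["correct"].mean().sort_values()
print(by_segment)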
LLMs are taking the world by storm
Because large language models can handle generative tasks, Wenchel believes they are proving revolutionary, and real-world applications are already being built on top of them.
“An LLM, or large language model, is a language model that’s been trained on a huge amount of data, typically pulled from the internet,” Wenchel noted. “It’s a general-purpose language model that can be built on top of for all sorts of different things, including traditional NLP tasks like document classification and sentiment understanding. These language models can be applied in so many different business contexts, and the amount of value that’s being created is, again, unprecedented compared to anything.”
As an operations-focused platform, Arthur triggers alerts as problems pop up, which makes resolution easier, according to Wenchel.
“If you’re familiar with like the way people run security operations centers or network operations centers, we do that for data science,” he pointed out. “So think of it as a DSOC, a Data Science Operations Center, where all your models, you might have hundreds of models running across your organization, you may have five, but as problems are detected, alerts can be fired and you can actually work the case.”
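The DSOC idea boils down to continuously comparing model health metrics against thresholds and firing alerts when they slip. Here is a minimal sketch of that loop, using hypothetical model names and metrics rather than Arthur's product:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("dsoc")

# Hypothetical per-model health metrics gathered by a monitoring job.
model_health = {
    "churn-model": {"accuracy": 0.91, "threshold": 0.85},
    "fraud-model": {"accuracy": 0.78, "threshold": 0.90},
}

# Fire an alert (here, just a log line) for any model below its threshold,
# so an on-call analyst can "work the case" as Wenchel describes.
for name, metrics in model_health.items():
    if metrics["accuracy"] < metrics["threshold"]:
        logger.warning("ALERT: %s accuracy %.2f below threshold %.2f",
                       name, metrics["accuracy"], metrics["threshold"])
```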
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s pre-event coverage of the AWS Startup Showcase: “Top Startups Building Generative AI on AWS” event:
(* Disclosure: ArthurAI Inc. sponsored this segment of theCUBE. Neither Arthur nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE