UPDATED 19:50 EST / AUGUST 20 2019


Company profile: H2O.ai sprinkles in new risk mitigation features

Artificial intelligence has become the nerve center of most cloud applications. However, AI has also become a major risk factor in many modern enterprises.

AI’s risks may stem from design limitations in a specific buildout of the technology, such as when a machine learning model incorporates a biased feature set. Other risks may be due to inadequate lifecycle governance practices, such as failure to retrain decaying machine learning models on fresh data.

Enterprises require strong AI DevOps tooling and practices to mitigate these risks, and that’s where solution providers such as H2O.ai enter the picture. Founded in 2012, H2O.ai has become a well-respected niche vendor of data science workbench solutions. It offers a deep stack of open-source tools and libraries for most AI application development scenarios.

Through partnerships with the leading public cloud providers, H2O.ai has put its tooling in the hands of enterprises of all sizes and industries worldwide. In addition, H2O.ai has strong integration with Nvidia graphics processing units to ensure high-performance training and inferencing in diverse development and production environments.

One of H2O.ai’s key competitive differentiators is its keen focus on AI developer productivity. The vendor has focused on providing user-friendly tooling that business analysts and other subject matter experts can use to build and optimize AI without the need for traditional data scientists. Its Driverless AI solution automates the end-to-end workflow of building, training, deploying, optimizing, and governing ML models.

Also, in its latest release, H2O.ai has added several features to help AI DevOps teams mitigate some business risks associated with machine learning. The key new features in this regard are the ability to:

  • Analyze whether a model produces disparate adverse outcomes for various demographic groups, even if it wasn’t designed with that outcome in mind;
  • Automate monitoring of deployed models for predictive decay;
  • Benchmark alternative models for A/B testing; and
  • Alert system administrators when models need to be recalibrated, retrained, or otherwise maintained to keep them production-ready.
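To make the first and last of these checks concrete, here is a minimal sketch in plain Python — not H2O.ai’s actual API — of a disparate-impact screen (the common “four-fifths rule,” which flags a model when the lowest group selection rate falls below 80% of the highest) and a simple predictive-decay alert. The group names and accuracy figures are hypothetical.

```python
def disparate_impact_ratio(selection_rates):
    """Ratio of the lowest to the highest group selection rate.

    Values below 0.8 fail the common "four-fifths rule," a rough
    screen for disparate adverse outcomes across demographic groups.
    """
    rates = list(selection_rates.values())
    return min(rates) / max(rates)


def needs_retraining(baseline_accuracy, current_accuracy, tolerance=0.05):
    """Flag a deployed model whose accuracy has decayed past a tolerance."""
    return (baseline_accuracy - current_accuracy) > tolerance


# Hypothetical per-group selection rates for one deployed model.
rates = {"group_a": 0.60, "group_b": 0.45}
print(round(disparate_impact_ratio(rates), 2))  # 0.75 -> fails the four-fifths rule
print(needs_retraining(0.92, 0.84))             # True -> alert administrators
```

Production tooling such as Driverless AI layers automation, monitoring, and alerting on top of checks like these; the sketch only shows the underlying arithmetic.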

Ensuring that AI DevOps teams work seamlessly together is a big part of the risk-mitigation picture, and H2O.ai also has strong capabilities in this regard. In this latest release, it has established a shared DevOps workspace where data scientists and enterprise IT can collaborate on different projects, build and manage models, and scalably deploy the models to diverse production environments.

The VC community holds a high opinion of H2O.ai’s growth prospects. This explains why, in its latest funding round, announced this week, the vendor has almost doubled its capitalization.

Giving investors ample reason for confidence, H2O.ai now reports that it has tripled its enterprise customer base since its last funding round just 21 months ago in November 2017. It now has more than 5,000 customers worldwide, as well as an extensive range of partners among systems integrators, resellers, technology vendors, and cloud services providers. It boasts a growing customer base in every industry on every continent. And it has expanded its global footprint by establishing AI centers of excellence in the Czech Republic and India.

If there are any dark clouds on H2O.ai’s horizon, they likely come from a few strategic directions:

For starters, H2O.ai competes in an increasingly crowded niche of data-science DevOps solution providers, many of which are also startups backed by a wide range of VCs.

Also, H2O.ai’s principal cloud partners (AWS, Microsoft and Google Cloud Platform) have been investing heavily in building out their respective AI DevOps tooling and may grow less inclined over time to partner with a startup competitor.

Furthermore, H2O.ai’s strategic bet that demand from so-called “citizen data scientists” will sustain its growth may fizzle if business analysts and other non-traditional developers don’t warm up to its tooling.

And, though it’s not yet a competitive showstopper here in mid-2019, the AI DevOps market is starting to shift toward solutions geared to edge, robotics, and other use cases that rely on reinforcement learning, which is not an H2O.ai strong suit.

Nevertheless, Wikibon remains quite positive on H2O.ai, which is firmly entrenched as a go-to niche vendor of robust enterprise AI DevOps tooling. My colleague John Furrier talked to company founder and Chief Executive Sri Satish Ambati in a recent interview on theCUBE, SiliconANGLE’s video studio:

Photo: SiliconANGLE
