As enterprises adopt AI, open-source leaders worry about regulation and proprietary lock-in
The open-source community has a message for regulators and the tech industry on artificial intelligence: Keep governance neutral and the technology open.
This was one of the key talking points that emerged from the AI.dev + Cassandra Summit hosted by The Linux Foundation in San Jose this week as open-source contributors and tech company executives gathered to assess the current state of enterprise AI.
“Just like the internet, generative AI was built on open technology,” Jim Zemlin, executive director of The Linux Foundation, said in his keynote remarks at the conference on Tuesday. “Unfortunately, we are starting to see a little bit of a trend moving away from open AI toward more closed foundation models.”
Zemlin backed his concerns with Linux Foundation survey results showing that 41% of respondents would prefer open-source generative AI technologies, versus only 9% who preferred proprietary solutions. A significant majority, 95%, also supported neutrality as a key aspect of generative AI governance.
“Calls for restricting open AI are futile at best,” Zemlin said.
The tech community’s preference for an open model and neutral governance structure has not stopped government regulators from weighing in on the role of open AI.
Following months of debate, European Union regulators recently signaled that open-source models could be exempted under the region’s landmark Artificial Intelligence Act. In the U.S., an executive order from the White House delegated a decision about the use of open-source AI models to the National Telecommunications and Information Administration or NTIA. The agency’s administrator indicated a willingness this week to hear all viewpoints on the subject before making recommendations to the White House by July.
New models and projects
In the meantime, activity within the open-source community to create new tools and solutions for deployment of generative AI is continuing at a brisk pace. Conference attendees heard from executives from a number of key companies that will have significant involvement in the technology’s direction.
Ankit Patel, senior director of developer marketing at chipmaker Nvidia Corp., noted that his firm is currently involved in more than 500 open-source projects and just added the Mixtral 8x7B Instruct large language model to its catalog. The model is based on a recent release from Mistral AI, a European competitor to OpenAI, the maker of ChatGPT.
Nvidia has also devoted its resources to the deployment of Megatron-LM, a deep learning architecture developed by the company’s research team. “Lots of the community has been able to learn and share from the things we put into Megatron-LM,” Patel said during his keynote appearance on Wednesday. “We have lots of contributors, we work with everybody in the community.”
Participation in open AI projects can be found among the cloud giants as well. Project Jupyter, an open-source initiative that builds tools for interactive computing and data science, has been actively supported by Amazon Web Services Inc., a key contributor to the project. In May, AWS announced two generative AI extensions for Jupyter aimed at democratizing AI and scaling up machine learning workloads.
“We have a team that works full time on open source and Jupyter is a particular area of focus,” said Brian Granger, senior principal technologist for AI platforms at AWS. “It’s collaborative and very configurable and extensible.”
Building data architecture
The cloud-native world is also influencing the next wave of AI innovation through database technology. Astra DB, a cloud database built on top of open-source Apache Cassandra, is being positioned by DataStax Inc. as a flexible data architecture that can adapt to businesses’ rapidly changing AI needs.
“You have to focus on developer experience because it’s about getting apps to production quickly,” said Chet Kapoor, chairman and chief executive officer of DataStax, in his keynote remarks on Tuesday. “Relevance is a new metric to think about in an app.”
One of the more significant companies to emerge in the open-source AI space over the past year is Hugging Face Inc., which hosts more than 500,000 AI models and reports serving more than 1 million model downloads per day. It has also been building its own ecosystem of industry partnerships, including recent alliances with Dell Technologies Inc., Advanced Micro Devices Inc., Nvidia, ServiceNow Inc. and AWS.
Hugging Face’s role in making powerful base models available for AI throughout the open-source community has helped fuel continued adoption of the technology. “The open-source community is amazing and nothing is catching up to its progress,” said Hugging Face executive Jeff Boudier. “When you build with open source, you build with technology that’s future-proofed.”
The future for open source can be seen in the library of more than 170 projects that the Cloud Native Computing Foundation is currently shepherding. These are not all AI projects, yet the sheer volume of activity in the generative AI world will likely draw many of them into the AI orbit.
“As projects age in their lifecycles, there are different needs, it’s not a static landscape,” Jorge Castro (pictured), senior director of developer experience at the CNCF, said in an exclusive interview with SiliconANGLE. “Now AI has been introduced into the environment. Every project is going to try and figure out how they fit into that environment.”
Photo: Mark Albertson/SiliconANGLE