UPDATED 09:00 EDT / JUNE 24 2024

SECURITY

Ollama addresses remote execution flaw following Wiz discovery

As generative artificial intelligence continues to grow in popularity and become mainstream, so do security issues surrounding large language models and their support services.

A new report today from Wiz Inc. details one such vulnerability discovered in Ollama, the open-source infrastructure project designed to simplify the packaging and deployment of AI models. However, in a welcome twist, those behind the project responded promptly to address it.

Inspired by Docker, Ollama is used by developers, data scientists and organizations looking to package, deploy and run AI models efficiently. It's popular among open-source communities and enterprises that leverage AI for applications ranging from research and development to production environments.

However, as detailed in Wiz’s report, Ollama was found to have a remote code execution vulnerability, designated CVE-2024-37032. Dubbed “Probllama,” the vulnerability allows an attacker to send specially crafted HTTP requests to an Ollama application programming interface server.

The Probllama vulnerability operates through a mechanism known as path traversal, which exploits insufficient input validation in the API endpoint “/api/pull.” By serving a malicious manifest file whose digest field contains a path traversal payload, an attacker can manipulate the server into overwriting arbitrary files on the system.
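The core of this class of bug is joining an attacker-controlled digest into a filesystem path without validating it first. The sketch below is hypothetical Python, not Ollama's actual Go code; the directory name and helper functions are assumptions made for illustration:

```python
import os
import re

MODELS_DIR = "/var/lib/ollama/models/blobs"  # hypothetical storage root

def blob_path_naive(digest: str) -> str:
    # Vulnerable pattern: the attacker-supplied digest is joined
    # directly into a filesystem path. A digest stuffed with "../"
    # sequences escapes the blobs directory entirely.
    return os.path.join(MODELS_DIR, digest)

DIGEST_RE = re.compile(r"^sha256:[0-9a-f]{64}$")

def blob_path_safe(digest: str) -> str:
    # Safer pattern: reject anything that is not a well-formed
    # sha256 digest before it ever touches a path.
    if not DIGEST_RE.fullmatch(digest):
        raise ValueError(f"invalid digest: {digest!r}")
    return os.path.join(MODELS_DIR, digest)

# Under the naive scheme, a traversal payload resolves to a
# sensitive system file:
evil = "../../../../../etc/ld.so.preload"
print(os.path.normpath(blob_path_naive(evil)))  # -> /etc/ld.so.preload
```

Validating the digest's format up front, rather than trying to sanitize path fragments after the fact, closes the traversal at the point where untrusted input enters the system.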

In Docker deployments, where the server runs with root privileges, the vulnerability can be exploited to gain full remote code execution. By corrupting crucial system files, such as “/etc/ld.so.preload,” attackers are able to place malicious code that gets executed whenever a new process starts, giving them control over the server and the ability to compromise the AI models and applications hosted on it.
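A common defense-in-depth measure against this kind of file overwrite is to resolve the final path and confirm it still lives under the intended base directory. A minimal sketch, using a hypothetical helper rather than anything from Ollama itself:

```python
import os

def resolve_under(base: str, untrusted: str) -> str:
    # Resolve the candidate path and confirm it remains inside the
    # base directory, so "../" sequences cannot reach system files
    # such as /etc/ld.so.preload.
    real_base = os.path.realpath(base)
    candidate = os.path.realpath(os.path.join(base, untrusted))
    if os.path.commonpath([candidate, real_base]) != real_base:
        raise PermissionError(f"path escapes {base}: {untrusted!r}")
    return candidate
```

Running the server as a non-root user, as Wiz's researchers note about hardened deployments, limits the blast radius even when such a check is missing: a compromised process can no longer write to root-owned files like “/etc/ld.so.preload.”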

Wiz’s researchers found that many Ollama instances with the vulnerability were exposed to the internet, posing a significant security risk. Fortunately, though, the Ollama team’s response was highly impressive.

Ollama committed a fix around four hours after Wiz reported the vulnerability on May 5, and the patched release followed three days later, on May 8. At this point, big tech companies should be taking notes.

Though the fix has been out for over a month and a half, Wiz’s researchers are advising security teams to make sure they’re running patched versions of Ollama — those released May 8 or later — to protect against the vulnerability.
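Per Wiz's disclosure timeline, the May 8 release corresponds to Ollama 0.1.34. Assuming that cutoff, a simple version gate for an inventory script might look like the following (the helper names are hypothetical):

```python
def parse_version(v: str) -> tuple:
    # Split a version string like "0.1.34" into (0, 1, 34)
    # so releases compare numerically, not lexically.
    return tuple(int(part) for part in v.lstrip("v").split("."))

# Assumption: 0.1.34 is the first release containing the fix,
# per Wiz's disclosure timeline.
PATCHED = parse_version("0.1.34")

def is_patched(installed: str) -> bool:
    # True when the installed release is 0.1.34 or later.
    return parse_version(installed) >= PATCHED
```

Numeric comparison matters here: a string comparison would wrongly rank "0.1.9" above "0.1.34".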

Image: Ollama
