UPDATED 16:16 EST / FEBRUARY 11 2026

INFRA

Photonic AI chip startup Olix nabs $220M investment

Olix Computing Ltd., a startup developing an artificial intelligence chip with integrated optical components, has raised $220 million in funding.

The Financial Times reported today that the investment was led by Belgian venture capital firm Hummingbird Ventures. The deal values U.K.-based Olix Computing at over $1 billion. The company previously raised an undisclosed amount of funding from Plural, Vertex Ventures, LocalGlobe and Entrepreneurs First.

Olix’s chip is optimized for inference, the task of running AI models in production after training is complete. It’s unclear what optical components are included in the processor or how they’re used. However, a blog post on the company’s website states that its chip features a “novel memory and interconnect architecture.” That suggests Olix is using the photonic components to power an interconnect, the part of a processor responsible for moving data between circuits.

Several other startups are developing photonic interconnects. One of the best-funded players, Ayar Labs Inc., has developed an optical interposer that can support chips with a surface area of 40 square centimeters. That’s more than double the size of Nvidia Corp.’s Blackwell B200 graphics card.

Optical interconnects’ value proposition is that light can carry data faster and more efficiently than the electrical signals today’s chips use to move information. As a result, the technology can theoretically provide significantly higher throughput while consuming less power.

Olix says its processor is designed to address a technical challenge called the memory wall. It relates to high-bandwidth memory, or HBM, the external memory that AI chips use to store data.

A graphics card carries out calculations by loading data from HBM into its onboard cache and then saving the results back to the HBM module. The speed at which the chip can move data to and from memory directly influences performance. A memory wall emerges when a chip can’t operate at full speed because memory bandwidth, rather than compute capacity, is the bottleneck.
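The memory wall can be seen in a back-of-envelope calculation: if moving a workload’s data takes longer than computing on it, the chip sits idle waiting on memory. The sketch below illustrates the idea; all figures are illustrative round numbers, not Olix or Nvidia specifications.

```python
# Back-of-envelope check: is a workload limited by compute or by
# memory traffic? All numbers below are illustrative only.

def bottleneck(flops, bytes_moved, peak_flops_per_s, mem_bytes_per_s):
    """Return which resource limits throughput for a given workload."""
    compute_time = flops / peak_flops_per_s
    memory_time = bytes_moved / mem_bytes_per_s
    return "memory-bound" if memory_time > compute_time else "compute-bound"

# A token-generation step in AI inference reads large weight matrices
# but does relatively little math per byte moved, so memory traffic
# dominates -- the memory wall.
print(bottleneck(
    flops=2e12,              # 2 trillion operations of math
    bytes_moved=1e12,        # 1 TB read from external memory
    peak_flops_per_s=1e15,   # chip compute peak (illustrative)
    mem_bytes_per_s=3e12,    # 3 TB/s memory bandwidth (illustrative)
))  # prints "memory-bound": ~0.33 s of memory traffic vs 0.002 s of math
```

With these numbers, the math itself takes milliseconds while the memory transfers take a third of a second, which is why designs like Olix’s focus on the memory system rather than raw compute.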

According to Olix, its chip design addresses the challenge by not using HBM. The processor stores data solely in SRAM, or static random-access memory, a significantly faster type of memory.

An HBM memory cell, the basic building block of HBM devices, comprises a transistor and a capacitor, a component that holds a small electrical charge. SRAM, in contrast, uses a more complex design with six transistors. Another contributor to SRAM’s performance is that it’s usually integrated directly into AI chips, whereas HBM is implemented as a standalone module. That puts SRAM closer to the host chip’s transistors, which reduces data travel times.

Startup Cerebras Systems Inc. also prioritized SRAM when designing its wafer-scale AI accelerator. The chip includes 44 gigabytes of SRAM, enough for many AI models to run without HBM. Olix claims that its photonics technology outperforms “silicon-only SRAM-architectures in interactivity and latency.”

Olix’s chip is called the OLIX Optical Tensor Processing Unit, or OTPU for short. A tensor is a mathematical object that AI models use to hold information. Many AI chips include circuits specifically optimized to process such objects.
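To make the term concrete, a tensor is simply an n-dimensional array of numbers: a scalar, a vector, and a matrix are tensors of increasing rank. The short sketch below, in plain Python, shows the matrix-vector multiplication at the heart of inference; real AI chips run this kind of operation on dedicated matrix hardware rather than in software.

```python
# A tensor is an n-dimensional array of numbers.
scalar = 3.0                    # rank-0 tensor
vector = [1.0, 2.0, 3.0]        # rank-1 tensor
matrix = [[1.0, 0.0, 2.0],      # rank-2 tensor, e.g. one layer's weights
          [0.0, 1.0, 1.0]]

def matvec(m, v):
    """Multiply a matrix by a vector: the core tensor operation
    performed over and over during AI inference."""
    return [sum(w * x for w, x in zip(row, v)) for row in m]

print(matvec(matrix, vector))   # prints [7.0, 5.0]
```

Tensor-optimized circuits accelerate exactly these multiply-accumulate loops, which dominate the cost of running a neural network.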

The chip presumably also includes circuits optimized for workloads other than tensor math. Google LLC’s tensor processing units, which are likewise designed to power AI models, combine tensor-optimized cores with scalar and vector units, circuits that handle tasks such as control flow and memory management.

Olix will use its newly raised capital to finance chip development initiatives. A job posting on the company’s website indicates that it’s also working on a compiler that can adapt existing AI models to run on its silicon. According to the Financial Times, Olix expects to start shipping OTPU chips to customers next year.

Photo: Unsplash
