UPDATED 14:00 EDT / OCTOBER 17 2017

INFRA

Intel delivers Nervana Neural Network Processor chips for AI workloads

After pushing the concept of using field-programmable gate arrays, or FPGAs, as accelerators for artificial intelligence and deep learning neural networks, Intel Corp. is taking the next step with the launch of its first batch of dedicated AI chips.

The chip giant today announced it will soon ship what it says are the first processors built specifically for AI workloads. Intel said its new Nervana Neural Network Processor family, codenamed “Lake Crest,” has been three years in the making, and will greatly enhance the speed and computational efficiency of deep learning.

The new chips were detailed in a blog post by Naveen Rao, corporate vice president and general manager of Intel’s Artificial Intelligence Products Group, who noted grandly that “machine learning and deep learning have taken over as the most important computational workloads of our millennia.”

Few will deny the importance of AI and machine learning, which have established themselves as among the fastest-growing technology trends of the day. Nvidia Corp. has rocketed to fame and fortune over the last few years thanks to its graphics processing unit chips becoming the standard for AI and machine learning work.

Intel is determined to put itself at the forefront of this trend as part of its strategy to maintain its dominance of the data center. That explains why Intel has spent the last 18 months touting the abilities of its FPGA accelerators, which are used to accelerate deep learning workloads for its general-purpose computing chips.

But FPGAs could soon find themselves shunted to one side by the Nervana Neural Network Processor series, which delivers many of the same speed and performance benefits. That's because the chips sport a new memory architecture that Rao said is designed to maximize their power and processing speeds. Unlike previous chips, the Nervana series does not come with a managed cache hierarchy; instead, it uses software to manage on-chip memory and determine allocations for different operations.

Rao said this feature enables the Nervana chips to “achieve new levels of compute density and performance for deep learning.” He explained that one neat trick the chip can perform is something called “massive bi-directional data transfer,” which enables multiple processors to act as one large virtual chip that can accommodate larger AI models and deliver greater insights from data faster than before.
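Intel hasn't published the programming interface for that software-managed memory, but the basic idea can be sketched: a compiler, rather than a hardware cache, decides ahead of time where each tensor lives on chip. The toy allocator below is a hypothetical illustration of that principle, not Intel's actual API; every name in it is invented for the example.

```python
class Scratchpad:
    """Toy model of software-managed on-chip memory: placement is
    decided explicitly up front, instead of by a hardware cache
    reacting at runtime. Purely illustrative, not Intel's design."""

    def __init__(self, size):
        self.size = size
        self.offset = 0        # simple bump pointer
        self.placement = {}    # tensor name -> (offset, length)

    def place(self, name, length):
        # The "compiler" reserves a fixed region for each tensor.
        if self.offset + length > self.size:
            raise MemoryError(f"scratchpad full while placing {name}")
        self.placement[name] = (self.offset, length)
        self.offset += length
        return self.placement[name]

# Example: statically placing a layer's weights and activations.
pad = Scratchpad(size=1024)
pad.place("conv1.weights", 512)
pad.place("conv1.activations", 256)
```

Because every allocation is known before the workload runs, there are no cache misses to absorb; the trade-off is that the software must get the placement right.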

“This will be a big chip dedicated to training foremost, but can also be used for high throughput data requirements for inference,” said Patrick Moorhead, president and principal analyst at the research firm Moor Insights & Strategy. Training is the process of feeding data to deep learning algorithms so they can learn on their own rather than being explicitly programmed.

However, Moorhead said the Nervana chips wouldn’t completely eliminate the need for FPGAs for some workloads. “FPGAs will still be more cost effective for many inference apps, for example where one to three bits of precision can be applied to specific layers,” he said. Inference refers to the process of running the trained algorithms.
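Moorhead's point about one- to three-bit precision refers to quantization: after training, a layer's floating-point weights can be snapped to a handful of discrete levels, which is cheap to compute on an FPGA. A minimal sketch of uniform symmetric quantization, illustrative only (production pipelines calibrate scales per layer, and one-bit schemes typically keep only the sign):

```python
def quantize(weights, bits):
    """Snap each float weight to one of a few signed integer levels,
    then map back to floats. Toy sketch of low-precision inference;
    assumes bits >= 2."""
    qmax = 2 ** (bits - 1) - 1                 # e.g. 3 levels each side at 3 bits
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) * scale for w in weights]

# Example: 3-bit quantization of a small weight vector.
weights = [0.82, -0.41, 0.07, -0.99]
print(quantize(weights, 3))
```

Each quantized weight lands within half a quantization step of the original, so accuracy degrades gracefully as precision drops, which is why a few bits can suffice for some layers at inference time.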

Intel hasn’t provided an exact launch date for the Nervana Neural Network Processor family, but said the new chips will be available soon.

Image: Intel
