UPDATED 21:33 EDT / AUGUST 08 2018

INFRA

Intel to boost its CPUs with Optane memory to fend off AMD and Nvidia

Intel Corp. on Wednesday laid out its plans to stave off the threat to its business from rival chipmakers Advanced Micro Devices Inc. and Nvidia Corp.

The company said it’s planning to combine its current generation of microchips with its new Optane memory technology in order to give them superior capabilities that can speed up artificial intelligence workloads.

Intel has become increasingly reliant on the data center market in recent years as the market for personal computers continues to stagnate. But during its most recent earnings call last month, the company provided a disappointing outlook for its data center business, causing its stock to fall by more than 8 percent.

Meanwhile, analysts believe Intel could lose significant market share to AMD next year thanks to its rival’s planned release of its superior seven-nanometer chips.

Intel’s current generation of chips is built on a 14-nanometer manufacturing process, which puts it at a performance and efficiency disadvantage against AMD’s next generation of chips. Intel is busy developing its own line of 10-nanometer chips, but these won’t be released until 2019, with server chips not expected until 2020.

That means Intel is in an uncomfortable position, since AMD will have substantial time to encroach on its rival in both the PC and data center markets, especially if its 10-nanometer chips run into any delays.

In order to protect its lead in both markets, Intel said at its Data-Centric Innovation Summit in Santa Clara, California, on Wednesday that it’s planning to “stitch together” its current central processing units, or CPUs, with its new Optane memory chips and software called Intel DL Boost that’s designed to accelerate AI workloads. Intel said the combined technologies, called Cascade Lake, will also help it compete against another rival, Nvidia, which has made huge progress in AI with its own graphics processing units, or GPUs.

Intel has spent the best part of a decade building its Optane memory technology from scratch, and both the company and analysts believe it can provide additional capabilities that its rivals’ chips lack. The technology is designed to support the massive storage requirements of today’s hyperscale data centers and comes with much higher storage capacities per module than traditional dynamic random access memory. Intel’s first products based on the technology are available in three capacities – 128, 256 and 512 gigabytes – significantly larger than the most recent DRAM modules can offer.

“For certain big-data workloads that work best with two-plus sockets and a ton of memory, Cascade Lake plus Optane will deliver incredible performance,” Patrick Moorhead, president and principal analyst at Moor Insights & Strategy, told SiliconANGLE. “SAP is a great example of this.”

Even so, Intel’s plans are clearly only a temporary strategy designed to hold off its rivals’ pursuit until the company can ship its next generation of processors, another analyst said.

Holger Mueller, principal analyst and vice president of Constellation Research Inc., told SiliconANGLE that Intel was currently behind its rivals in the race to supply data center infrastructure for next-generation applications that rely on AI.

“It’s a new game with new players where GPUs and CPU efficiency play an increased role, and Nvidia and AMD are leading here,” Mueller said. “But Intel is going to bring together higher-level capabilities above the CPU in order to keep its customers happy and buying its chips.”

The good news for Intel is that at least some of its customers are indeed happy with its efforts. Prior to the Cascade Lake announcement, Intel boasted that it had sold $1 billion worth of AI processor chips during 2017, in its first disclosure of revenue from the fast-growing business segment.

Navin Shenoy (pictured), executive vice president and general manager of Intel’s Data Center Group, said the company had been able to modify its CPUs to make them 200 times better at AI training workloads, resulting in $1 billion worth of sales of its Xeon processors specifically for these applications.

“The step-function increase in performance led to a meaningful business impact for us,” Shenoy told the audience.

Shenoy also said the company estimates a market opportunity for its data center business worth $200 billion by 2022. Previously the company had estimated a total addressable market of just $160 billion.

Image: Intel/livestream
