![](https://d15shllkswkct0.cloudfront.net/wp-content/blogs.dir/1/files/2020/10/Shahin-Khan-Exascale-Day-2020.jpg)
The increasing amount of data being generated today often calls for the use of supercomputers to process it. But this phenomenal amount of information also tends to push supercomputing to the edge, to process data as close as possible to the source or end user.
“I think initially you’re going to see it on base stations, antenna towers, where you’re aggregating data from a large number of end points and sensors,” said Shahin Khan (pictured), founding partner and analyst at OrionX.net. “That thing can now do the processing and do some level of learning and decide what data to ship back to the cloud, and what data to get rid of, and what data to just hold.”
Khan spoke with Dave Vellante, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during Exascale Day 2020. They discussed the hottest current trends in technology: the intersection between supercomputing and areas such as 5G, AI and blockchain, and the competition between companies and countries around these technologies. (* Disclosure below.)
A good example of using supercomputing at the edge involves surveillance cameras. “You don’t really need to ship every image back to the cloud, and if you ever need it, the guy who needs it is going to be on the scene, not back at the cloud,” Khan explained.
In this way, a supercomputer can do some of the processing onsite and send an image only every five or 10 seconds, he added. Thus, it is possible to keep a record while also reducing the bandwidth by orders of magnitude.
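The bandwidth claim checks out with back-of-the-envelope arithmetic. As a minimal sketch, assuming a typical 30-frames-per-second camera (a figure not given in the article):

```python
# Back-of-the-envelope bandwidth reduction for an edge camera,
# assuming a 30 fps source (hypothetical figure for illustration).
camera_fps = 30
shipped_fps = 1 / 10          # one frame shipped every 10 seconds

reduction = camera_fps / shipped_fps
print(reduction)              # 300.0 — between two and three orders of magnitude
```

Even before compression, shipping one frame every 10 seconds instead of every frame cuts traffic roughly 300-fold, consistent with the "orders of magnitude" figure.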
“Things like that are happening, and to make sense of all of that is to recognize when things change. Did somebody come into the scene, or is it just [that] the day became night?” Khan pointed out. “That sort of decision can now be automated, and fundamentally what is making it happen may not be supercomputing exascale class, but it’s definitely [high-performance computing].”
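The kind of automated decision Khan describes — did something enter the scene, or did the whole image merely darken — can be sketched with simple frame differencing. This is a hypothetical illustration, not the method Khan describes; the threshold and frame sizes are assumptions:

```python
import numpy as np

# Hypothetical edge-side change detection: flag a frame only when the mean
# absolute pixel difference from the previous frame exceeds a threshold.
# A gradual global shift (day becoming night) changes brightness slowly
# between consecutive frames, so it tends to stay below the threshold.

def scene_changed(prev_frame: np.ndarray, frame: np.ndarray,
                  threshold: float = 10.0) -> bool:
    """Return True when the mean absolute pixel difference exceeds threshold."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return bool(diff.mean() > threshold)

# Toy frames: a static background, then an "object" entering the scene.
background = np.zeros((480, 640), dtype=np.uint8)
with_object = background.copy()
with_object[100:300, 200:400] = 255   # bright region, ~13% of the pixels

print(scene_changed(background, background))   # False — nothing changed
print(scene_changed(background, with_object))  # True — something entered
```

Only frames that trip the detector would be shipped back to the cloud; everything else can be held or discarded locally, which is exactly the data-triage role Khan assigns to edge HPC.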
Supercomputing also plays a significant role in 5G communication, managing the traffic of multiple-input/multiple-output, or MIMO, systems, which use multiple transmission and reception antennas to enhance wireless communications performance, such as data throughput.
“To optimally manage that traffic, such that you know exactly what beam it’s going to and what antenna it’s coming from, that turns out to be a nontrivial, partial differential equation,” Khan said. “So next thing you know, you’ve got HPC in there.”
In addition to allowing some data reduction and data processing almost at the point of inception or aggregation, supercomputing at the edge in this case can help people to build a controller there, according to Khan.
“People want vector instructions there, people want matrix algebra there, because it makes sense to process the data before you have to ship it,” he said. “So HPC is cropping up really everywhere.”
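The matrix-algebra-at-the-edge idea Khan mentions can be illustrated with a toy reduction step. This is a hypothetical sketch (the sensor counts and the block-averaging scheme are assumptions, not from the interview): raw readings from many sensors are projected onto a few aggregate channels before anything is shipped upstream.

```python
import numpy as np

# Hypothetical edge-side data reduction via matrix algebra: average each
# block of 8 adjacent sensors into one aggregate channel before shipping,
# instead of sending every raw reading.
rng = np.random.default_rng(0)
raw = rng.normal(size=(1000, 64))     # 1000 samples from 64 sensors

# Aggregation matrix of shape (64, 8): column j averages sensors 8j..8j+7.
agg = np.kron(np.eye(8), np.ones((8, 1)) / 8)

reduced = raw @ agg                   # shape (1000, 8)
print(raw.nbytes // reduced.nbytes)   # 8 — eight times less data to ship
```

The heavy lifting is one matrix multiply — precisely the kind of workload vector instructions and HPC-style linear-algebra hardware accelerate.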
Supercomputing and 5G are among the mega technology trends of the current information age, according to Khan.
“Mega trends that drive that in our mind is IoT, because that’s the fountain of data; 5G, because that’s how it’s going to get communicated; AI and HPC, because that’s how we’re going to make sense of it; blockchain and cryptocurrencies, because that’s how it’s going to get transacted and that’s how value is going to get transferred from a place to place; and then finally quantum computing, because that exemplifies how things are going to get accelerated,” he explained.
Supercomputing is also very connected to AI, which has a long history and whose focus has changed over the years. First, AI was focused on expert systems and logical processing; then it turned to search and later became computational with neural networks, according to Khan.
“When deep neural nets showed up about a decade ago or more, [AI] finally started working, and it was a confluence of a few things: The algorithms were there, the data sets were there, and the technology was there in the form of GPU and accelerators that finally made this tractable,” he explained.
AI was kind of languishing for decades before HPC technologies reignited it, Khan added.
“And when you look at deep learning, which is really the only part of AI that has been prominent and has made all this stuff work, it’s all HPC, it’s all matrix algebra, it’s all signal processing,” he said. “I see a lot of interest in HPC talent right now, in part motivated by AI.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of Exascale Day 2020. (* Disclosure: TheCUBE is a paid media partner for Exascale Day 2020. Neither Hewlett Packard Enterprise, the sponsor for theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)