
Intel AI chip aims to boost inference for AI users

Adding to the Intel Nervana chip line, Intel and Facebook are expected to launch a new processor designed to speed up inference in the second half of 2019.

Intel said it will expand its line of processors dedicated to handling AI workloads with a new Intel AI chip designed specifically to accelerate inference for AI users.

Unveiled during the 2019 Consumer Electronics Show (CES) in Las Vegas, the Intel Nervana Neural Network Processor for Inference (NNP-I) is a 10 nm chip that's built on Intel's new Ice Lake CPU microarchitecture. It's being developed in partnership with Facebook, and it's expected to be released in the second half of 2019.

"This is a really big deal for us. It expands our position in AI above and beyond what we've done in Xeon and what we've done in Core into a new domain," Navin Shenoy, executive vice president of Intel's data center group, said during Intel's livestreamed CES keynote on Jan. 8. Shenoy was referring to Intel's popular Xeon and Core processor lines.

AI and inference


The Intel AI chip is intended to accelerate inference for organizations with AI workflows. Inference, in the AI world, refers to a trained neural network applying what it has learned to categorize new data.

The new Intel AI chip might be useful in supervised machine learning, said Adrian Bowles, an analyst at Boston-based Storm Insights.

"As you're looking at new data outside the training set, the system is going to make new inferences, and that's where these chips come in," Bowles said.

During the CES presentation, Intel said the chip will be useful for social media companies, like development partner Facebook.


"It's going to be important for many new inference applications, like image recognition [and] image classification," Shenoy said.

The chip builds on the Intel Nervana NNP line, a suite of processors Intel released in 2017 that are built specifically to handle AI workflows.

In the past year or so, tech vendors -- including Amazon and Google -- have released chips designed for AI workflows, and companies like Nvidia and AMD have released powerful CPUs and GPUs for the same purpose.

Not enough info

Where the Intel Nervana NNP-I will stand next to competitors is still unclear. Bowles noted that Intel has released limited information about the chip, and it's too early to tell if it will be used mainly for cloud-based and edge computing, or if customers will buy and use the physical processor.

"If the trend is to move inference to the edge ... where does this chip fit?" Bowles said.

"Is it architecture that will be scaled down to devices? Is it something that will be distributed?" he continued.

During the CES event, Intel also unveiled a new client platform, code-named Lakefield, and said it plans to be heavily involved in upcoming 5G technologies. The company also briefly touched on another upcoming Intel AI chip, code-named Spring Crest, that is meant to better handle neural network training.
