Intel Steps Into AI, Debuts Nervana Neural Network Processor

WebProNews | Technology


Intel may not have embraced AI immediately, but the company is now scrambling to catch up. In a bid to assert its dominance, the Silicon Valley stalwart recently unveiled the Intel Nervana Neural Network Processor (NNP), a range of chips designed specifically with artificial intelligence in mind.

Intel partnered with Facebook on the design and development of these neural chips. According to Intel CEO Brian Krzanich, the chips could usher in new kinds of AI applications that could transform social networks, healthcare, cars, and even weather forecasting.

The technology behind the chips is rooted in Nervana Systems, a deep learning startup Intel bought for $350 million last August. Since the acquisition, Intel has been teasing the NNP line under the codename "Lake Crest." The resulting neural chips are integral to the company's goal of dramatically speeding up the training of deep learning models.

Since the NNP is designed to meet the demands of machine learning, the chips will likely be found not in personal computers but in data centers. Intel CPUs have a strong presence in server stacks, with a 96% share of the data center market, but today's AI workloads are better handled by the graphics processing units (GPUs) of companies like ARM and NVIDIA, which is why demand for their chips has risen astronomically. Even Google has gotten in on the action with its Tensor Processing Unit (TPU), which the company uses to power its cloud servers, while firms like Graphcore are also looking to break into the industry.

The question now is how fast Intel's neural chips are. Unfortunately, the company is keeping mum on the details. When Google launched the newest version of its TPU, it published benchmark results comparing the chip against rivals. Intel, by contrast, would only say that it is on track to meet its objective of improving deep learning speeds by as much as 100 times, a target it intends to hit by 2020.

To meet that goal, the NNP has to outperform NVIDIA's V100 GPU and Google's second-generation TPU. But there is a strong possibility that Intel can hit that mark and more, especially with its 14nm process technology on hand.

Intel is also keeping quiet on when the NNP chips will hit the market, although more details are expected to come out soon. Rumors suggest the new neural chips could be available in limited quantities by the end of the year.

[Featured image via Intel]