Intel has just announced a brand new class of AI processor: the Intel Nervana NNP-I. This is one of the first truly powerful AI processors Intel has promised to produce: all of the company's previous AI chips drew power in the milliwatt range, while this one is slated to draw "hundreds of watts". While no specific details were given in the demo, it was implied that the technology will take advantage of Intel's DL Boost technology to offer a CPU-based competitor to GPUs.
Intel takes aim at the GPGPU AI market with the Intel Nervana NNP, an AI chip drawing 'hundreds of watts' of power
The AI inference market has been dominated by GPUs ever since its inception - and for good reason, considering GPUs are massively parallel processors that are well suited to inference workloads. Intel, however, claims to have cracked the CPU AI performance problem and demoed a competitive benchmark showing a 5x inference speedup on a tailor-made inference suite. If the GPU it was compared against was a decent, appropriately priced part, then it's a pretty big deal and could mark the beginning of a disruption of the AI market.
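A claim like "5x inference performance" only means anything if both platforms are timed the same way on the same workload. As a purely illustrative sketch (this is not Intel's benchmark, and the timing figures below are made-up numbers chosen to reproduce a 5x ratio), such a speedup is typically computed like this:

```python
import time

def time_workload(fn, *args, repeats=5):
    """Return the best-of-N wall-clock time for a callable workload."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

def speedup(baseline_seconds, candidate_seconds):
    """Speedup of the candidate over the baseline (>1 means faster)."""
    return baseline_seconds / candidate_seconds

# Illustrative numbers only: a baseline at 10 ms/batch vs a
# hypothetical accelerator at 2 ms/batch gives the quoted 5x.
print(speedup(0.010, 0.002))  # → 5.0
```

The "best of N repeats" convention matters: averages get skewed by warm-up and OS noise, which is one reason vendor benchmarks are hard to reproduce without the exact harness.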
The new AI chip will draw power in the hundreds of watts, which matters because, within a product line, power budget loosely tracks achievable performance - and Intel's last AI chip sipped only milliwatts. That amounts to a power-budget increase of several orders of magnitude over its predecessor. If the benchmark showcased on screen with a 5x speedup is even remotely close to the truth (on a generalized and reproducible basis), then this represents a very material change in Intel's ability to compete in AI.
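For scale, a back-of-the-envelope check puts the jump from milliwatts to "hundreds of watts" at roughly five orders of magnitude in power budget. The specific figures below (a 1 mW predecessor, a 100 W new part) are assumptions of mine, not Intel-published specs:

```python
# Back-of-the-envelope sanity check: assumed figures, not official specs.
predecessor_watts = 0.001   # ~1 mW, assumed for the older milliwatt-class chip
new_chip_watts = 100.0      # lower bound of the "hundreds of watts" claim

ratio = new_chip_watts / predecessor_watts
print(f"Power budget ratio: {ratio:,.0f}x")  # → Power budget ratio: 100,000x
```

Of course, performance does not scale linearly with power, so this is a ceiling on the available headroom rather than a predicted speedup.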
The company says it plans to introduce the chip by 2019. What's more, Intel has partnered with Facebook on the chip's development, which is also a huge deal: if the chip delivers on its performance claims, then Facebook - which uses inference across multiple services, including its photo auto-tagging feature - becomes a huge market Intel can tap almost immediately.