Intel’s AI Neural Co-Processor Featured On Its CPUs Gets New GNA Driver

Jason R. Wilson

AI and deep learning are finding uses across science, from large-scale simulation to quantum computing research, and machine learning has been among the industry's biggest focus areas over the last two years. The top three manufacturers, AMD, NVIDIA, and Intel, have brought their advanced hardware and software to supercomputers and a range of new applications. Now, Intel has updated the Linux driver for its Gaussian & Neural Accelerator, or GNA, for inclusion in the mainline kernel.

Intel updates the GNA co-processor driver in the current Linux kernel

Intel's GNA co-processor debuted with Cannon Lake in 2018. Cannon Lake was built on Intel's 10nm process and brought AVX-512 to the company's mainstream CPU lineup. The architecture was short-lived, one of Intel's shortest, and only a single mobile processor was released. Intel replaced it with the Ice Lake architecture and discontinued the Cannon Lake line in early 2020.

GNA co-processors are currently found in Gemini Lake, Elkhart Lake, Ice Lake, and later parts. The co-processor offloads continuous inference workloads such as speech recognition and noise reduction, freeing CPU resources for other tasks. Over the past year, Intel engineers have continued to refine and extend the GNA driver in Linux to support future platforms.

The GNA driver patch series is currently on its fourth revision, and Intel has reworked the code to sit on top of the Linux Direct Rendering Manager, or DRM, framework. DRM maintainers had strongly requested this change so that the GNA driver lives alongside other AI accelerator drivers within the DRM subsystem of the mainline kernel.

Intel's GNA software stack supports popular deep learning frameworks, including TensorFlow, Caffe, PaddlePaddle, PyTorch, MXNet, Keras, and ONNX, which Intel optimizes across its CPUs, iGPUs, discrete GPUs, VPUs, and FPGAs. The GNA library also plugs into the company's OpenVINO toolkit, which gives developers a streamlined path from model development to deployment across multiple platforms at once, with a broad support base, an optimized API, and a focus on performance and portability. OpenVINO runs on Windows, macOS, and Linux; on Linux, version three of the GNA library introduced support for newer development and deep learning scenarios.
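OpenVINO's plugin model lets the same model be compiled for different devices by name (for example "CPU", "GPU", or "GNA"), falling back to another device when the preferred one is absent. The toy sketch below illustrates only that priority-fallback idea in plain Python; the function and the device registry are hypothetical and are not the OpenVINO API.

```python
# Toy sketch of priority-based device selection, loosely modeled on the
# fallback idea behind OpenVINO's heterogeneous "GNA,CPU" device lists.
# AVAILABLE_DEVICES and pick_device() are hypothetical, for illustration.

AVAILABLE_DEVICES = {"CPU"}  # pretend only the CPU plugin is present


def pick_device(priority_list):
    """Return the first requested device that is actually available."""
    for name in priority_list:
        if name in AVAILABLE_DEVICES:
            return name
    raise RuntimeError("no requested device is available")


# Prefer the GNA co-processor, fall back to the CPU if it is missing.
print(pick_device(["GNA", "CPU"]))  # -> CPU
```

In OpenVINO itself the equivalent choice is made by passing a device name when compiling the model, so application code stays the same whether inference lands on the GNA block or the CPU.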

News Sources: Phoronix, Intel
