AMD has just taken edge AI to a whole new level: with its latest driver update, the company has brought support for large LLMs to PC users.
AMD's Strix Halo Platform Has Brought High-End AI Power to Consumer Machines, Setting New Standards
Team Red is one of the leading firms bringing massive AI compute to consumer machines through its XDNA engines. AMD's APU offerings, such as those in the Strix Point and Strix Halo lineups, deliver some of the highest AI performance around, and now AMD has taken edge AI to the next level. The company's latest Adrenalin Edition 25.8.1 driver adds support for LLMs with up to 128 billion parameters, which means consumer devices can now run models like Meta's Llama 4 Scout, a first-of-its-kind achievement.

Interestingly, through AMD's Variable Graphics Memory (VGM), consumers can make up to 96 GB of graphics memory available to the iGPU, enabling large-scale models to run locally. Since the above-mentioned Llama model is an MoE-based implementation, it only activates 17B parameters at runtime; despite the model's overall size, consumers can therefore expect decent tokens-per-second (TPS) figures, making the LLM usable as a highly capable AI assistant.
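To see why the 96 GB VGM budget matters, here is a rough back-of-envelope sketch (not an official AMD tool) that estimates how much memory a quantized model's weights need. The ~109B total / 17B active parameter split for Llama 4 Scout is Meta's published figure; the quantization levels are generic assumptions, not what AMD's driver uses.

```python
# Back-of-envelope estimate: weight memory for a quantized LLM versus the
# 96 GB of graphics memory that VGM can expose to the iGPU on Strix Halo.

def model_size_gb(total_params: float, bits_per_weight: float) -> float:
    """Approximate in-memory size of the weights alone (no KV cache, no overhead)."""
    return total_params * bits_per_weight / 8 / 1e9

VGM_BUDGET_GB = 96  # graphics memory VGM can carve out, per the article

# Llama 4 Scout: ~109B total parameters, of which 17B are active per token (MoE).
for bits in (16, 8, 4):
    size = model_size_gb(109e9, bits)
    verdict = "fits" if size <= VGM_BUDGET_GB else "does not fit"
    print(f"{bits}-bit weights: ~{size:.0f} GB -> {verdict} in {VGM_BUDGET_GB} GB")
```

At 16-bit precision the weights alone blow past the budget, but a 4-bit quantization lands around 55 GB, which is why a model of this class becomes viable on a consumer APU at all.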

More importantly, AMD has made massive strides in model context size. While the usual industry standard with consumer processors sat somewhere around 4,096 tokens, the firm has managed to increase that several times over, reaching a context length of 256,000 tokens, which allows much greater control over your workflow without worrying about performance. These are serious figures from AMD, and they show that the "AI compute" fever has indeed expanded to consumer PCs.
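The jump from 4,096 to 256,000 tokens is demanding because a transformer's KV cache grows linearly with context length. The sketch below illustrates that scaling; the layer, head, and dimension counts are illustrative placeholders, not Llama 4 Scout's actual configuration.

```python
# Illustrative sketch: KV cache memory grows linearly with context length.
# Architecture numbers below are placeholders, not a real model's config.

def kv_cache_gb(context_tokens: int,
                n_layers: int = 48,
                n_kv_heads: int = 8,
                head_dim: int = 128,
                bytes_per_value: int = 2) -> float:
    # Factor of 2 covers keys and values; one entry per layer, per KV head,
    # per token, each stored at the given precision (2 bytes = fp16).
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_value * context_tokens / 1e9

for ctx in (4_096, 32_768, 256_000):
    print(f"{ctx:>7} tokens -> ~{kv_cache_gb(ctx):.1f} GB of KV cache")
```

With these placeholder dimensions, a 4,096-token context needs under 1 GB of cache while 256,000 tokens needs tens of gigabytes, which is why the large VGM pool and the extended context limit go hand in hand.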
For now, the Strix Halo platform is available on a limited number of devices, and they aren't easily accessible, with some of them exceeding the $2,000 mark. However, AMD's advancements in AI computation are an optimistic sign, bringing "AI power" closer to everyone, although you would need a heavy wallet to acquire the currently available devices.