AMD has developed the AMDXDNA accelerator driver for the mainline Linux kernel to support AMD Ryzen AI NPUs. User-space software for effectively utilizing these NPUs on Linux has been limited, with most applications relying on iGPU support instead.
phoronix.com
4 min
3/11/2026
Anush Elangovan, AMD's VP of AI Software, is developing a pure-Python AMD GPU user-space driver using Claude Code. This driver aims to support the testing and debugging of the ROCm/HIP user-space stack.
phoronix.com
2 min
3/5/2026
A small-scale distributed inference cluster can be built using AMD’s Ryzen™ AI Max+ AI PC platform to run a one trillion-parameter Large Language Model. A four-node cluster of Framework Desktop systems demonstrates the local inference of the Kimi K2.5 open-source model.
amd.com
14 min
3/1/2026