Themata.AI

Popular tags:

#developer-tools #ai-agents #llms #claude #code-generation #ai-ethics #openai #ai-safety #anthropic #open-source

AI is changing the world. Don't fall behind. Clear summaries and community insight, delivered without the noise. Subscribe to never miss a beat.

© 2026 Themata.AI • All Rights Reserved

Privacy | Cookies | Contact

Filtering by tag:

amd
#amd #ryzen-ai #llms #linux
News

AMD Ryzen AI NPUs are finally useful under Linux for running LLMs

AMD developed the AMDXDNA accelerator driver in the mainline Linux kernel to support AMD Ryzen AI NPUs, but until now user-space software for effectively utilizing these NPUs on Linux has been limited, with most applications relying on iGPU support instead.

phoronix.com

🔥🔥🔥🔥🔥

4 min

3/11/2026

Tool

AMD engineer leverages AI to help make a pure-Python AMD GPU user-space driver

Anush Elangovan, AMD's VP of AI Software, is developing a pure-Python AMD GPU user-space driver using Claude Code. This driver aims to support the testing and debugging of the ROCm/HIP user-space stack.

phoronix.com

🔥🔥🔥🔥🔥

2 min

3/5/2026

Running a One Trillion-Parameter LLM Locally on AMD Ryzen AI Max+ Cluster

A small-scale distributed inference cluster built on AMD's Ryzen™ AI Max+ AI PC platform can run a one-trillion-parameter large language model. A four-node cluster of Framework Desktop systems demonstrates local inference of the open-source Kimi K2.5 model.

amd.com

🔥🔥🔥🔥🔥

14 min

3/1/2026
