Google's Gemma 4 model family now runs natively on iPhones, enabling full local AI inference offline. Early benchmarks show the 31B variant of Gemma 4 performing comparably to Qwen 3.5's 27B model.
gizmoweek.com
2 min
4/15/2026
RCLI is an on-device voice AI for macOS that allows users to interact with their Mac and query documents without requiring cloud services. It features a complete STT, LLM, and TTS pipeline running natively on Apple Silicon, with 38 macOS actions available via voice and sub-200ms end-to-end latency.
github.com
5 min
3/11/2026