AI Edge Gallery
Run Gemma 4 and other open-source LLMs directly on your Android phone or iPhone
Google's AI Edge Gallery is a mobile application that turns your Android phone or iPhone into a local LLM inference machine. Available on Android 12+ and iOS 17+, the app runs open-source models, with a particular focus on Google's Gemma 4 family, entirely on-device: no internet required, no data leaves your phone, no API costs.

The Gallery supports:

- Multi-turn conversation, with a Thinking Mode that lets you watch the model's reasoning steps
- Image analysis through multimodal capabilities
- Voice transcription and translation
- Model performance benchmarking on your specific device hardware
- Device automation powered by fine-tuned models
- Custom models loaded via Hugging Face integration

The updated version with official Gemma 4 support is particularly timely: Gemma 4's 2B-parameter model has been benchmarked outperforming its 12B predecessor on multi-turn benchmarks, and running it on a modern iPhone or Android flagship is now genuinely fast. For privacy-conscious users, for developers who want to test local inference without cloud costs, and for anyone who needs AI capabilities in environments without reliable internet, AI Edge Gallery bridges the gap between cutting-edge open-source models and practical mobile use.
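To see why a 2B-parameter model is now practical on a flagship phone, a quick back-of-envelope estimate helps: the weight footprint of a quantized model is roughly its parameter count times the bits per weight. This is a generic sketch of that arithmetic, not numbers taken from the Gallery's own benchmarks; the function name and figures are illustrative:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate RAM needed for model weights alone (ignores KV cache,
    activations, and runtime overhead)."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 2B-parameter model quantized to 4 bits needs about 1 GB for weights,
# comfortably within a modern flagship's 8-16 GB of RAM.
print(round(model_memory_gb(2, 4), 2))   # → 1.0

# The same model at full 16-bit precision would need ~4 GB.
print(round(model_memory_gb(2, 16), 2))  # → 4.0
```

Actual memory use is higher once the KV cache and runtime buffers are counted, which is one reason the Gallery's per-device benchmarking feature is useful before committing to a model.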
Panel Reviews
The Builder
Developer Perspective
“On-device LLM inference on consumer phones with Gemma 4 support is a genuine capability milestone. The model benchmarking feature is practically useful for understanding what's actually running where. This is solid infrastructure for testing mobile AI during development.”
The Skeptic
Reality Check
“On-device LLM quality still trails cloud APIs significantly for complex tasks. You're trading capability for privacy and offline access—that's a real tradeoff, not a free lunch. Battery drain and thermal throttling on extended sessions remain practical problems on most phones.”
The Futurist
Big Picture
“Local inference on mobile phones is the long game—as models compress and chips improve, the gap between on-device and cloud closes. AI Edge Gallery is Google planting a flag in the world where your phone is your private AI, not a terminal that routes everything through a data center.”
The Creator
Content & Design
“Privacy-first, works offline, no subscription—AI Edge Gallery is genuinely useful for creators who travel or work in low-connectivity environments and want AI assistance without sending their work to the cloud. The voice transcription feature alone makes it worth downloading for on-the-go note capture.”
Community Sentiment
“Gemma 4 on-device performance benchmarks”
“Battery drain and thermal throttling in extended sessions”
“Privacy-first local LLM inference vs cloud alternatives”