AI Edge Gallery


Run Gemma 4 and open-source LLMs directly on your Android or iPhone

Google's AI Edge Gallery is a mobile application that turns your Android phone or iPhone into a local LLM inference machine. Available on Android 12+ and iOS 17+, the app runs open-source models (with a particular focus on Google's Gemma 4 family) entirely on-device: no internet required, no data leaves your phone, no API costs.

The Gallery supports multi-turn conversation with a Thinking Mode that lets you watch the model's reasoning steps, image analysis through multimodal capabilities, voice transcription and translation, model performance benchmarking on your specific device hardware, and even device automation powered by fine-tuned models. Custom models can be loaded via Hugging Face integration.

The updated version with official Gemma 4 support is particularly timely: Gemma 4's 2B-parameter model has been benchmarked as outperforming its 12B predecessor on multi-turn benchmarks, and running it on a modern iPhone or Android flagship is now genuinely fast. For privacy-conscious users, developers who want to test local inference without cloud costs, or anyone who needs AI capabilities in environments without reliable internet, AI Edge Gallery bridges the gap between cutting-edge open-source models and practical mobile use.
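The on-device benchmarking feature boils down to two numbers that matter on phone hardware: time-to-first-token (how long the prompt prefill takes) and decode throughput (tokens per second once generation starts). Here is a minimal sketch of how those metrics are computed from timestamps; all names are hypothetical illustrations, not the Gallery's actual API.

```python
from dataclasses import dataclass

@dataclass
class GenerationTrace:
    """Timestamps recorded around one on-device generation run (hypothetical)."""
    prompt_tokens: int    # tokens in the input prompt
    output_tokens: int    # tokens the model produced
    start_s: float        # moment generation was requested
    first_token_s: float  # moment the first output token appeared
    end_s: float          # moment the last output token appeared

def time_to_first_token(t: GenerationTrace) -> float:
    """Latency before the first token appears (dominated by prefill), in seconds."""
    return t.first_token_s - t.start_s

def decode_tokens_per_second(t: GenerationTrace) -> float:
    """Steady-state decode throughput, measured after the first token."""
    decode_window = t.end_s - t.first_token_s
    # Tokens emitted after the first one, divided by the decode window.
    return (t.output_tokens - 1) / decode_window

# Example run: 128 output tokens, first token after 0.5 s, finished at 5.5 s.
trace = GenerationTrace(prompt_tokens=256, output_tokens=128,
                        start_s=0.0, first_token_s=0.5, end_s=5.5)
print(time_to_first_token(trace))       # → 0.5
print(decode_tokens_per_second(trace))  # → 25.4
```

Separating the two metrics matters on mobile: prefill is compute-bound and stresses the NPU/GPU, while decode is memory-bandwidth-bound, so a device can score well on one and poorly on the other.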

Panel Reviews

The Builder


Developer Perspective

Ship

On-device LLM inference on consumer phones with Gemma 4 support is a genuine capability milestone. The model benchmarking feature is practically useful for understanding what is actually running, and how fast, on a given device. This is solid infrastructure for mobile AI development and testing.

The Skeptic


Reality Check

Skip

On-device LLM quality still trails cloud APIs significantly for complex tasks. You're trading capability for privacy and offline access—that's a real tradeoff, not a free lunch. Battery drain and thermal throttling on extended sessions remain practical problems on most phones.

The Futurist


Big Picture

Ship

Local inference on mobile phones is the long game—as models compress and chips improve, the gap between on-device and cloud closes. AI Edge Gallery is Google planting a flag in the world where your phone is your private AI, not a terminal that routes everything through a data center.

The Creator


Content & Design

Ship

Privacy-first, works offline, no subscription—AI Edge Gallery is genuinely useful for creators who travel or work in low-connectivity environments and want AI assistance without sending their work to the cloud. The voice transcription feature alone is worth downloading for on-the-go note capture.

Community Sentiment

Overall: 850 mentions (76% positive, 18% neutral, 6% negative)

GitHub: 200 mentions (76% positive, 18% neutral, 6% negative)
Top topic: Gemma 4 on-device performance benchmarks

Reddit: 300 mentions (72% positive, 20% neutral, 8% negative)
Top topic: Battery drain and thermal throttling in extended sessions

Twitter/X: 350 mentions (80% positive, 15% neutral, 5% negative)
Top topic: Privacy-first local LLM inference vs cloud alternatives