
Apple Is Using Google's Gemini to Train Smaller On-Device AI Models

Michael Ouroumis · 2 min read

Apple's partnership with Google runs deeper than previously understood. According to a new report from The Information, Apple has been granted "complete access" to Google's Gemini model inside its own data centers — and is using that access to train smaller AI models for deployment on its devices.

The arrangement, stemming from the deal announced in January 2026, leverages a technique called distillation: a frontier-scale "teacher" model is used to generate training data and supervision signals for a smaller "student" model, which can then run efficiently on-device.
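The teacher-student setup described above can be sketched in a few lines. This is a minimal illustration of the standard distillation loss, not Apple's actual pipeline (which has not been disclosed): the teacher's and student's outputs are stand-in logit arrays, where in practice the teacher would be a frontier model like Gemini and the student a small on-device network.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature softens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student output distributions.

    Minimizing this pushes the student to mimic the teacher's full
    probability distribution over outputs, not just its top-1 answer --
    the "supervision signal" that makes distillation work.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return float(np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())

# Toy example: a student whose outputs track the teacher's incurs a
# lower loss than one that disagrees, so gradient descent on this loss
# pulls the student toward the teacher's behavior.
teacher = np.array([[4.0, 1.0, 0.5]])
close_student = np.array([[3.8, 1.1, 0.6]])
far_student = np.array([[0.5, 4.0, 1.0]])
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

In a real training loop, this loss (often mixed with a standard cross-entropy term on ground-truth labels) is backpropagated through the student only; the teacher stays frozen.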

Why Distillation Matters

Model distillation has become one of the most important techniques in practical AI deployment. Training a model that can match GPT-4 or Gemini Ultra on a broad range of tasks requires enormous compute resources. But distilling a focused version of those capabilities into a smaller model — one optimized for specific Apple use cases — is far more tractable.

The resulting models can run on iPhone, iPad, and Mac hardware without constant cloud inference, preserving privacy and reducing latency. It's the same general approach Apple used to build its Apple Intelligence features, though the use of Gemini as a teacher is new.

The Strategic Calculus

For Apple, the arrangement is pragmatic. Building frontier-scale models internally would require massive investment in data centers and research talent. Using Google's model as a scaffold allows Apple's teams to focus on the distillation pipeline, device optimization, and Apple-specific fine-tuning — areas where they already excel.

For Google, it deepens the commercial relationship with Apple's enormous install base, even as the two companies compete in AI assistants and mobile software. Notably, it also means Gemini's capabilities — however indirectly — end up powering Apple's on-device AI features.

Privacy Implications

Apple has been careful to position its AI features around on-device processing and Private Cloud Compute. If Gemini is being used purely as a training-time teacher model, with no inference happening via Google's servers during normal device use, that's largely consistent with Apple's privacy narrative.

The more sensitive question is what training data flows through this arrangement — something neither Apple nor Google has commented on publicly.

What to Watch

The distillation strategy suggests Apple is serious about closing the capability gap with Google and OpenAI without abandoning its hardware-first, privacy-first positioning. If the approach works, it could become a template for how large device manufacturers build competitive AI without the resources of a frontier lab.

Expect more details to emerge as Apple Intelligence features roll out in the next major iOS and macOS releases.

