OfflineLLM 4 Launches with Support for Apple Intelligence, Metal 4, and All New Apple OS Releases

Today, we’re proud to introduce OfflineLLM 4—our biggest update yet. Launching alongside iOS 26, iPadOS 26, macOS 26, and visionOS 26, this release is packed with powerful new features, visual upgrades, and performance improvements.

OfflineLLM 4 continues to push the boundaries of what’s possible with completely offline AI on Apple devices—giving you speed, privacy, and control like never before.

🌟 What’s New in OfflineLLM 4?

🧠 Chat with Apple Intelligence Foundation Models

OfflineLLM 4 brings support for the Apple Intelligence Foundation Models, making it possible to chat with the same LLM family behind Apple’s AI features—directly on your device, with no internet required.

Enjoy all the intelligence of cutting-edge AI models while keeping your conversations 100% private. This is local AI at its most powerful—and most secure.
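
If you’re curious what on-device access to these models looks like in code, here’s a minimal Swift sketch using Apple’s FoundationModels framework (available on iOS 26, iPadOS 26, and macOS 26). It illustrates the public framework surface, not OfflineLLM’s own integration, which may differ:

```swift
import FoundationModels

// Minimal sketch: ask the on-device Apple Intelligence model a question.
func askOnDeviceModel(_ prompt: String) async throws -> String {
    // Bail out if Apple Intelligence is off or the model isn't ready on this device.
    guard case .available = SystemLanguageModel.default.availability else {
        return "The on-device model is not available on this device."
    }

    // A session keeps conversation context across turns, entirely on device.
    let session = LanguageModelSession()
    let response = try await session.respond(to: prompt)
    return response.content
}
```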

💎 Liquid Glass Redesign

Say hello to the new Liquid Glass UI: a visually stunning, minimalist design inspired by the refined aesthetics of iOS 26 and visionOS 26. OfflineLLM 4 now features:

  • A cleaner, more modern layout
  • Improved readability and navigation
  • A visual experience that feels fluid, futuristic, and intuitive

This isn’t just a facelift—it’s a full visual upgrade that makes everyday use a joy.

⚡ Powered by Metal 4

OfflineLLM 4 now takes advantage of Metal 4, Apple’s latest high-performance graphics and compute API. This enables:

  • Even faster model execution
  • Lower latency interactions
  • Better performance across the board on Apple Silicon

Metal 4 helps OfflineLLM maintain its lead as one of the fastest offline LLM apps on the App Store.

🔋 Smarter Execution Engine

We’ve significantly optimized our execution engine to be:

  • More memory-efficient – Run larger models on more devices
  • More battery-efficient – Get longer sessions without draining your device

These updates make OfflineLLM 4 more versatile across a broader range of Apple hardware—from the latest Macs to older iPhones.

✅ Day-One Support for Apple’s New OS Releases

OfflineLLM 4 is fully compatible with:

  • iOS 26
  • iPadOS 26
  • macOS 26
  • visionOS 26

Whether you’re upgrading your phone, tablet, Mac, or Apple Vision Pro, OfflineLLM 4 is ready to go—on launch day.

🧩 Everything You Love, Now Better

OfflineLLM 4 continues to support the rich features you’ve come to rely on:

  • Chat with all the models you love
    Run the most advanced models offline, including Llama, DeepSeek, Qwen, Gemma, Phi, Mistral, RWKV, CodeLlama, TinyLlama, and many more.

  • Multimodal model support
    Send images to multimodal LLMs and interact with powerful vision-capable models.

  • RAG (Retrieval-Augmented Generation)
    Feed your AI your own documents and files to personalize its responses.

  • Beginner & Advanced modes
    Whether you’re just getting started or fine-tuning every setting, OfflineLLM 4 fits your workflow.

  • Shortcuts, Automations & Widgets
    Extend OfflineLLM into your daily routines and productivity flows.

  • OpenAI-Compatible API
    Integrate OfflineLLM with your existing tools and services using a familiar API structure (see the example after this list).
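
Because the endpoint follows the familiar OpenAI chat-completions format, calling it from your own code looks roughly like the Swift sketch below. The host, port, and model name are placeholders; check OfflineLLM’s API settings for the actual values on your device.

```swift
import Foundation

// Placeholder request/response shapes matching the OpenAI chat-completions format.
struct ChatRequest: Encodable {
    let model: String
    let messages: [[String: String]]
}

struct ChatResponse: Decodable {
    struct Message: Decodable { let content: String }
    struct Choice: Decodable { let message: Message }
    let choices: [Choice]
}

// Hypothetical values: the port (8080) and model id below are placeholders.
func chatViaLocalAPI(_ prompt: String) async throws -> String {
    var request = URLRequest(url: URL(string: "http://localhost:8080/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "llama-3.2-3b-instruct",
                    messages: [["role": "user", "content": prompt]])
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    let reply = try JSONDecoder().decode(ChatResponse.self, from: data)
    return reply.choices.first?.message.content ?? ""
}
```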

🔐 Offline. Private. Yours.

OfflineLLM 4 redefines what’s possible with AI on Apple devices—without relying on the cloud. Whether you’re coding, writing, summarizing, studying, or just chatting, your data stays local, private, and secure.

If you’re ready to take control of your AI experience, OfflineLLM 4 is available now from the App Store.

Stay private. Stay fast. Stay offline.