7 providers. Your keys.
Grip OS connects to every major LLM provider with bring-your-own-key (BYOK) access. Run cloud models or go fully local with Ollama and MLX on Apple Silicon.
Anthropic
Claude Sonnet 4.6
Advanced reasoning and nuanced analysis. Ideal for complex agent workflows, code generation, and long-context understanding.
OpenAI
GPT-5.1
Fast, versatile model family for general tasks. Strong at drafting, summarization, and multi-step tool use.
Google Gemini
Gemini Pro
Multimodal capabilities with large context windows. Effective for document analysis and cross-modal tasks.
xAI
Grok
Real-time knowledge and conversational reasoning. Useful for up-to-date information retrieval and analysis.
Mistral
Mistral Large
Efficient European-built models with strong multilingual support. Great for code and structured output tasks.
Ollama
Local models
Run open-source models locally with zero API cost. Full privacy — no data leaves your machine. Supports Llama, Phi, and more.
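As a quick sketch of that local workflow (assuming Ollama is installed and its daemon is running; the model name is illustrative):

```shell
# Pull an open-source model once, then run it entirely on-device.
# No API key, no network calls at inference time.
ollama pull llama3.2
ollama run llama3.2 "Summarize this email thread in one sentence."

# See which models are already available locally
ollama list
```

Any model in the Ollama library can be swapped in the same way.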
MLX
Apple Silicon native
Native Apple Silicon inference via mlx-swift. Powers Grip Mail's self-learning models on-device with sub-10ms latency.
Local inference
MLX on Apple Silicon powers Grip Mail's self-learning models with sub-10ms latency and zero API cost. Ollama runs any open-source model locally. Your conversations and data never leave your machine.
Privacy-first BYOK
Bring your own API keys for every cloud provider. Grip OS never proxies your data through our servers. Your keys, your data, your control.
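As a sketch of what BYOK setup typically looks like: the environment variable names below are the ones each provider's official SDK conventionally reads. Whether Grip OS reads keys from the environment or through its own settings is an assumption here, not a documented behavior.

```shell
# Illustrative placeholders only — substitute your real keys.
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic (Claude)
export OPENAI_API_KEY="sk-..."          # OpenAI (GPT)
export GEMINI_API_KEY="..."             # Google Gemini
```

Keys stay on your machine; requests go directly from your device to the provider.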
Stay in the loop
Get notified when new model providers and add-ons are available for Grip OS.