Rooted to be local first. With branches to the cloud.
Top open-source models supported.
Run the latest open-source AI models directly on your device. From conversational AI to advanced reasoning, choose from industry-leading models optimized for Apple Silicon.
- Llama: Meta's flagship family of foundation models
- Gemma: Google's lightweight, state-of-the-art models
- SmolLM: compact, efficient models by Hugging Face
- DeepSeek: advanced reasoning and coding models
- Qwen: Alibaba's powerful multilingual models
- Granite: IBM's enterprise-grade foundation models
Cloud Models & Self-Hosted
When you need more power or specific capabilities, connect to cloud models or run your own Ollama server. Full control with optional cloud access.
- Google Gemini: Google's powerful multimodal AI models
- Ollama: run open-source models on your own self-hosted server (see the sketch after this list)
- OpenAI: industry-leading AI models with advanced capabilities
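To make the self-hosted path concrete, here is a minimal sketch of what a request to your own Ollama server looks like over Ollama's REST API. The hostname and model tag are placeholders for your own setup, and this is an illustration, not Cedar's actual networking code.

```swift
import Foundation

// Minimal request/response types for Ollama's /api/generate endpoint.
struct OllamaRequest: Encodable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct OllamaResponse: Decodable {
    let response: String
}

func askOllama(prompt: String) async throws -> String {
    // Point this at your own server; 11434 is Ollama's default port.
    let url = URL(string: "http://your-server.local:11434/api/generate")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        OllamaRequest(model: "llama3.2", prompt: prompt, stream: false)
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(OllamaResponse.self, from: data).response
}
```

Nothing in that exchange leaves your network: the phone talks directly to the machine running Ollama.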
How it works
Cedar routes requests based on what you select. Local stays local. Your server stays yours. Cloud is opt-in.
1. Pick a model: Apple Intelligence, an Ollama model, or OpenAI. Switch any time.
2. Toggle behavior: enable thinking levels and search only when you want them.
3. Keep data in its lane: on-device or your own server by default, with no silent sharing.
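One way to picture the routing rule is a small sketch. The types below are hypothetical, written only to illustrate the idea that each request goes to the backend you selected and nowhere else; they are not Cedar's implementation.

```swift
import Foundation

// Illustrative only: modeling the three backends a user can select.
enum ModelBackend {
    case appleIntelligence            // runs on-device
    case ollama(serverURL: URL)       // runs on your own server
    case openAI(apiKey: String)       // opt-in cloud, using your key
}

struct ChatSettings {
    var backend: ModelBackend
    var thinkingEnabled: Bool = false // off unless you toggle it
    var searchEnabled: Bool = false   // off unless you toggle it
}

func destination(for settings: ChatSettings) -> String {
    // Requests go only where the selected backend lives.
    switch settings.backend {
    case .appleIntelligence:
        return "on-device"
    case .ollama(let serverURL):
        return "your server at \(serverURL.host ?? "?")"
    case .openAI:
        return "OpenAI (explicit opt-in)"
    }
}
```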
Local-first by design
- Apple and open-source models run on your iPhone.
- Ollama traffic stays between your phone and your own server; a VPN is recommended.
- For extra power, use OpenAI or Google Gemini for specific tasks. Those calls happen only when you explicitly choose them, and they go directly to the provider's API with your own API key (see the sketch below).
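For the opt-in cloud path, the request goes straight from the app to the provider, authenticated with a key you supply. Below is a minimal sketch against OpenAI's Chat Completions API; the model name is only an example, and the code is an illustration under those assumptions, not Cedar's actual implementation.

```swift
import Foundation

// Request/response shapes for OpenAI's Chat Completions endpoint.
struct ChatMessage: Codable { let role: String; let content: String }
struct ChatRequest: Encodable { let model: String; let messages: [ChatMessage] }
struct ChatResponse: Decodable {
    struct Choice: Decodable { let message: ChatMessage }
    let choices: [Choice]
}

func askOpenAI(prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // The API key belongs to you; it is sent only to OpenAI, never to a middleman.
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "gpt-4o-mini",
                    messages: [ChatMessage(role: "user", content: prompt)])
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(ChatResponse.self, from: data)
        .choices.first?.message.content ?? ""
}
```

The same pattern applies to Google Gemini: your key, a direct HTTPS call to the provider, and nothing sent unless you picked that backend.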