# Osaurus

> Osaurus is a local-first AI runtime optimized for Apple Silicon that runs open-source models on your Mac with privacy by default and no cloud dependency.

Osaurus is designed specifically for Apple Silicon Macs, offering blazing fast performance, privacy by default, and open flexibility. It runs state-of-the-art open-source AI models directly on your Mac without relying on cloud services, so your data stays private and inference incurs no network latency. The runtime is lightweight, developer-friendly, and integrates seamlessly with Apple's MLX ecosystem.

- **Blazing Fast Performance**: Optimized for M-series chips and written in Swift for seamless local inference.
- **Privacy by Default**: All AI inference runs locally on your device, ensuring your data never leaves your machine.
- **Open & Flexible**: MIT-licensed and API-compatible with OpenAI and Ollama, supporting extensibility through plugins and models.
- **Developer Friendly**: Simple installation, clean APIs, and both CLI and GUI interfaces for full control.
- **Community Driven**: Open source with active community support and contributions.

To get started, install Osaurus on your Apple Silicon Mac, run your preferred MLX-compatible model, and explore integrations to connect with your favorite tools.

## Features

- Local-first AI runtime
- Optimized for Apple Silicon M-series chips
- Privacy-focused with no cloud dependency
- Open source under MIT license
- API-compatible with OpenAI and Ollama
- Lightweight runtime (~7 MB)
- CLI and GUI interfaces
- Integration with Apple's MLX ecosystem

## Integrations

OpenAI API, Ollama API, Apple MLX ecosystem

## Platforms

macOS

## Pricing

Open Source

## Links

- Website: https://osaurus.ai
- Documentation: https://docs.osaurus.ai/
- Repository: https://github.com/dinoki-ai/osaurus
- EveryDev.ai: https://www.everydev.ai/tools/osaurus
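
Because Osaurus advertises OpenAI API compatibility, any OpenAI-style client should be able to talk to the local server. A minimal sketch in Python of building such a request, assuming a hypothetical base URL of `http://localhost:1337/v1` and a placeholder model name; the actual port, endpoint paths, and model identifiers come from your Osaurus installation and its documentation:

```python
import json
from urllib import request

# Hypothetical values: the real port and model name depend on your Osaurus setup.
BASE_URL = "http://localhost:1337/v1"
MODEL = "example-mlx-model"


def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-compatible chat completion request for a local server."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("Hello from my Mac!")
print(req.full_url)

# To actually send it (requires a running Osaurus server):
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request stays on `localhost`, no data leaves the machine, which is the point of a local-first runtime.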