
Osaurus

Local Inference

Osaurus is a local-first AI runtime optimized for Apple Silicon that runs open-source models on your Mac with full privacy and no cloud dependency.

At a Glance

Pricing

Free tier available

Free and open-source under the MIT license, with no usage limits.


Available On

macOS

About Osaurus

Osaurus is a local-first AI runtime designed specifically for Apple Silicon Macs, offering fast performance, privacy by default, and open flexibility. It lets users run state-of-the-art open-source AI models directly on their Mac without relying on cloud services, keeping data private and eliminating network latency. The runtime is lightweight, developer-friendly, and integrates seamlessly with Apple's MLX ecosystem.

Blazing Fast Performance: Optimized for M-series chips and written in Swift for seamless local inference.

Privacy by Default: All AI inference runs locally on your device, ensuring your data never leaves your machine.

Open & Flexible: MIT-licensed and API-compatible with OpenAI and Ollama, extensible through plugins and additional models.

Developer Friendly: Simple installation, clean APIs, and both CLI and GUI interfaces for full control.

Community Driven: Open source with active community support and contributions.

To get started, install Osaurus on your Apple Silicon Mac, run your preferred MLX-compatible model, and explore integrations to connect with your favorite tools.
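As a sketch of what the OpenAI-compatible API makes possible, the snippet below builds a standard chat-completions request using only the Python standard library. The host, port, and model name are placeholders, not documented Osaurus defaults; check your local configuration before sending.

```python
import json
import urllib.request

# Assumed local endpoint: Osaurus exposes an OpenAI-compatible API, but the
# host and port here are placeholders -- adjust to your own setup.
BASE_URL = "http://127.0.0.1:1337/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for a local server."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a server running, the request can be sent like this:
# with urllib.request.urlopen(build_chat_request("llama-3.2-3b", "Hello!")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI wire format, existing OpenAI client libraries should also work by pointing their base URL at the local server.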


Pricing


Free and open-source under the MIT license, with no usage limits.

  • Full access to Osaurus runtime
  • Run open-source AI models locally
  • Community support

Capabilities

Key Features

  • Local-first AI runtime
  • Optimized for Apple Silicon M-series chips
  • Privacy-focused with no cloud dependency
  • Open source under the MIT license
  • API-compatible with OpenAI and Ollama
  • Lightweight runtime (~7 MB)
  • CLI and GUI interfaces
  • Integration with Apple's MLX ecosystem

Integrations

OpenAI API
Ollama API
Apple MLX ecosystem
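For the Ollama side of the compatibility story, a request can be built against Ollama's documented `/api/generate` endpoint. As before, this is a sketch: the host, port, and model name are assumptions, and only the request shape follows the Ollama API.

```python
import json
import urllib.request

# Assumed local endpoint: Osaurus advertises Ollama API compatibility; the
# host and port here are placeholders -- adjust to your own setup.
OLLAMA_URL = "http://127.0.0.1:1337/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an Ollama-style generate request (non-streaming)."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a server running, the request can be sent like this:
# with urllib.request.urlopen(build_generate_request("llama-3.2-3b", "Hi")) as r:
#     print(json.load(r)["response"])
```

This means tools already written against Ollama (editors, chat frontends, scripts) can, in principle, be repointed at a local Osaurus server without code changes.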