EveryDev.ai

    Osaurus

    Local Inference

    Osaurus is a local-first AI runtime optimized for Apple Silicon that runs open-source models on Mac with privacy and no cloud dependency.


    At a Glance

    Pricing
    Free

    Free and open-source under MIT license with no usage limits.


    Available On

    macOS

    Resources

    Website · Docs · GitHub · llms.txt

    Topics

    Local Inference · AI Infrastructure · Development Environments

    Alternatives

    Ensu · Lemonade · AI Backends
    Developer
    Dinoki Labs

    Updated Feb 2026

    About Osaurus

    Osaurus is a local-first AI runtime designed specifically for Apple Silicon Macs, offering fast performance, privacy by default, and open flexibility. It lets users run state-of-the-art open-source AI models directly on their Mac without relying on cloud services, keeping data private and eliminating network latency. The runtime is lightweight, developer-friendly, and integrates with Apple's MLX ecosystem.

    Blazing Fast Performance: Optimized for M-series chips and written in Swift for seamless local inference.

    Privacy by Default: All AI inference runs locally on your device, ensuring your data never leaves your machine.

    Open & Flexible: MIT-licensed and API-compatible with OpenAI and Ollama, supporting extensibility through plugins and models.

    Developer Friendly: Simple installation, clean APIs, and both CLI and GUI interfaces for full control.

    Community Driven: Open source with active community support and contributions.

    To get started, install Osaurus on your Apple Silicon Mac, run your preferred MLX-compatible model, and explore integrations to connect with your favorite tools.
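Since Osaurus advertises API compatibility with OpenAI, an existing OpenAI-style client can point at the local server. A minimal sketch of such a request, using only the Python standard library — note that the port (`1337`) and model name shown here are assumptions for illustration; check your Osaurus configuration for the actual values:

```python
# Sketch: querying a locally running Osaurus server through its
# OpenAI-compatible chat-completions endpoint.
# ASSUMPTIONS: port 1337 and the model name are placeholders, not
# confirmed Osaurus defaults -- adjust to your local setup.
import json
import urllib.request

OSAURUS_URL = "http://localhost:1337/v1/chat/completions"  # assumed endpoint

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a local model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OSAURUS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("llama-3.2-3b-instruct", "Summarize MLX in one sentence.")
    # Sending the request requires Osaurus to be running locally:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI wire format, the same request shape works with any OpenAI-compatible client library by overriding its base URL.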



    Pricing

    Free (Open Source)

    Free and open-source under MIT license with no usage limits.

    • Full access to Osaurus runtime
    • Run open-source AI models locally
    • Community support

    Capabilities

    Key Features

    • Local-first AI runtime
    • Optimized for Apple Silicon M-series chips
    • Privacy-focused with no cloud dependency
    • Open source under MIT license
    • API-compatible with OpenAI and Ollama
    • Lightweight runtime (~7 MB)
    • CLI and GUI interfaces
    • Integration with Apple's MLX ecosystem

    Integrations

    OpenAI API
    Ollama API
    Apple MLX ecosystem


    Developer

    Dinoki Labs

    Dinoki Labs builds Osaurus, a local-first AI runtime optimized for Apple Silicon Macs. The team focuses on privacy, speed, and open-source development, creating tools that empower developers and creators to run AI models locally with full control.

    2 employees
    Website · GitHub · X / Twitter

    Similar Tools


    Ensu

    Ensu is a local LLM app by Ente that lets you run and chat with AI language models entirely on your own device, with full privacy.


    Lemonade

    Open-source local LLM server for Windows, Linux, and macOS that runs LLMs, image generation, speech, and more on GPUs and NPUs with an OpenAI-compatible API.


    AI Backends

    Self-hosted open-source AI API server that exposes unified REST endpoints and supports multiple LLM providers for integration into applications.


    Related Topics

    Local Inference

    Tools and platforms for running AI inference locally without cloud dependence.

    67 tools

    AI Infrastructure

    Infrastructure designed for deploying and running AI models.

    180 tools

    Development Environments

    AI-enhanced code editors and IDEs that improve the coding experience.

    119 tools
    With AI, Everyone is a Dev. EveryDev.ai © 2026