    EveryDev.ai
    LM Studio

    AI Development Libraries

    Run local LLMs, chat with documents, and power apps using a local AI server.

    Visit Website

    At a Glance

    Pricing

    Free tier available

Get started with LM Studio at no cost; the free tier includes running models locally and document RAG.

Enterprise: custom pricing (contact sales)
Teams (coming soon): custom pricing (contact sales)

Engagement

36 views

    Available On

    macOS
    Windows
    Linux
    API

    Resources

Website
Docs
GitHub
llms.txt

    Topics

AI Development Libraries
Local Inference

    Alternatives

BitNet
parakeet.cpp
MLX-VLM

    Developer

    Yagil Burowski

    Updated Feb 2026

    About LM Studio

    LM Studio is a cross-platform toolkit that lets you download and run open-source LLMs (like Llama, Qwen, DeepSeek, Gemma) directly on your laptop or desktop. You can use it for chatting, retrieving answers from local documents (RAG), or hosting a local LLM server to integrate into your own applications.

It includes a discovery UI for finding Hugging Face models in GGUF or MLX format, a chat interface, a document RAG system, and a local server mode for API access. Developers can use the LM Studio SDK (via npm) to build local AI apps without managing model dependencies.

    LM Studio is free for both personal and commercial use, with paid plans only needed for enterprise features like private sharing, SSO, and model access control.
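The local server mode mentioned above exposes an OpenAI-compatible REST API, so any OpenAI-style client can talk to it. A minimal sketch in Python, assuming the server is running on its default port (1234) with a model loaded; the `local-model` placeholder and the `ask` helper are illustrative, not part of LM Studio itself:

```python
import json
import urllib.request

# LM Studio's local server speaks an OpenAI-compatible REST API.
# By default it listens on http://localhost:1234 (configurable in the app).
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload.

    The model name is a placeholder for whichever model you have loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt: str) -> str:
    """Send the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-style response shape: first choice's message content.
    return body["choices"][0]["message"]["content"]
```

Because the endpoint mirrors the OpenAI chat-completions shape, existing OpenAI client libraries can also be pointed at `BASE_URL` directly.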



    Pricing

    FREE

    Free Plan Available

Get started with LM Studio at no cost; the free plan includes running models locally and document RAG.

    • Run models locally
    • Document RAG
    • Local server mode
    • Use at home and at work

    Enterprise

Enterprise-grade plan with SSO, private model sharing, and dedicated support.

    Custom
    contact sales
    • SSO
    • Private model sharing
    • Model/MCP access control
    • Audit and team management

    Teams (coming soon)

Designed for teams, with private artifact sharing and self-serve collaboration tools.

    Custom
    contact sales
    • Private artifact sharing
    • Self-serve collaboration tools
    View official pricing

    Capabilities

    Key Features

    • Run open-source LLMs locally on Mac, Windows, or Linux
    • Chat interface with support for document RAG
    • Host local AI server with REST API
    • Discover and download GGUF and MLX models
    • Cross-platform support for Llama.cpp and MLX formats
    • Free SDK for building local AI apps
    • Team and enterprise collaboration via LM Hub (upcoming)

    Integrations

    Hugging Face
    Llama.cpp
    MLX
    API Available
    View Docs


    Developer

    Yagil Burowski

Website
GitHub
X / Twitter
    1 tool in directory

    Similar Tools


    BitNet

    Microsoft's official implementation of BitNet, enabling efficient 1-bit large language model inference on CPUs without requiring GPUs.


    parakeet.cpp

    parakeet.cpp is a lightweight C++ implementation for running Parakeet speech recognition models locally with fast, offline transcription capabilities.


    MLX-VLM

    A Python library for running Vision Language Models on Apple Silicon using the MLX framework.

    Browse all tools

    Related Topics

    AI Development Libraries

    Programming libraries and frameworks that provide machine learning capabilities, model integration, and AI functionality for developers.

    121 tools

    Local Inference

    Tools and platforms for running AI inference locally without cloud dependence.

    54 tools
    Browse all topics