Crush
Crush is a terminal-based AI coding assistant developed by Charm that integrates with multiple large language models to provide intelligent code assistance directly from the command line. It connects your development tools, codebase, and workflows to your preferred LLM provider, functioning as what the developers call "your new coding bestie" for the terminal.
Crush supports a wide range of LLM providers including Anthropic, OpenAI, Google Gemini, Groq, Amazon Bedrock, Azure OpenAI, and local models through Ollama or LM Studio. Users can switch between models mid-session while preserving conversation context.
- Multi-Model Support - Connect to Anthropic, OpenAI, Google Gemini, Groq, Cerebras, OpenRouter, Amazon Bedrock, Vertex AI, or local models via Ollama and LM Studio
- LSP Integration - Uses Language Server Protocol for additional code context, providing the same intelligence your IDE uses
- MCP Extensibility - Extend capabilities through Model Context Protocol servers supporting stdio, HTTP, and SSE transports
- Session Management - Maintain multiple work sessions and contexts per project for organized development workflows
- Agent Skills - Supports the Agent Skills open standard for extending capabilities with reusable skill packages
- Cross-Platform - First-class support on macOS, Linux, Windows (PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD
- Custom Providers - Configure OpenAI-compatible or Anthropic-compatible APIs for additional model access
- Project Initialization - Analyzes your codebase and creates context files to improve future interactions
Crush can be installed via Homebrew, npm, Winget, Scoop, apt, yum, Nix, or direct binary downloads. Configuration supports LSP servers for language-specific context, MCP servers for external integrations, and custom provider endpoints for specialized deployments.
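As a concrete starting point, the sketch below installs Crush with two of the package managers mentioned above and writes a minimal project-level configuration wiring up an LSP server, an MCP server, and a local OpenAI-compatible provider. The Homebrew tap and npm package names reflect the project's README at the time of writing; the config filename and key names (lsp, mcp, providers) are assumptions based on the features described here rather than a verified schema, so consult the official Crush documentation for the exact format.

```sh
# Install Crush via Homebrew or npm (other package managers are listed in the README)
brew install charmbracelet/tap/crush
# or:
npm install -g @charmland/crush

# Hypothetical project-level config: one LSP server for code context, one MCP
# server over HTTP, and a local OpenAI-compatible provider (e.g. Ollama).
# Filename and key names are assumptions -- verify against the Crush docs.
cat > crush.json <<'EOF'
{
  "lsp": {
    "go": { "command": "gopls" }
  },
  "mcp": {
    "docs": { "type": "http", "url": "https://example.com/mcp" }
  },
  "providers": {
    "local": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "models": [{ "id": "llama3.1", "name": "Llama 3.1" }]
    }
  }
}
EOF
```

Running crush from that directory should then pick up the project-level configuration, again assuming the filename above matches one of the locations Crush checks.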
Pricing and Plans
Free
Free to use under the FSL-1.1-MIT license (converts to MIT after two years)
- Full AI coding assistant functionality
- Multi-model LLM support
- LSP integration for code context
- MCP server extensibility
- Session management
- Cross-platform support