Crush

Command Line Assistants

An AI-powered coding assistant CLI that connects your tools, code, and workflows to your choice of LLM.

At a Glance

Pricing

Free tier available

Free to use under the FSL-1.1-MIT license (converts to MIT after two years)

Available On

Windows
macOS
Linux

About Crush

Crush is a terminal-based AI coding assistant developed by Charm that integrates with multiple large language models to provide intelligent code assistance directly from the command line. It connects your development tools, codebase, and workflows to your preferred LLM provider, functioning as what the developers call "your new coding bestie" for the terminal.

Crush supports a wide range of LLM providers including Anthropic, OpenAI, Google Gemini, Groq, Amazon Bedrock, Azure OpenAI, and local models through Ollama or LM Studio. Users can switch between models mid-session while preserving conversation context.

  • Multi-Model Support - Connect to Anthropic, OpenAI, Google Gemini, Groq, Cerebras, OpenRouter, Amazon Bedrock, Vertex AI, or local models via Ollama and LM Studio
  • LSP Integration - Uses Language Server Protocol for additional code context, providing the same intelligence your IDE uses
  • MCP Extensibility - Extend capabilities through Model Context Protocol servers supporting stdio, HTTP, and SSE transports
  • Session Management - Maintain multiple work sessions and contexts per project for organized development workflows
  • Agent Skills - Supports the Agent Skills open standard for extending capabilities with reusable skill packages
  • Cross-Platform - First-class support on macOS, Linux, Windows (PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD
  • Custom Providers - Configure OpenAI-compatible or Anthropic-compatible APIs for additional model access
  • Project Initialization - Analyzes your codebase and creates context files to improve future interactions
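
As an illustration of the LSP and MCP features above, a minimal Crush configuration file might look like the following. This is a sketch only: the key names (`lsp`, `mcp`, `type`, `command`) follow common Crush configuration conventions, but the exact schema should be checked against the official documentation.

```json
{
  "lsp": {
    "go": { "command": "gopls" },
    "typescript": { "command": "typescript-language-server", "args": ["--stdio"] }
  },
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "docs": {
      "type": "http",
      "url": "https://example.com/mcp"
    }
  }
}
```

The `lsp` block attaches language servers that supply code intelligence, while each `mcp` entry declares an external tool server and its transport (stdio or HTTP in this sketch).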

Installation is available through Homebrew, npm, Winget, Scoop, apt, yum, Nix, or direct binary downloads. Configuration supports LSP servers for language-specific context, MCP servers for external integrations, and custom provider endpoints for specialized deployments.
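
For the custom provider endpoints mentioned above, an OpenAI-compatible local model server (such as Ollama) could be wired in along these lines. The `base_url`, `api_key`, and model identifiers are placeholders, and the field names are illustrative rather than the definitive schema:

```json
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        { "id": "llama3.3", "name": "Llama 3.3 70B" }
      ]
    }
  }
}
```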

Demo Video

A demo video for Crush is available on YouTube.


Pricing

Free Plan Available

Free to use under the FSL-1.1-MIT license (converts to MIT after two years)

  • Full AI coding assistant functionality
  • Multi-model LLM support
  • LSP integration for code context
  • MCP server extensibility
  • Session management

Capabilities

Key Features

  • Multi-model LLM support (Anthropic, OpenAI, Gemini, Groq, Bedrock, local models)
  • Mid-session model switching with context preservation
  • LSP integration for code intelligence
  • MCP server support (stdio, HTTP, SSE)
  • Session-based context management
  • Agent Skills support
  • Custom provider configuration
  • Project initialization and codebase analysis
  • Git integration with attribution
  • .crushignore file support
  • Configurable tool permissions
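
The `.crushignore` file noted above uses gitignore-style patterns to keep files out of Crush's context. The contents below are an illustrative example, not a required template:

```
# Exclude build artifacts and dependencies from context
node_modules/
dist/
*.log

# Keep secrets out of prompts
.env
```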

Integrations

Anthropic Claude
OpenAI GPT
Google Gemini
Groq
Amazon Bedrock
Azure OpenAI
Vertex AI
OpenRouter
Cerebras
Hugging Face
Ollama
LM Studio
Language Server Protocol (LSP)
Model Context Protocol (MCP)
Git