LLM CLI
LLM is a lightweight command-line tool and Python library for working with large language models. It runs prompts and chat sessions, streams output, manages system prompts and templates, and logs everything to SQLite for later search and analysis. It supports structured outputs via JSON schemas, accepts multimodal inputs (images, audio, video) through attachments, and can grant models controlled access to tools. Embeddings are first-class: you can generate, store, and run similarity search against vectors in SQLite. A plugin system adds providers and local runtimes—OpenAI, Anthropic (Claude), Google Gemini, Mistral, Ollama, llama.cpp, GPT4All, and more—so you can mix remote APIs with models running on your own machine. It installs with pip, pipx, Homebrew, or uv, and is usable as a Python API as well as a CLI.
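A minimal sketch of the workflow described above, assuming an OpenAI API key and the default model names; these commands require network access and a configured key, so treat them as illustrative:

```shell
# Install the CLI in an isolated environment (pip, Homebrew, and uv also work).
pipx install llm

# Store an API key once; it is saved for future invocations.
llm keys set openai

# Run a one-off prompt; output streams to the terminal and is logged to SQLite.
llm 'Five creative names for a pet pelican'

# Start an interactive chat session with a specific model.
llm chat -m gpt-4o-mini

# Search the SQLite log of past prompts and responses.
llm logs -q pelican

# Install a plugin to add a local runtime alongside remote APIs.
llm install llm-gpt4all
```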
Pricing and Plans

| Plan | Price | Features |
|---|---|---|
| Open Source | Free | |