# LLM CLI

> Open-source CLI and Python library to run prompts, chat, embeddings, schemas, and tool use across OpenAI, Claude, Gemini, and local models (Ollama, llama.cpp). Stores logs and vectors in SQLite and is extensible via plugins.

LLM is a lightweight command-line tool and Python library for working with large language models. It runs prompts and chat sessions, streams output, manages system prompts and templates, and logs everything to SQLite for later search and analysis. It supports structured output via JSON schemas, multimodal input (images, audio, video) through attachments, and can grant models controlled access to tools. Embeddings are first-class: you can generate, store, and run similarity search against vectors in SQLite. The plugin system adds providers and local runtimes (OpenAI, Anthropic Claude, Google Gemini, Mistral, Ollama, llama.cpp, GPT4All, and more), so you can mix remote APIs with models that run on your own machine. It installs with pip, pipx, Homebrew, or uv, and is usable as a Python API as well as a CLI.
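A minimal quick-start sketch using the documented CLI. It assumes an OpenAI key; the model ID (`gpt-4o-mini`) is an example, not the only option, and any provider added via a plugin works the same way:

```shell
# Install the CLI (pipx, Homebrew, and uv also work, per the docs)
pip install llm

# Store an API key for a provider in LLM's key registry
llm keys set openai

# Run a one-off prompt; output streams to the terminal by default
llm "Five creative names for a pet pelican"

# Pick a specific model (-m) and add a system prompt (-s)
llm -m gpt-4o-mini -s "Answer in one sentence" "What is SQLite?"

# Start an interactive chat session
llm chat

# Review previously logged prompts and responses from the SQLite log
llm logs
```

List available models (including those added by plugins) with `llm models`.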
## Features

- Prompt execution with streaming output
- Interactive chat mode (`llm chat`)
- System prompts, templates, and fragments for long-context work
- SQLite logging of prompts, responses, token usage, and metadata
- Schema-based structured output (JSON) from models
- Embeddings: generate, store, and similarity-search in SQLite
- Multimodal attachments (image, audio, video) where models support them
- Pluggable providers and local runtimes via `llm install`
- Tool use / function calling support, with safety notes
- API key management (`llm keys`) and model discovery (`llm models`)
- Python API parity for prompts, schemas, tools, fragments, and streaming
- Configurable user content directory and custom locations

## Integrations

OpenAI, Anthropic Claude, Google Gemini, Mistral AI, Ollama, llama.cpp / GGUF, GPT4All, Cohere, Groq, Replicate, OpenRouter, Azure OpenAI

## Platforms

Windows, macOS, Linux, Developer SDK

## Pricing

Open Source

## Version

0.27.1

## Links

- Website: https://llm.datasette.io/en/stable/
- Documentation: https://llm.datasette.io/en/stable/
- Repository: https://github.com/simonw/llm
- EveryDev.ai: https://www.everydev.ai/tools/llm-cli