
BAML

AI Development Libraries

Domain-specific language and toolchain for type-safe LLM functions, structured outputs, and multi-provider orchestration.

At a Glance

Pricing

Free tier available

Get started with BAML at no cost, including the CLI, editor extensions, and unlimited BAML schemas.

Team: $25/mo
Enterprise: Custom (contact sales)

Available On

Web
Windows
macOS
Linux
API

About BAML

BAML (by Boundary) is a domain-specific language and toolchain for building reliable, type-safe LLM workflows. You define functions, types, and clients in .baml files, then BAML generates language clients (e.g., Python/TypeScript) that call providers like OpenAI or Anthropic and return validated, structured results. The VS Code/Cursor extension includes an integrated playground with prompt preview and raw cURL visibility. BAML supports streaming typed outputs, multimodal inputs (PDFs, images, audio, video—depending on provider), checks/asserts for validation, dynamic types via a TypeBuilder, and a Collector for usage and raw response inspection. You can also expose functions as REST endpoints with an OpenAPI spec and generate clients for other languages.
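
For illustration, here is a minimal sketch of what a .baml definition might look like. The class, client, and function names (Invoice, GPT4o, ExtractInvoice) are invented for this example; the structure follows the patterns described above.

```baml
// Illustrative only: a typed output class, a provider client, and an LLM function.
class Invoice {
  vendor string
  total float
  line_items string[]
}

client<llm> GPT4o {
  provider openai
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}

function ExtractInvoice(document_text: string) -> Invoice {
  client GPT4o
  prompt #"
    Extract the invoice details from the text below.

    {{ document_text }}

    {{ ctx.output_format }}
  "#
}
```

Running the BAML generator then produces a typed client (for example, a `baml_client` package in Python or TypeScript) whose `ExtractInvoice` call returns a validated `Invoice`.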

Demo Video

Watch the BAML demo video on YouTube.

Pricing

FREE

Free Plan Available

Get started with BAML at no cost, including the CLI, editor extensions, and unlimited BAML schemas.

  • CLI & editor extensions
  • Unlimited BAML schemas
  • TypeScript generation
  • Basic schema validation
  • Local development
TRIAL

14 days

Try BAML free for 14 days.

Team

Designed for teams: advanced type generation, runtime validation, and collaboration features.

$25
per month
  • Advanced type generation
  • Runtime validation
  • Unlimited schemas
  • Team collaboration
  • Private schemas
  • Custom transformations
  • Priority support
  • Annual billing available (save 20%)

Enterprise

Enterprise-grade plan with on-premise deployment, SSO & SAML, and dedicated support.

Custom
contact sales
  • On-premise deployment
  • SSO & SAML
  • Custom rate limits
  • Audit logs & compliance
  • 99.9% uptime SLA
  • Dedicated account manager
  • Custom training & onboarding
View official pricing

Capabilities

Key Features

  • Define LLM functions with typed inputs/outputs and generate language clients from `.baml` files.
  • VS Code/Cursor playground with prompt preview and raw cURL request view.
  • Provider-agnostic client config; switch providers/models at runtime.
  • Streaming structured outputs with type guarantees, including partial field updates (illustrated in the sketch after this list).
  • Expose functions as REST endpoints and auto-generate OpenAPI client SDKs.
  • Dynamic types at runtime using TypeBuilder for classes/enums.
  • Checks and asserts to validate values and enforce constraints.
  • Collector API for token usage, timings, and raw LLM responses (also shown in the sketch below).
  • Multimodal inputs (PDF, image, audio, video) where supported by providers.
  • Concurrency, retries, error handling, and abort/timeout controls.
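
As a rough sketch of how the streaming and Collector features above might look from the generated Python client (continuing the hypothetical ExtractInvoice/Invoice example; exact import paths and method names may vary by version):

```python
# Hedged sketch: assumes a `baml_client` package generated from the
# hypothetical ExtractInvoice/Invoice definitions; APIs may differ by version.
from baml_client import b
from baml_py import Collector

# Capture token usage, timings, and raw responses for inspection.
collector = Collector(name="invoice-run")

# Stream partially-filled, typed results as tokens arrive.
stream = b.stream.ExtractInvoice(
    "ACME Corp invoice: 2x widgets @ $10.00, total $20.00",
    baml_options={"collector": collector},
)
for partial in stream:
    # Fields of the partial Invoice fill in incrementally (may be None early on).
    print(partial.total)

invoice = stream.get_final_response()  # fully validated Invoice
print(invoice.vendor, invoice.total)

# Usage recorded by the Collector for the most recent call.
print(collector.last.usage)
```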

Integrations

OpenAI
Anthropic
Google Gemini / Vertex AI
AWS Bedrock
Azure OpenAI
OpenAPI Generator
Visual Studio Code
Cursor
JetBrains IDEs
Next.js
Vercel
API Available
View Docs