# Beam

> AI infrastructure platform for developers to run sandboxes, inference, and training with ultrafast boot times and instant autoscaling.

Beam provides AI infrastructure for developers, enabling them to run sandboxes, inference, and training workloads with ultrafast boot times, instant autoscaling, and a streamlined developer experience. The platform is 100% open source and can run on Beam's cloud or self-hosted infrastructure, making it flexible for various deployment needs.

- **Secure Runtime for Agents** provides a secure environment to run and test agent-generated code, built specifically for AI agents, code interpreters, or any application that needs to execute untrusted code at scale.
- **Dynamic Container Environments** allow you to capture the complete state of any sandbox — including filesystem, memory, and running processes — as an immutable snapshot for reproducibility.
- **Elastic GPU Scaling** enables automatic scaling up and down with traffic, with pay-per-use billing down to the CPU cycle for cost efficiency.
- **Custom Model Inference** lets you host any custom model on GPU or CPU with your own container images.
- **Sandboxed Code Execution** runs LLM-generated code in secure execution environments, ideal for AI agent workflows.
- **Training & Fine-Tuning** supports training and fine-tuning of models, from SLMs and LLMs to diffusion models.
- **Easy Local Debugging** makes it simple to test code before deploying, using the exact configuration you'll run in production.
- **Multiple Workers Per Container** allows vertical scaling by running multiple workers on the same container.
- **Deploy from GitHub Actions** integrates with existing CI/CD pipelines for automatic API deployments.

To get started, install the Beam SDK and use simple Python decorators to define your workloads. Switch hardware by changing one line of Python, and deploy instantly with the Beam CLI or GitHub Actions integration.
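The decorator-driven workflow above can be sketched as follows. To keep the snippet runnable without the SDK installed, `endpoint` here is a local stand-in for a Beam-style decorator (the real one ships in the `beam` package; its exact name and parameters may differ) — it only illustrates the pattern of declaring hardware alongside the function:

```python
# Stand-in for a Beam-style decorator: it records the requested
# hardware on the function and returns it unchanged. The real SDK
# decorator would instead package the function for remote execution.
def endpoint(cpu=1, memory="1Gi", gpu=None):
    def wrap(fn):
        fn.config = {"cpu": cpu, "memory": memory, "gpu": gpu}
        return fn
    return wrap

# Switching hardware is a one-line change, e.g. gpu="T4" -> gpu="A100".
@endpoint(cpu=2, memory="4Gi", gpu="T4")
def predict(prompt: str) -> dict:
    # Your inference logic would run here.
    return {"echo": prompt}

print(predict("hello"))      # the function still runs locally as-is
print(predict.config["gpu"]) # T4
```

Because the decorated function remains an ordinary Python callable, it can be invoked locally for debugging before being deployed with the Beam CLI.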
## Features

- Secure runtime for agents
- Dynamic container environments
- Elastic GPU scaling
- Custom model inference
- Sandboxed code execution
- Training and fine-tuning
- Audio processing pipelines
- Streamlit and Gradio UIs
- Web scraping with Chromium
- Easy local debugging
- Multiple workers per container
- Import remote Dockerfiles
- Deploy from GitHub Actions
- Storage volumes
- Scheduled jobs
- Task queues

## Integrations

GitHub Actions, Docker, Hugging Face, Streamlit, Gradio, Chromium

## Platforms

WEB, API, DEVELOPER_SDK

## Pricing

Open Source

## Links

- Website: https://www.beam.cloud/
- Documentation: https://docs.beam.cloud/
- Repository: https://github.com/beam-cloud/beta9
- EveryDev.ai: https://www.everydev.ai/tools/beam