Getting Started
Introduction
Deploy inference endpoints, train AI models, and autoscale to hundreds of GPUs, without managing infrastructure.
Features
- Scale out workloads to thousands of GPU (or CPU) containers
- Ultra-fast cold starts for custom ML models
- Automatic scaling up and down to zero
- Flexible distributed storage for storing models and function outputs
- Distribute workloads across multiple cloud providers
- Easily deploy task queues and functions using simple Python abstractions
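To illustrate the last point, the "simple Python abstractions" pattern is a decorator that turns an ordinary function into a queue of deferred calls. The sketch below is an in-process mock of that pattern, not Beam's actual API; the names `task_queue` and `TaskQueue` are hypothetical:

```python
from collections import deque


class TaskQueue:
    """Minimal in-process task queue: decorate a function,
    enqueue calls, then drain them later."""

    def __init__(self, func):
        self.func = func
        self.pending = deque()

    def put(self, *args, **kwargs):
        # Record the call instead of running it immediately.
        self.pending.append((args, kwargs))

    def drain(self):
        # Run every queued call in FIFO order and collect results.
        results = []
        while self.pending:
            args, kwargs = self.pending.popleft()
            results.append(self.func(*args, **kwargs))
        return results


def task_queue(func):
    # Decorator that wraps a plain function in a TaskQueue.
    return TaskQueue(func)


@task_queue
def square(x):
    return x * x


square.put(2)
square.put(3)
print(square.drain())  # → [4, 9]
```

In a real serverless setup the queue would live on a server and the calls would run in remote containers, but the programming model the bullet describes is the same: decorate a plain function, then push work to it.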
Quickstart
Create an account on Beam and install the Beam SDK to get started:
pip install beam-client
How It Works
Beam is designed for launching remote serverless containers quickly. There are a few things that make this possible:
- A custom, lazy loading image format (CLIP) backed by S3/FUSE
- A fast, Redis-based container scheduling engine
- Content-addressed storage for caching images and files
- A custom runc container runtime
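The content-addressed storage mentioned above keys each blob by a hash of its bytes, so identical images or files are stored exactly once and a cache hit is a pure hash lookup. A minimal sketch of the idea (not Beam's implementation; `ContentStore` is a hypothetical name):

```python
import hashlib


class ContentStore:
    """Toy content-addressed store: a blob's key is the
    SHA-256 digest of its bytes."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        # Identical content always hashes to the same key,
        # so duplicate uploads are deduplicated automatically.
        self._blobs.setdefault(digest, data)
        return digest

    def get(self, digest: str) -> bytes:
        return self._blobs[digest]


store = ContentStore()
key_a = store.put(b"model-layer-bytes")
key_b = store.put(b"model-layer-bytes")  # same content → same key, stored once
assert key_a == key_b
assert store.get(key_a) == b"model-layer-bytes"
```

Because the key is derived from the content itself, two container images that share a layer share its storage, which is what makes cached cold starts cheap.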
Getting Started Guides
Tutorials