Beam vs. Cog

Beam and Cog are fundamentally different products. Cog is a wrapper around Docker, whereas Beam provides a custom container runtime. Moreover, Cog’s interface is YAML, whereas Beam’s interface is pure Python.

Beam’s custom container runtime is the magic behind its fast cold starts. In addition, Beam provides other utilities for developers building AI apps, like the ability to cache models in storage volumes and run development previews.
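For example, model weights can be cached in a persistent volume so they aren’t re-downloaded on every cold start. Here is a minimal sketch, assuming the Volume class from the Beam SDK; the volume name and mount path are illustrative:

from beam import App, Runtime, Volume

app = App(
    name="inference",
    runtime=Runtime(),
    # Files written to ./cache persist across runs, so cold starts can skip re-downloading weights
    volumes=[Volume(name="model-cache", path="./cache")],
)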

You don’t need Docker to use Beam. The only requirement is having the Beam CLI and SDK installed.

Refactoring a Cog to Beam

A Cog has two parts: a predict.py, which is the entry point to your model, and a cog.yaml, which defines the libraries and dependencies to package with your app.

With Beam, your predict.py and cog.yaml will be combined into the same Python file.

You don’t necessarily need to keep your entire Beam app in a single file; feel free to split your code into different files and import the modules accordingly.
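For example, shared logic could live in its own module that your Beam app imports (the helpers.py filename and function below are hypothetical):

helpers.py
def make_greeting(text: str) -> str:
    # Shared logic imported by the Beam app
    return "hello " + text

In your Beam file, import it as usual with from helpers import make_greeting and call it inside your handler.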

Migrate the logic in predict.py

The first thing we’ll do is refactor the predict.py portion of the Cog to use Beam primitives.

This is the original predict.py file:

from cog import BasePredictor, Input


class Predictor(BasePredictor):
    def setup(self):
        self.prefix = "hello"

    def predict(self, text: str = Input(description="Text to prefix with 'hello '")) -> str:
        return self.prefix + " " + text

To migrate predict.py to Beam, create a new Python file. You can call it whatever you want, but for this example, we will name it app.py.

app.py
from beam import App, Runtime

app = App(
    name="inference",
    runtime=Runtime(),
)


@app.rest_api()
def predict(**inputs):
    # Same logic as the Cog's predict(): prefix the input text with "hello"
    return {"response": "hello " + inputs["text"]}
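The keys of the JSON request body are passed to predict as keyword arguments, so a payload like {"text": "world"} would return {"response": "hello world"}.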

Migrate the image and dependencies in cog.yaml

Now, we’ll move the dependencies in cog.yaml into the app.py file that contains your Beam app:

cog.yaml
build:
  python_version: "3.8"
predict: "predict.py:Predictor"

These dependencies go in the Image class.

app.py
from beam import App, Image, Runtime

app = App(
    name="inference",
    runtime=Runtime(
        memory="1Gi",
        cpu=1,
        gpu="T4",
        image=Image(
            python_version="python3.8",
        ),
    ),
)


@app.rest_api()
def predict(**inputs):
    return {"response": "hello " + inputs["text"]}
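Note that memory, cpu, and gpu are Beam compute settings rather than values taken from cog.yaml; set them to match the hardware your model needs.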

Adding System Commands & Python Packages

This is a simple Hello World example, but your real-world Cog app probably has additional dependencies. Suppose your cog.yaml looks like this:

cog.yaml
build:
  python_version: "3.8"
  python_packages:
    - "pillow==8.2.0"
  system_packages:
    - "libpng-dev"
    - "libjpeg-dev"
predict: "predict.py:Predictor"

To add these dependencies to a Beam app, use the python_packages and commands fields in Image():

app.py
from beam import App, Image, Runtime

app = App(
    name="inference",
    runtime=Runtime(
        memory="1Gi",
        cpu=1,
        gpu="T4",
        image=Image(
            python_version="python3.8",
            python_packages=["pillow==8.2.0"],
            commands=["apt-get update && apt-get install -y libpng-dev libjpeg-dev"]
        ),
    ),
)


@app.rest_api()
def predict(**inputs):
    return {"response": "hello " + inputs["text"]}
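As a rough sketch of how those dependencies might be used, the handler below decodes a base64-encoded image from the request with Pillow (the image field name is hypothetical):

import base64
from io import BytesIO

from PIL import Image as PILImage  # aliased to avoid clashing with beam's Image class


@app.rest_api()
def predict(**inputs):
    # Decode the base64-encoded image sent in the request body
    image = PILImage.open(BytesIO(base64.b64decode(inputs["image"])))
    return {"width": image.width, "height": image.height}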

Test Your Beam Code

To test your Beam code, you can spin up a temporary dev server:

beam serve app.py:predict

The dev server runs in the cloud, so you can test your code changes live as you work:

(.venv) user@MacBook demo % beam serve app.py:predict
 i  Using cached image.
 ✓  App initialized.
 i  Uploading files...
 ✓  Container scheduled, logs will appear below.
⠴ Starting container... 5s (Estimated: 3m20s)

================= Call the API =================

curl -X POST 'https://apps.beam.cloud/serve/3dpga/650b636542ef2e000aef54fa' \
-H 'Accept: */*' \
-H 'Accept-Encoding: gzip, deflate' \
-H 'Connection: keep-alive' \
-H 'Authorization: Basic [YOUR_AUTH_TOKEN]' \
-H 'Content-Type: application/json' \
-d '{}'

============= Logs Streamed Below ==============

INFO:     | Starting app...
INFO:     | Loading handler in 'app.py:predict'...
INFO:     | Ready for tasks.
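The generated cURL command sends an empty JSON body; replace it with the fields your handler expects, for example -d '{"text": "world"}'.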

Deploy to Beam

When you are ready to deploy your app, run this command:

beam deploy app.py

This will package your app and spin up a persistent web endpoint. You can then invoke the deployed API with a cURL request:

curl -X POST --compressed "https://apps.beam.cloud/ahg0v" \
   -H 'Accept: */*' \
   -H 'Accept-Encoding: gzip, deflate' \
   -H 'Authorization: Basic [YOUR_AUTH_TOKEN]' \
   -H 'Connection: keep-alive' \
   -H 'Content-Type: application/json' \
   -d '{"text": "hello world"}'

Next Steps

After you’ve refactored your app to run on Beam, there are more optimizations you can take advantage of.