Other agent frameworks can't...

• Multi-task
• Automatically synchronize state
• Run each task in an isolated environment
• Scale to 100s of GPUs

Beam gives you everything you need

• Sandboxed compute environments
• Concurrency
• Task management and queuing
• Edge deployment and autoscaling
• Authentication
• Lots of GPUs

Introduction

Today, most agent frameworks are built on directed acyclic graphs (DAGs). While DAGs are useful for simple tasks, they limit you to performing one action at a time (i.e. using one tool at a time).

Beam uses a new agentic concurrency model based on Petri nets, which can multi-task across complex, parallel workflows.

By combining this agent framework with Beam’s cloud compute, you can build powerful, parallelized applications.

Core Concepts

Our agent framework has three important components: locations, transitions, and markers.

For this example, suppose we’re modeling an eCommerce store.

  • Locations — specific states or conditions that hold tokens. For example, in_shopping_cart or in_queue.
  • Transitions — events or actions that cause state changes. For example, place_order or accept_payment. Each transition has:
    • Inputs — the locations that data is consumed from.
    • Outputs — the locations that data is sent to.
  • Markers — the pieces of data that move between locations. For example, order_12345_red_shoes.

Think of a Petri net as a factory assembly line, where parts (markers) move between workstations (locations), and tasks (transitions) are performed when all required parts are in place.
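To make this concrete, here is a minimal sketch of the eCommerce example using the API introduced in the rest of this guide. The CartItem and PlacedOrder markers and the place_order transition are hypothetical names chosen for illustration:

from beam import Bot, BotContext, BotLocation
from pydantic import BaseModel


# Markers: the pieces of data that flow through the net (hypothetical examples)
class CartItem(BaseModel):
    sku: str


class PlacedOrder(BaseModel):
    order_id: str


# Locations hold markers -- one location per marker type
bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[
        BotLocation(marker=CartItem),
        BotLocation(marker=PlacedOrder),
    ],
    description="An example eCommerce bot.",
)


# Transitions fire when their input markers are available
@bot.transition(
    cpu=1,
    memory=128,
    inputs={CartItem: 1},
    outputs=[PlacedOrder],
    description="Places an order for an item in the cart.",
)
def place_order(context: BotContext, inputs):
    item = inputs[CartItem][0]
    return {PlacedOrder: [PlacedOrder(order_id=f"order_{item.sku}")]}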

Initial Setup

Let’s set up a simple hello world chatbot. This chatbot will respond to messages from a user. It will ask the user for their name, and attempt to update the status of their order.

Hello World

We’ll start by creating the bot itself. We’ll add state markers, locations, and a transition in the following steps:

app.py
from beam import Bot


# Create the bot -- make sure to add your own OpenAI API key
bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[],
    description="A simple bot to cancel orders.",
)

Managing State

Now we’ll add locations and markers, which represent state.

app.py
from beam import Bot, BotContext, BotLocation
from pydantic import BaseModel


# Marker states for the bot
class UserName(BaseModel):
    name: str


class OrderStatus(BaseModel):
    message: str


# Create the bot -- make sure to add your own OpenAI API key
bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[
        BotLocation(marker=UserName),
        BotLocation(marker=OrderStatus),
    ],
    description="A simple bot to cancel orders.",
)

Adding Transitions

Let’s add our first transition. A transition is a state change: it consumes a marker from the UserName location and produces a marker in the OrderStatus location.

app.py
from beam import Bot, BotContext, BotLocation
from pydantic import BaseModel


# Marker states for the bot
class UserName(BaseModel):
    name: str


class OrderStatus(BaseModel):
    message: str


# Create the bot -- make sure to add your own OpenAI API key
bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[
        BotLocation(marker=UserName),
        BotLocation(marker=OrderStatus),
    ],
    description="A simple bot to cancel orders.",
)


# This transition prompts the user for their name and cancels their orders
@bot.transition(
    cpu=1,
    memory=128,
    inputs={UserName: 1},
    outputs=[OrderStatus],
    description="Cancels a users order.",
)

Interacting with User Input

Let’s add some basic logic to the transition. We’ll accept a user’s name and update a dict with their order status.

Adding Prompts

We’ll introduce a new concept called context: a BotContext object that provides various convenience methods for your bot.

app.py
from beam import Bot, BotContext, BotLocation
from pydantic import BaseModel


# Marker states for the bot
class UserName(BaseModel):
    name: str


class OrderStatus(BaseModel):
    message: str


# Hardcoded user data (mock database)
USER_DATA = {"Alice": "processing", "Bob": "shipped"}

# Create the bot -- make sure to add your own OpenAI API key
bot = Bot(
    model="gpt-4o",
    api_key="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",
    locations=[
        BotLocation(marker=UserName),
        BotLocation(marker=OrderStatus),
    ],
    description="A simple bot to cancel orders.",
)


# This transition prompts the user for their name and cancels their orders
@bot.transition(
    cpu=1,
    memory=128,
    inputs={UserName: 1},
    outputs=[OrderStatus],
    description="Cancels a users order.",
)
def cancel_order(context: BotContext, inputs):
    # Get the name provided by the user
    user_name = inputs[UserName][0].name

    # Update the user's order status
    USER_DATA[user_name] = "cancelled"

    # Send a message in the chat
    context.say(f"✅ Order cancelled for {user_name}")

    # Return a marker state
    return {OrderStatus: [OrderStatus(message="order_cancelled")]}

Human-in-the-loop

We can also add a confirmation prompt, so that user input is required before the bot can proceed to the next step.

Let’s add the confirm flag to the transition:

app.py
@bot.transition(
    cpu=1,
    memory=128,
    inputs={UserName: 1},
    outputs=[OrderStatus],
    description="Cancels a users order.",
    confirm=True
)

The bot will only proceed if the user confirms the request.
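Alternatively, you can ask for confirmation from inside the transition body using context.confirm() (covered under Available Commands below). Here is a sketch reusing the markers and USER_DATA from above, assuming confirm() accepts a prompt string and returns True when the user says yes:

@bot.transition(
    cpu=1,
    memory=128,
    inputs={UserName: 1},
    outputs=[OrderStatus],
    description="Cancels a user's order.",
)
def cancel_order(context: BotContext, inputs):
    user_name = inputs[UserName][0].name

    # Assumption: confirm() blocks and returns True only when the user says yes
    if not context.confirm(f"Cancel the order for {user_name}?"):
        context.say("Okay, I won't cancel the order.")
        return  # no marker is produced, so downstream transitions won't fire

    USER_DATA[user_name] = "cancelled"
    context.say(f"✅ Order cancelled for {user_name}")
    return {OrderStatus: [OrderStatus(message="order_cancelled")]}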

Chaining Transitions

Let’s add a second transition, which will issue a refund to the user after they cancel their order.

This transition will fire when an OrderStatus marker is created.

class RefundStatus(BaseModel):
    message: str

@bot.transition(
    cpu=1,
    memory=128,
    inputs={OrderStatus: 1},
    outputs=[RefundStatus],
    description="Offers a refund to the user after cancelling their order.",
    expose=False,  # The bot won't take this into account when asking the user for input
)
def offer_refund(context: BotContext, inputs):
    # Process the refund (mock logic)
    refund_message = "Your refund for the cancelled order has been processed. You should see it in your account within 3-5 business days."

    # Send a message in the chat
    context.say(refund_message)

You’ll notice that we’re not returning a marker, because this transition marks the end of our workflow. After this transition runs, there’s no state left to update.
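If you did want the workflow to continue, you would return a RefundStatus marker from offer_refund and add another transition that consumes it. Here is a sketch continuing the same app.py, where the follow-up survey transition is purely illustrative:

class SurveySent(BaseModel):
    message: str


@bot.transition(
    cpu=1,
    memory=128,
    inputs={RefundStatus: 1},
    outputs=[SurveySent],
    description="Asks the user for feedback after their refund is processed.",
)
def send_survey(context: BotContext, inputs):
    context.say("How did we do? Reply with any feedback on the cancellation process.")
    return {SurveySent: [SurveySent(message="survey_sent")]}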

Advanced Usage

Controlling Bot Awareness

The bot’s system prompt automatically includes all of the locations and transitions defined in the network. This means the bot understands its role based on the data you add to your transitions.

However, you might not want the bot to know about certain transitions or locations!

If you want certain things hidden from the bot’s context, you can pass expose=False to locations and transitions.

Think of hidden transitions as ‘backstage actions’: users can still interact with them, but the bot won’t take them into account in its reasoning.

@bot.transition(
    cpu=1,
    memory=128,
    inputs={OrderStatus: 1},
    outputs=[RefundStatus],
    description="Offers a refund to the user after cancelling their order.",
    expose=False,  # This prevents the bot from using this transition in its reasoning
)
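Locations can be hidden the same way. For example, you could add an internal-only location that transitions write to but that the bot never reasons about (InternalAudit is a hypothetical marker):

class InternalAudit(BaseModel):
    note: str


bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[
        BotLocation(marker=UserName),
        BotLocation(marker=OrderStatus),
        # Hidden from the bot's reasoning, but still usable by transitions
        BotLocation(marker=InternalAudit, expose=False),
    ],
    description="A simple bot to cancel orders.",
)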

Using Context Commands

We provide a number of helper commands through the context object (BotContext) passed to each transition.

Context methods can be used for prompting the user for input, making blocking requests to the model, and sending messages to the user.

Available Commands

• context.confirm(): Pause a transition until the user says yes or no.
• context.prompt(): Send a blocking or non-blocking request to the model (e.g., “summarize these reviews”). Pass an optional wait_for_response=False to make it non-blocking.
• context.remember(): Add an arbitrary JSON-serializable object to the conversation memory.
• context.say(): Output text to the user’s chat window.
• context.send_file(): Send a file created during a transition to the user.
• context.get_file(): Retrieve a file from the user during a transition.
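Here is a sketch of a few of these used together inside a transition. The exact signatures and return values are assumptions (e.g. that a blocking prompt() returns the model’s reply and that send_file() accepts a local file path), and the receipt transition itself is hypothetical:

@bot.transition(
    cpu=1,
    memory=128,
    inputs={OrderStatus: 1},
    outputs=[RefundStatus],
    description="Sends the user a receipt for their cancellation.",
)
def send_receipt(context: BotContext, inputs):
    # Assumption: a blocking prompt() returns the model's reply as text
    summary = context.prompt("Summarize this cancellation in one sentence.")

    # Store something in the conversation memory for later turns
    context.remember({"last_cancellation_summary": str(summary)})

    # Write a receipt file and send it to the user
    # Assumption: send_file() accepts a local file path
    with open("receipt.txt", "w") as f:
        f.write(str(summary))
    context.send_file("receipt.txt")

    context.say("I've sent you a receipt for the cancellation.")
    return {RefundStatus: [RefundStatus(message="receipt_sent")]}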

Development Workflow

Testing

We’ll start by running the bot from our shell, as a temporary development server.

$ beam serve app.py:bot

This command will spin up a container in the cloud for your bot’s transitions, and start an interactive chat session in your shell.

=> Building image
=> Using cached image
=> Syncing files
=> Files synced
=> Invocation details
websocat 'wss://979563f1-b569-4f2c-8113-dc6ebca007d1.app.beam.cloud'

=> Session started: 062a4f
=> 💬 Chat with your bot below...

You can interact with your bot by typing into the shell. You’ll see responses from the bot, as well as event logs from each transition that fires.

{
  "type": "agent_message",
  "value": "Hello! How can I assist you today? If you would like to cancel an order, please provide the user's name to get started.",
  "metadata": {
    "request_id": "b2f58de9-7c93-4eda-9a8d-fc42d4e40561",
    "session_id": "062a4f"
  }
}
# hi - please cancel alice's order
#
{
  "type": "agent_message",
  "value": "I've noted the request to cancel Alice's order. If there's anything else you need, just let me know!",
  "metadata": {
    "request_id": "93c0f077-b663-46a0-b110-d63376c8821f",
    "session_id": "062a4f"
  }
}

{
  "type": "transition_fired",
  "value": "cancel_order",
  "metadata": {
    "session_id": "062a4f",
    "task_id": "bf45f661-daaa-4d70-83f1-0479587fafe9",
    "transition_name": "cancel_order"
  }
}

{
  "type": "transition_started",
  "value": "cancel_order",
  "metadata": {
    "session_id": "062a4f",
    "task_id": "bf45f661-daaa-4d70-83f1-0479587fafe9",
    "transition_name": "cancel_order"
  }
}

{
  "type": "agent_message",
  "value": "\u2705 Order cancelled for alice",
  "metadata": {
    "session_id": "062a4f",
    "task_id": "bf45f661-daaa-4d70-83f1-0479587fafe9",
    "transition_name": "cancel_order"
  }
}

{
  "type": "transition_completed",
  "value": "cancel_order",
  "metadata": {
    "session_id": "062a4f",
    "task_id": "bf45f661-daaa-4d70-83f1-0479587fafe9",
    "transition_name": "cancel_order"
  }
}

Deployment

$ beam deploy app.py:bot --name order-bot

You can log in to the Beam Dashboard and use the web UI to chat with your bot, view the network graph, and view the event logs for each task.

Creating Public Chatbots

You can also create shareable pages for your chatbot by adding authorized=False to your bot:

app.py
from beam import Bot, BotLocation

# UserName and OrderStatus are the marker classes defined earlier
bot = Bot(
    model="gpt-4o",
    api_key="YOUR_OPENAI_API_KEY",
    locations=[
        BotLocation(marker=UserName),
        BotLocation(marker=OrderStatus),
    ],
    description="A simple bot to cancel orders.",
    authorized=False,
)

When deployed, you can access a public URL for your bot.