Deploying TensorFlow Models
Saving Models to a Beam Volume
You can mount a Volume in your app and save your TensorFlow models inside it.
Loading Models from a Beam Volume
You can read a model saved in the Volume by passing its path to the hub.load() method.
Deploying TensorFlow Models
You can deploy your code as a REST API, Task Queue, or Scheduled Job using the beam deploy command.
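As a sketch, assuming your app is defined in a file named app.py (the file name is a placeholder; check the CLI help for the exact invocation your SDK version expects):

```shell
# Deploy the app defined in app.py (hypothetical file name)
beam deploy app.py
```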
When you run this command, the Beam Dashboard will open in your browser. From there, you can copy the cURL or Python request used to call the API.
After making a request, you'll see metrics appear in the dashboard.