I made a nice template for running Blender bpy scripts on AWS Lambda using SAM. It’s on Github so go get it: ryanhalliday/bpy-lambda

Here are a few more code examples showing how you can use it.

Logging

Lambda logging to CloudWatch is actually pretty nice. Instead of using print() you should do this:

import logging, os

log = logging.getLogger()
log.setLevel(os.getenv('LOG_LEVEL', 'INFO'))

def lambda_handler(event, context):
    # ...
    log.info("My log message")
    # ...

Local testing with AWS resources

If you want to use remote AWS resources during sam local invoke, you can do something like this:

import os, boto3
from botocore.config import Config

if os.getenv('AWS_SAM_LOCAL', 'false') == 'true':
    s3_config = Config(
        region_name = os.getenv('AWS_REGION', None),
    )
    s3 = boto3.client('s3', config=s3_config,
        # Generally you should not use these and should instead use the Lambda role. They are intended for testing
        aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID', None),
        aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY', None),
    )
else:
    s3 = boto3.client('s3')

def lambda_handler(event, context):
    # ...

Set the environment variables as needed to change how it runs locally. You could even use MinIO instead of S3 for local testing.
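For the MinIO route, boto3 can be pointed at any S3-compatible endpoint. Here is a minimal sketch, assuming a hypothetical MINIO_ENDPOINT environment variable (plus MinIO credentials) that you only set locally:

import os, boto3

# Hypothetical local-only variables: MINIO_ENDPOINT (e.g. http://localhost:9000),
# MINIO_ACCESS_KEY and MINIO_SECRET_KEY. Leave them unset in the deployed Lambda.
if os.getenv('MINIO_ENDPOINT'):
    s3 = boto3.client(
        's3',
        endpoint_url=os.getenv('MINIO_ENDPOINT'),
        aws_access_key_id=os.getenv('MINIO_ACCESS_KEY'),
        aws_secret_access_key=os.getenv('MINIO_SECRET_KEY'),
    )
else:
    s3 = boto3.client('s3')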

Downloading & Uploading resources with S3

The repository includes a small example of how to download files from S3, but here it is slightly modified:

import os, bpy, boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    s3.download_file(os.getenv('S3_BUCKET'), 'WaterBottle.glb', '/tmp/WaterBottle.glb')
    bpy.ops.import_scene.gltf(filepath="/tmp/WaterBottle.glb")
    # ...
    s3.upload_file('/tmp/my_outfile.glb', os.getenv('STL_S3_BUCKET'), 'my_outfile.glb')

Telling another service that the job is done

If you are doing jobs via SQS you may still want to know when they are done. I like using a webhook-like callback for this.

You could also use events in S3, or push another message back into SQS (there is a sketch of that after the webhook example below).

import os, requests

def lambda_handler(event, context):
    # ...
    # callback_token should be something unique to identify this job
    r = requests.put("https://your-app.com/bpy-callback", data={
            'callback_token': callback_token,
            'other_information': "hello world"
        },
        # headers=headers
    )
    r.raise_for_status()
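If you would rather push a completion message back into SQS instead of calling a webhook, a minimal sketch might look like this (the queue URL variable and payload fields are placeholders of my own, not part of the template):

import json, os, boto3

sqs = boto3.client('sqs')

def notify_done(callback_token):
    # Hypothetical results queue, configured via a RESULTS_QUEUE_URL environment variable.
    sqs.send_message(
        QueueUrl=os.getenv('RESULTS_QUEUE_URL'),
        MessageBody=json.dumps({
            'callback_token': callback_token,
            'status': 'done',
        }),
    )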

I like to secure my APIs. Use something like an Authorization header:

headers = { 'Authorization': 'Bearer ' + os.getenv('CALLBACK_KEY') }
# Uncomment headers=headers above

You can store the secrets in AWS Secrets Manager and get them out either via the AWS API in your Lambda, or through something like this for an environment variable:

Parameters:
  SecretName:
    Type: String
    Description: Secret Name
  SecretKey:
    Type: String
    Description: Secret Key

Resources:
  BpyLambdaFunction:
    # ...
    Properties:
      # ...
      Environment:
        Variables:
          KEY: !Sub '{{resolve:secretsmanager:${SecretName}:SecretString:${SecretKey}}}'

Note: environment variables will be visible in the AWS Lambda console, so if that is not a safe place for your secrets, don't do this.
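If that rules out environment variables for you, the other route mentioned above is fetching the secret at runtime via the AWS API. A minimal sketch, assuming a hypothetical SECRET_NAME environment variable that holds only the secret's name (not its value):

import os, json, boto3

secrets = boto3.client('secretsmanager')

def get_callback_key():
    # Fetch and parse the secret at runtime; 'CALLBACK_KEY' is just an example key inside the secret JSON.
    response = secrets.get_secret_value(SecretId=os.getenv('SECRET_NAME'))
    return json.loads(response['SecretString'])['CALLBACK_KEY']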

An API Gateway use case: Getting model metadata

Getting data from assets is, I think, the best use case for API Gateway. Most other workloads should go via SQS or similar, in my opinion, so you aren't left waiting on an API request.

Some useful model metadata might be vertex counts, texture sizes, manifoldness, etc.

import json, bpy

def lambda_handler(event, context):
    
    # Download and import a model like above, or fetch it from URL etc.

    return {
        "statusCode": 200,
        "body": json.dumps(
            {
                "message": bpy.app.version_string,
                "vert_count": len(bpy.context.object.data.vertices)
            }
        ),
    }
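For the other metadata mentioned above, here is a rough sketch of collecting texture sizes and doing a simple manifold check with bmesh (the helper name and return shape are my own):

import bpy, bmesh

def mesh_metadata(obj):
    # Sizes of every image datablock currently loaded (textures pulled in by the import).
    texture_sizes = {img.name: tuple(img.size) for img in bpy.data.images}

    # Simple manifoldness check: the mesh is manifold if every edge is manifold.
    bm = bmesh.new()
    bm.from_mesh(obj.data)
    manifold = all(edge.is_manifold for edge in bm.edges)
    bm.free()

    return {
        'vert_count': len(obj.data.vertices),
        'texture_sizes': texture_sizes,
        'manifold': manifold,
    }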

Uncomment the following sections in template.yaml:

Resources:
  BpyLambdaFunction:
    # ...
    Properties:
      # ...
      Events:
        BpyLambdaApi:
          # More info about API Event Source: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#api
          Type: Api
          Properties:
            Path: /bpy
            Method: get
      # ...

Outputs:
  # ...
  BpyLambdaApi:
    Description: "API Gateway endpoint URL for Prod stage for BpyLambda function"
    Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/bpy/"
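Once deployed, the URL from the Outputs can be called directly. A quick check might look like this (substitute your own endpoint from the stack outputs):

import requests

# Use the BpyLambdaApi value from your stack's outputs.
r = requests.get("https://<api-id>.execute-api.<region>.amazonaws.com/Prod/bpy/")
r.raise_for_status()
print(r.json())  # {"message": "<blender version>", "vert_count": ...}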

Enabling extensions

If you need an extension like the 3D Print Toolbox, you can enable it like this:

import addon_utils

addon_utils.enable("object_print3d_utils")

def lambda_handler(event, context):
    # ...

Dead letter queue alarm

You probably want to know when an SQS job fails after being sent to the Lambda.

There are two options: a CloudWatch alarm that emails you, or another Lambda that sends you an email.

I set up a CloudWatch alarm following this tutorial: https://mnapoli.fr/sqs-dead-letter-queue-alarm

However, this will only send you an email when something enters the DLQ, and it will not trigger again until the DLQ is emptied and then receives another message.

I may switch to a Lambda eventually so I don't miss any errors, but this was very easy to set up for now.

The end

If you use this template you should send me an email - it would be nice to know someone else is using it.

I may improve it more in the future but for now it runs great for my needs.