
I created a zip file containing main.py, with an entry-point handler function, for AWS Lambda using the Python 3.7 runtime.

The zip file was packaged inside an Amazon Linux image on EC2 using Python 3.7.3.

I ran into some errors running on AWS Lambda, so I'd like to find a way to run the function locally.

My main.py is below:

import datetime
import logging
import os
import re
import subprocess

import boto3
import certbot.main
import raven

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def list_files(folder_path):
  onlyfiles = [f for f in os.listdir(folder_path) if os.path.isfile(os.path.join(folder_path, f))]
  logger.info('## files')
  logger.info(onlyfiles)


def read_and_delete_file(path):
  with open(path, 'r') as file:
    contents = file.read()
  os.remove(path)
  return contents


def provision_cert(email, domains):
  certbot.main.main([
    'certonly',                             # Obtain a cert but don't install it
    '-n',                                   # Run in non-interactive mode
    '--agree-tos',                          # Agree to the terms of service,
    '--email', email,                       # Email
    '--dns-route53',                        # Use dns challenge with route53
    '-d', domains,                          # Domains to provision certs for
    # Override directory paths so script doesn't have to be run as root
    '--config-dir', '/tmp/config-dir/',
    '--work-dir', '/tmp/work-dir/',
    '--logs-dir', '/tmp/logs-dir/',
  ])

  first_domain = domains.split(',')[0]
  first_domain_cert_folder = re.sub(r'\*\.', '', first_domain)
  path = '/tmp/config-dir/live/' + first_domain_cert_folder + '/'
  logger.info('## path')
  logger.info(path)

  list_files(path)

  return {
    'certificate': read_and_delete_file(path + 'cert.pem'),
    'private_key': read_and_delete_file(path + 'privkey.pem'),
    'certificate_chain': read_and_delete_file(path + 'fullchain.pem')
  }

def should_provision(domains):
  existing_cert = find_existing_cert(domains)
  if existing_cert:
    now = datetime.datetime.now(datetime.timezone.utc)
    not_after = existing_cert['Certificate']['NotAfter']
    return (not_after - now).days <= 30
  else:
    return True

def find_existing_cert(domains):
  domains = frozenset(domains.split(','))

  client = boto3.client('acm')
  paginator = client.get_paginator('list_certificates')
  iterator = paginator.paginate(PaginationConfig={'MaxItems':1000})

  for page in iterator:
    for cert in page['CertificateSummaryList']:
      cert = client.describe_certificate(CertificateArn=cert['CertificateArn'])
      sans = frozenset(cert['Certificate']['SubjectAlternativeNames'])
      if sans.issubset(domains):
        return cert

  return None

def notify_via_sns(topic_arn, domains, certificate):
  process = subprocess.Popen(['openssl', 'x509', '-noout', '-text'],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, encoding='utf8')
  stdout, stderr = process.communicate(certificate)

  client = boto3.client('sns')
  client.publish(TopicArn=topic_arn,
    Subject='Issued new LetsEncrypt certificate',
    Message='Issued new certificates for domains: ' + domains + '\n\n' + stdout,
  )

def upload_cert_to_acm(cert, domains):
  existing_cert = find_existing_cert(domains)
  certificate_arn = existing_cert['Certificate']['CertificateArn'] if existing_cert else None

  client = boto3.client('acm')
  acm_response = client.import_certificate(
    CertificateArn=certificate_arn,
    Certificate=cert['certificate'],
    PrivateKey=cert['private_key'],
    CertificateChain=cert['certificate_chain']
  )

  return None if certificate_arn else acm_response['CertificateArn']

def handler(event, context):
  try:
    domains = os.environ['LETSENCRYPT_DOMAINS']
    if should_provision(domains):
      cert = provision_cert(os.environ['LETSENCRYPT_EMAIL'], domains)
      upload_cert_to_acm(cert, domains)
      notify_via_sns(os.environ['NOTIFICATION_SNS_ARN'], domains, cert['certificate'])
  except:
    client = raven.Client(os.environ['SENTRY_DSN'], transport=raven.transport.http.HTTPTransport)
    client.captureException()
    raise
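For reference, the renewal check in should_provision boils down to comparing the certificate's NotAfter date against the current time; here is a standalone sketch of that logic (the 30-day window is taken from the code above, and needs_renewal is just an illustrative name):

```python
# Standalone sketch of the renewal-window check from should_provision above:
# renew when the certificate expires within 30 days.
import datetime

def needs_renewal(not_after, now=None):
    now = now or datetime.datetime.now(datetime.timezone.utc)
    return (not_after - now).days <= 30

now = datetime.datetime.now(datetime.timezone.utc)
print(needs_renewal(now + datetime.timedelta(days=10), now))  # True
print(needs_renewal(now + datetime.timedelta(days=90), now))  # False
```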

The zip file is about 20 MB. I found AWS resources on debugging locally, but frankly I am a bit lost as to how to get started.

I am not very familiar with AWS and Lambda in general, though I am fairly comfortable with Python.

I am on macOS Mojave and use Visual Studio Code as my editor. I am okay with creating a virtualenv on my Mac if that helps.

How do I debug my Lambda function locally on my MacBook Pro?

1 Answer


As you've found, you can use AWS SAM (with Docker) to debug locally.

Below is a step-by-step guide on getting started:

Prerequisites

  1. Install Docker: https://docs.docker.com/install/
  2. Install AWS CLI: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html
  3. Install AWS SAM CLI: https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html
  4. Install pipenv: https://pipenv.kennethreitz.org/en/latest/install/

Create example project

This is useful for understanding the project structure and the local debugging workflow; you can then substitute your own code for the example code.

sam init --runtime python3.7
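This generates a project with a template.yaml describing the function. The function resource looks roughly like the sketch below (the handler and code path are the example project's defaults, so check the generated file rather than relying on this):

```yaml
# Rough shape of the function resource sam init generates; verify against
# the template.yaml in your own generated project.
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.7
```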

Add dependencies

pipenv shell
pipenv install package-names

Run and debug locally

pipenv lock -r > requirements.txt
sam build --manifest requirements.txt
sam local invoke HelloWorldFunction --event event.json
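As a lighter-weight alternative while iterating, you can also call the handler directly in Python with a fake event and context. The sketch below stubs a handler so it runs on its own; in practice you'd use "from main import handler" and set the env vars the real handler reads (LETSENCRYPT_DOMAINS and so on — the value below is illustrative):

```python
# Minimal local harness: call the Lambda handler directly with a fake
# event/context. The stub handler stands in for main.handler so this
# snippet is self-contained; swap in "from main import handler" for real use.
import os

def handler(event, context):  # stub standing in for main.handler
    return {'domains': os.environ['LETSENCRYPT_DOMAINS']}

os.environ['LETSENCRYPT_DOMAINS'] = '*.example.com'  # illustrative value
result = handler({'source': 'local-test'}, None)
print(result)  # {'domains': '*.example.com'}
```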

Deploy to AWS Lambda

Make a new bucket, if needed, in which to store the function code:

aws s3 mb s3://bucket-name

Create and run a .sh script:

#!/bin/bash

pipenv lock -r > requirements.txt && sam build --manifest requirements.txt

sam package \
    --output-template-file packaged.yaml \
    --s3-bucket bucket-name

sam deploy \
    --template-file packaged.yaml \
    --stack-name name-of-lambda-stack \
    --capabilities CAPABILITY_IAM \
    --region us-east-1

Replacing:

  • bucket-name with the name of the S3 bucket to store the function code
  • name-of-lambda-stack with the name of the CloudFormation stack to deploy to
  • us-east-1 with another region if desired

Your function is now deployed to AWS Lambda.


3 Comments

  • Thanks for the detailed writeup! I am going to give this a shot tomorrow morning :)
  • Do I need to create a virtual env? I usually do that for my Python projects running on my Docker host.
  • Yep, these instructions use pipenv as the virtual environment (though you could probably use virtualenv).
