I personally would use the SAM CLI. It lets you build, package, and deploy your Lambda locally using a CloudFormation template - IMHO a better way to manage your application stack.
You define all of your resources - API Gateway, resources, routes, methods, integrations, and permissions - in the template, then use sam deploy to create your infrastructure and deploy your application code whenever it changes.
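For local development, the same CLI can build and run the function before anything is deployed. A minimal sketch, assuming the template below and a sample event file at events/get.json (a hypothetical path); sam local needs Docker running:

sam build                                            # build the functions defined in template.yml into .aws-sam/
sam local invoke Lambda --event events/get.json      # invoke the "Lambda" resource once with the sample event
sam local start-api                                  # serve the API locally (defaults to http://127.0.0.1:3000)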
Here is an example SAM CloudFormation template that you place in the root of your repo (this assumes a Lambda function that needs to access a DynamoDB table):
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Your Stack Name
Resources:
  DynamoTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: my-table
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST
  Lambda:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: your-lambda-name
      CodeUri: app/
      Handler: app.handler
      Runtime: nodejs14.x
      Environment:
        Variables:
          DYNAMO_TABLE: !Ref DynamoTable
      Policies:
        - AWSLambdaExecute
        - Version: '2012-10-17'
          Statement:
            - Effect: Allow
              Action:
                - dynamodb:GetItem
                - dynamodb:Query
                - dynamodb:Scan
                - dynamodb:PutItem
                - dynamodb:UpdateItem
              Resource:
                - !Sub 'arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${DynamoTable}'
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /path
            Method: get
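For completeness, here is a minimal sketch of the handler the template points at (CodeUri: app/, Handler: app.handler). The file name app/app.js and the id query string parameter are just assumptions for illustration:

// app/app.js
const AWS = require('aws-sdk'); // bundled with the nodejs14.x runtime

const dynamo = new AWS.DynamoDB.DocumentClient();

// Table name is injected via the DYNAMO_TABLE environment variable set in the template
exports.handler = async (event) => {
  // hypothetical usage: GET /path?id=123
  const id = event.queryStringParameters && event.queryStringParameters.id;

  const result = await dynamo
    .get({ TableName: process.env.DYNAMO_TABLE, Key: { id } })
    .promise();

  return {
    statusCode: result.Item ? 200 : 404,
    body: JSON.stringify(result.Item || { message: 'not found' }),
  };
};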
Now deploying your code (or updating your stack as you add/change resources) is as simple as:
sam deploy --template-file template.yml --stack-name your-stack-name --resolve-s3 --region your-region
The sam deploy command does a lot in one line:
- I include the --template-file parameter, but it is not strictly necessary here because the file is named template.yml, which is one of the filenames the deploy command looks for by default.
- The --resolve-s3 option automatically creates an S3 bucket to upload your Lambda function code to, rather than you having to define and create a bucket outside of the template. The alternative is to specify a bucket with --s3-bucket, but you would have to create that bucket BEFORE attempting to create the stack. This is fine, but keep it in mind when you go to delete your stack: the bucket is not part of the stack, so you need to ensure it is deleted in addition to the stack.
I typically add a Makefile and run my build, test, prune, etc. as part of the build and deploy, e.g.:
publish:
	npm run build && \
	npm test && \
	npm prune --production && \
	sam deploy --stack-name my-stack --resolve-s3 --region us-east-1
Then:
make publish
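If you would rather not repeat the flags in the Makefile at all, sam deploy --guided can write them to a samconfig.toml in the repo root, after which a bare sam deploy picks them up. A rough sketch of what that file contains (values are placeholders):

version = 0.1
[default.deploy.parameters]
stack_name = "my-stack"
region = "us-east-1"
resolve_s3 = true
capabilities = "CAPABILITY_IAM"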
More than you asked for, I realize, but perhaps someone else will find this useful.