
I am trying to read a CSV file (only the column names) from an S3 bucket using a Lambda function. I have created an S3 trigger within Lambda. Here is the sample code:

import json
import boto3
import csv

s3_client = boto3.client('s3')
def lambda_handler(event, context):
    # TODO implement
    bucket = event['Records'][0]['s3']['bucket']['name']
    csv_file = event['Records'][0]['s3']['object']['key']
    response = s3_client.get_object(Bucket=bucket, Key=csv_file)
    lines = response['Body'].read().decode('utf-8').splitlines()
    reader = csv.DictReader(lines)
    results = reader.fieldnames  # only the column names from the header row
    print(results)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }

Whenever I try to upload a new file, I get this error:

[ERROR] ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 10, in lambda_handler
    response = s3_client.get_object(Bucket=bucket, Key=csv_file)
  File "/var/runtime/botocore/client.py", line 386, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/var/runtime/botocore/client.py", line 705, in _make_api_call
    raise error_class(parsed_response, operation_name)

I added a specific role and provided the necessary permissions to my S3 bucket.

Snippet of my S3 bucket policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-bucket/*",
}
}

I have provided the necessary permissions to my S3 bucket, but I am still getting this error:

Response
{
  "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied",
  "errorType": "ClientError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 19, in lambda_handler\n    response = s3_client.get_object(Bucket=bucket, Key=csv_file)\n",
    "  File \"/var/runtime/botocore/client.py\", line 386, in _api_call\n    return self._make_api_call(operation_name, kwargs)\n",
    "  File \"/var/runtime/botocore/client.py\", line 705, in _make_api_call\n    raise error_class(parsed_response, operation_name)\n"
  ]
}

Can anyone tell me why I am getting this error?

  • Your Lambda function's IAM role does not allow s3:GetObject on the resource. Also check the bucket policy, which could be denying the same. Usually folks don't need to use a bucket policy, so the default should probably be fine for you. Commented Jan 18, 2022 at 5:31
  • I have S3 full access. Commented Jan 18, 2022 at 5:32
  • You wrote that the policy is an S3 bucket policy. This is an invalid bucket policy. Please double-check what your real bucket policy is, and also what your Lambda execution role is. Commented Jan 18, 2022 at 9:33
  • Yes, you are right. My S3 bucket policy denies any PutObject request unless the object is encrypted (see the policy sketch just after these comments). How can I resolve this? Commented Jan 18, 2022 at 9:54
  • @Marcin, I've updated my question with more info. Can you help me with this? Commented Jan 20, 2022 at 3:10
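
Based on that comment, the real bucket policy presumably carries an encryption condition rather than a blanket deny. The following is a minimal reconstruction, assuming SSE-S3 (AES256) and a placeholder bucket name, not the asker's actual policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-bucket/*",
            "Condition": {
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption": "AES256"
                }
            }
        }
    ]
}

With a policy like this, uploads succeed only when the request sets the matching encryption header, e.g. s3_client.put_object(..., ServerSideEncryption='AES256'). Note that a deny scoped to s3:PutObject does not by itself explain the GetObject AccessDenied above.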

2 Answers


Assuming that your S3 bucket is the one in charge of invoking the Lambda function, this will require two parties to have permissions:

1. The bucket needs to be allowed to trigger the function (see the sketch after the policy templates below).
2. The Lambda that will pull the CSV files from the bucket needs a policy too.

To achieve the second part, you might want to consider the pre-built policy templates available in SAM; they not only make your policy definition more readable but also limit the actions your Lambda can perform on your buckets. The first sample below shows how to grant S3 CRUD permissions:

S3CsvReactor:
  Type: "AWS::Serverless::Function"
  Properties:
    FunctionName: "csv-process-function"
    CodeUri: csv-processor-function/
    Handler: app.execute
    Timeout: 30 # Seconds
    Runtime: python3.8
    MemorySize: 512
    Policies:
      - S3CrudPolicy:
          BucketName: "s3-containing-your-csv"

The example below showcases a read-only implementation:

S3CsvReactor:
  Type: "AWS::Serverless::Function"
  Properties:
    FunctionName: "csv-process-function"
    CodeUri: csv-processor-function/
    Handler: app.execute
    Timeout: 30 # Seconds
    Runtime: python3.8
    MemorySize: 512
    Policies:
      - S3ReadPolicy:
          BucketName: "s3-containing-your-csv"

The example below showcases a write-only implementation:

S3CsvReactor:
  Type: "AWS::Serverless::Function"
  Properties:
    FunctionName: "csv-process-function"
    CodeUri: csv-processor-function/
    Handler: app.execute
    Timeout: 30 # Seconds
    Runtime: python3.8
    MemorySize: 512
    Policies:
      - S3WritePolicy:
          BucketName: "s3-containing-your-csv"
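
For the first part, the permission that lets the bucket invoke the function, one option is to declare the S3 event in the same SAM template; SAM then creates the lambda:InvokeFunction permission for you. A hedged sketch follows, noting that SAM's S3 event source requires the bucket to be defined in the same template and that the CsvBucket resource name is an assumption:

CsvBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    BucketName: "s3-containing-your-csv"

S3CsvReactor:
  Type: "AWS::Serverless::Function"
  Properties:
    FunctionName: "csv-process-function"
    CodeUri: csv-processor-function/
    Handler: app.execute
    Runtime: python3.8
    Policies:
      - S3ReadPolicy:
          BucketName: "s3-containing-your-csv"
    Events:
      CsvUpload:
        Type: S3
        Properties:
          Bucket: !Ref CsvBucket   # must reference a bucket declared in this same template
          Events: "s3:ObjectCreated:*"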

1 Comment

Unfortunately, I am unable to follow your solution above. I am not sure where and how to create or update the above policy.

Make sure the role can perform the PutObject and GetObject IAM actions on the bucket resource specified in the IAM policy. As a good practice, also wrap your logic in a try/except block to catch your errors; you might be surprised to find that the S3 error being propagated occurs earlier than expected. Furthermore, in the Lambda console, select this function and click on the "Monitor" tab to be redirected to the CloudWatch Logs console, where you can read more details about the error, especially if you are returning exceptions early. Sketches of both points follow.
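
A minimal sketch of an identity policy for the execution role granting those actions; the bucket name here is a placeholder, not the asker's:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::my-bucket/*"
        }
    ]
}

And a hedged sketch of the try/except pattern applied to the handler from the question:

import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    try:
        response = s3_client.get_object(Bucket=bucket, Key=key)
    except ClientError as err:
        # Log the failing object so CloudWatch Logs shows exactly which call broke
        print(f"get_object failed for s3://{bucket}/{key}: {err}")
        raise
    # First line of the CSV is the header row
    print(response['Body'].read().decode('utf-8').splitlines()[0])
    return {'statusCode': 200}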
