
How to receive a file with an Aws Lambda (python)

I'm trying to figure out how to receive a file sent by a browser through an API call in Python.

The web client is allowed to send any type of file (say .txt, .docx, .xlsx, ...). I don't know whether I should handle the payload as binary or not.

The idea is to then save the file to S3. I know it's possible to use JS libraries like AWS Amplify to generate a temporary URL, but I'm not too interested in that solution.

Any help appreciated; I've searched extensively for a Python solution but can't find anything that actually works!

My API is private and I'm deploying with Serverless.

files_post:
  handler: post/post.post
  events:
    - http:
        path: files
        method: post
        cors: true
        authorizer: 
          name: authorizer
          arn: ${cf:lCognito.CognitoUserPoolMyUserPool}

EDIT

I have a half solution that works for text files but not for PDF, XLSX, or images; if someone has an idea I'd be super happy.

from cgi import parse_header, parse_multipart
from io import BytesIO

import boto3


def post(event, context):
    print(event['queryStringParameters']['filename'])

    # Extract the multipart boundary from the Content-Type header;
    # parse_multipart expects it as bytes
    c_type, c_data = parse_header(event['headers']['content-type'])
    c_data['boundary'] = c_data['boundary'].encode('utf-8')

    body_file = BytesIO(event['body'].encode('utf-8'))
    form_data = parse_multipart(body_file, c_data)

    s3 = boto3.resource('s3')
    obj = s3.Object('storage', event['queryStringParameters']['filename'])
    obj.put(Body=form_data['upload'][0])
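One likely reason text files work but PDF/XLSX/images don't: API Gateway delivers binary payloads base64-encoded (and only when binary media types are enabled on the API), setting `isBase64Encoded` on the event. A minimal sketch of recovering the raw bytes, assuming that flag is present in your event:

```python
import base64


def get_raw_body(event):
    """Return the request body as raw bytes.

    If API Gateway flagged the payload as base64-encoded (binary media
    types must be enabled on the API for this to happen), decode it;
    otherwise the body is plain text and is just encoded to bytes.
    """
    body = event.get('body') or ''
    if event.get('isBase64Encoded'):
        return base64.b64decode(body)
    return body.encode('utf-8')
```

You would then feed these bytes to `BytesIO` instead of `event['body'].encode("utf-8")`, so binary uploads aren't mangled by a round-trip through UTF-8.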
Asked by Tibo on Mar 01 '26.
1 Answer

There was a very similar question asked here (and answered in the comments).

The short answer is that, to avoid CORS and to prevent file corruption, you need to:

  1. use the AWS SDK to create an empty object in S3
  2. return a pre-signed URL of that object to the front-end
  3. PUT the file to that pre-signed URL

AWS actually maintains a very good example of implementing this scenario in Go.

In Python, a simple implementation would be something like:

import boto3

s3 = boto3.client('s3')

# Create an empty object in S3
s3.put_object(Bucket='foobucket', Key='somekey', Body='')

# Generate a pre-signed URL the client can PUT the file to
url = s3.generate_presigned_url(
    ClientMethod='put_object',
    Params={
        'Bucket': 'foobucket',
        'Key': 'somekey'
    }
)

You'll likely need io.BytesIO to buffer your file's bytes in Python, and be prepared to write custom logic with format-specific libraries to stream the upload into the buffer properly.

Once your file is streamed to the buffer, just use the Client.put_object method:

s3.put_object(Bucket=bucket_name, Key=object_key, Body=buffer, ContentType='text/csv')

Remember to change your ContentType to suit your needs (this example uses CSV).
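For completeness, step 3 (the client-side PUT) can also be sketched in Python. `urllib.request` here stands in for whatever HTTP client your front-end actually uses, and the URL is assumed to be the pre-signed one returned by the Lambda:

```python
import urllib.request


def build_presigned_put(url, data, content_type='application/octet-stream'):
    """Build a PUT request that uploads raw bytes to an S3 pre-signed URL.

    Send it with urllib.request.urlopen(build_presigned_put(...)).
    The Content-Type should match what you expect to serve back later.
    """
    return urllib.request.Request(
        url,
        data=data,
        method='PUT',
        headers={'Content-Type': content_type},
    )
```

Because the browser (or any client) talks to the pre-signed URL directly, the file bytes never pass through API Gateway, which sidesteps both the CORS and the binary-corruption issues.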

Answered by ford-at-aws on Mar 04 '26.