DEV Community

Discussion on: Zip files on S3 with AWS Lambda and Node

bonwon

That case is detailed at medium.com/@johnpaulhayes/how-extr....

Paweł Kowalski • Edited

> Read the zip file from S3 using the Boto3 S3 resource Object into a BytesIO buffer object

Yeah, buffer.

That's what I've got. I wanted streams so I could support big files, not just files that fit into memory.

This guy calls 500 MB huge because that's the max temp storage on Lambda. (Which would be fine, except that saving extracted files to /tmp just to upload them to S3 is wasteful, and nobody should do that anyway.) For me that's not huge at all; I was aiming at a couple of GBs for good measure.

Also, when he writes "This method does not use up disk space and therefore is not limited by size", he is wrong. There is still a limit, and it would be far less than 3 GB (the maximum memory on Lambda), depending on the file types (binary/text) and how many there are.