The way this is usually worked around is to use a storage service like Amazon S3. From the Lambda you can fetch the image or video and process it from within the function without having to pass the data itself as a payload.
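A minimal sketch of that pattern, assuming an S3-triggered function (the helper name and the commented SDK call are illustrative, not from this thread):

```javascript
// Minimal sketch of the reference-passing pattern: the Lambda is
// invoked with a small S3 event (a JSON reference), then fetches the
// actual bytes itself, so the image never travels in the payload.
// The record shape follows the S3 event notification format.
function s3Location(event) {
  const record = event.Records[0];
  return {
    bucket: record.s3.bucket.name,
    // Object keys arrive URL-encoded, with spaces as '+'
    key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' ')),
  };
}

// Inside the handler you'd then fetch the object, e.g. with the AWS SDK:
//   const { bucket, key } = s3Location(event);
//   const { Body } = await s3.getObject({ Bucket: bucket, Key: key }).promise();
```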
I see. I'm not sure that works, though. The AWS Serverless Image Handler template doesn't even support that. I know because I tried. I tried so hard lol. Even pulling from an S3 bucket, it still hits that limit.
Hmm, the event object with a reference to the S3 image shouldn't be that large, so I'm not sure why that wouldn't work; I'd have to see more details. For reference, though, this is essentially the way I usually handle it (the general idea): github.com/dabit3/s3-triggers/blob...
You also could have run into the memory limit; if so, you can increase it.
Thanks for the info! I tried to dig into my error log to see what the specific error was, but I deleted the CloudFormation stack, so those are no longer available. I'm pretty sure it was always the "exceeded payload limit" error, and all I was doing was fetching an S3 object from a bucket and resizing it. I'm pretty sure it wasn't a memory issue because the instances never exceeded the memory limit, but I dunno.
I ended up just using a service because I couldn't get it to work, but maybe I'll keep poking around.
Thanks for sharing your method! It's very helpful.
Do you know if your method works for images over 6 MB?
What payload were you using to invoke your Lambda? The error message indicates that the JSON is over Lambda's 256 KB limit. If the Lambda is invoked by an S3 event, you can then fetch the image that's in S3 using the payload described here:
docs.aws.amazon.com/AmazonS3/lates...
Here's the doc on creating an S3 event:
docs.aws.amazon.com/AmazonS3/lates...
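For context, here's roughly what an S3 event payload looks like. The field names follow the S3 event notification format, but the bucket/key values below are invented: the point is that even for a large object, the event only carries a reference, so the JSON itself stays far below the 256 KB limit.

```javascript
// A trimmed S3 event notification of the kind that invokes the Lambda.
// Field names follow the S3 event format; the values here are made up.
const sampleEvent = {
  Records: [
    {
      eventSource: 'aws:s3',
      eventName: 'ObjectCreated:Put',
      s3: {
        bucket: { name: 'my-upload-bucket' },
        object: { key: 'uploads/photo.jpg', size: 8 * 1024 * 1024 }, // an 8 MB file
      },
    },
  ],
};

// The payload only references the 8 MB object; the serialized JSON is
// a few hundred bytes, nowhere near the 256 KB invocation limit.
const payloadBytes = Buffer.byteLength(JSON.stringify(sampleEvent));
```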
Cool. Thank you! I'm not entirely sure, but whenever I tried to resize an image that was more than 6 MB, it failed. It was pulling straight from an S3 bucket.
If you can post the code in a gist somewhere, I'll help you out.
Will do. Much appreciated
What about using streams?
You wouldn't have to download the image in its entirety in order to process it.
Wouldn't it solve your problem?
That sounds like it may work. I'll test it out and let you know! Thanks!
I've been unzipping 3 GB+ archives from S3 using streams, so this might help. P.S. I also compress images, tested with 10 MB+, so that's probably not an AWS issue.
That's good to know. Thank you! Do you have the code available for that?
Not yet. I need to clean it up and then I'll be sharing it on my GitHub; just need to find a moment to do that ;)
Cool. Thanks!