Cover image by @designecologist
In this post, I'll show you some of the most powerful and useful ways to use serverless functions.
An issue I ran into with Lambda Image processing is the 6MB payload limit. Did you get around that somehow?
The way this is usually worked around is to use a storage service like Amazon S3. From the Lambda you can fetch the image or video and process it from within the function without having to pass the data itself as a payload.
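As a sketch of that pattern (bucket and key names here are hypothetical): the function is invoked with a small S3 event record rather than the image bytes, extracts the object reference, and only then fetches the object from inside the handler.

```javascript
// Sketch: the Lambda receives a small S3 event record, not the image
// itself. Extracting the object reference is plain JS, shown here; the
// actual S3 fetch is indicated in the comment below.

function parseS3Record(event) {
  const record = event.Records[0];
  return {
    bucket: record.s3.bucket.name,
    // S3 delivers keys URL-encoded, with spaces as '+'
    key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' ')),
  };
}

// In the handler you would then do something like:
//   const { Body } = await s3.getObject({ Bucket: bucket, Key: key }).promise();
// so the image bytes never travel through the invocation payload.
```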
I see. Not sure that works though. The AWS Serverless Image Handler template doesn't even support that. I know because I tried. I tried so hard lol. Even pulling from an S3 bucket, it still hits that limit.
Hmm, so the event object with a reference to the S3 image shouldn't be that large, not sure why that wouldn't work, I would have to see more details I think. For reference though, this is essentially the way I usually handle it (the general idea): github.com/dabit3/s3-triggers/blob...
You could also have run into the memory limit; if so, you can increase it.
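For reference, if you deploy with the Serverless Framework, memory is a one-line setting (the function name here is hypothetical):

```yaml
# serverless.yml (fragment) — memorySize is in MB
functions:
  resizeImage:
    handler: handler.resize
    memorySize: 1024  # up from the 128 MB default
    timeout: 30       # seconds; also worth raising for large images
```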
Thanks for the info! I tried to dig into my error log to see what the specific error was, but I deleted the CloudFormation stack, so those are no longer available. I'm pretty sure it was always the "exceeded payload limit" error, and all I was doing was fetching an S3 object from a bucket and resizing it. I'm pretty sure it wasn't a memory issue because the instances never exceeded the memory limit, but I dunno.
I ended up just using a service because I couldn't get it to work, but maybe I'll keep poking around.
Thanks for sharing your method! It's very helpful.
Do you know if your method works for images over 6 MB?
What payload were you using to invoke your Lambda? The error message is indicative that the JSON is over the 256 KB limit on Lambda. If the Lambda is invoked by an event in S3, you can then fetch the image that is in S3 using the payload described here:
docs.aws.amazon.com/AmazonS3/lates...
Here's the doc on creating an S3 event:
docs.aws.amazon.com/AmazonS3/lates...
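For context, the S3 event that invokes the Lambda is tiny — it carries a reference to the object, not the object itself. Trimmed to the usually relevant fields, a record looks roughly like:

```json
{
  "Records": [
    {
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "my-upload-bucket" },
        "object": { "key": "photos/cat.jpg", "size": 8388608 }
      }
    }
  ]
}
```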
Cool. Thank you! I'm not entirely sure, but whenever I tried to resize an image that was more than 6 MB, it failed. It was pulling straight from an S3 bucket.
If you can post the code in a gist somewhere, I'll help you out.
Will do. Much appreciated
What about using streams?
You wouldn't have to download the image in its entirety in order to process it.
Wouldn't it solve your problem?
That sounds like it may work. I'll test it out and let you know! Thanks!
I've been unzipping 3 GB+ archives from S3 using streams, so this might help. P.S. I also compress images, tested with 10 MB+, so that's probably not an AWS issue.
That's good to know. Thank you! Do you have the code available for that?
Not yet. I need to clean it up and then I'll be sharing it on my GitHub — just need to find a moment to do that ;)
Cool. Thanks!
Serverless is awesome, and I was fascinated with it... before I started to use it. With the limits on functions, you can't use it properly with 3rd-party APIs. Cold starts, non-trivial deployment, hard debugging — and if you do something wrong, you risk spending your whole budget (if you forgot to set limits in the settings). I feel that serverless is good for top-notch, experienced developers and teams and enterprise-level companies, not for regular artisans. IMHO, for me it's much faster, easier, and cheaper to create a new VPS cluster with Node.js than to use serverless functions.
Not really — those were my concerns initially too, but you can get around most of those issues by using the Serverless Framework and its plugins (e.g. the serverless-offline plugin). And cold starts are easily rectified by setting up a CloudWatch rule to automatically invoke your Lambda every 15 minutes.
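For reference, that warm-up schedule is only a few lines with the Serverless Framework (function and handler names here are hypothetical):

```yaml
# serverless.yml (fragment) — a CloudWatch scheduled event keeps the
# function warm by invoking it on a timer
functions:
  api:
    handler: handler.main
    events:
      - schedule: rate(15 minutes)
```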
Cold starts can also be eliminated by the newer provisioned concurrency feature. But tbh, I prefer to just optimize the function code to boot fast.
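One easy boot-time optimization: do expensive initialization at module scope rather than inside the handler, so it runs once per container rather than once per invocation. A runnable sketch — the fake `connect()` here is a stand-in for creating an SDK client or DB connection:

```javascript
// Module scope: runs once per container (at cold start), then is reused
// by every invocation that lands on that warm container.
let initCount = 0;
function connect() {
  initCount += 1;               // stand-in for an expensive client setup
  return { query: (q) => `result for ${q}` };
}
const client = connect();       // initialized outside the handler

// Handler: runs on every invocation but reuses the warm client.
async function handler(event) {
  return client.query(event.q);
}
```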
Lambda is also great for web scraping - in combination with CloudWatch Events even more 👍
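A scheduled scraper splits into two halves: a CloudWatch Events rule that triggers the function on a timer, and the scraping logic itself. Here's a runnable sketch of the extraction half, parsing an inline HTML sample instead of a live fetch (in the Lambda you'd fetch the page first, e.g. with `https.get`; a real scraper would likely use a parser like cheerio rather than a regex):

```javascript
// Inline sample standing in for a fetched page, so the sketch runs
// without network access.
const html = `
  <ul>
    <li class="headline">Serverless hits v2</li>
    <li class="headline">Lambda adds new runtime</li>
  </ul>`;

// Extraction half of the scraper: pull out each headline's text.
function extractHeadlines(page) {
  const matches = page.match(/<li class="headline">(.*?)<\/li>/g) || [];
  return matches.map((m) => m.replace(/<\/?li[^>]*>/g, '').trim());
}
```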
Or running tests. Someone wrote about how they made their tests concurrent, and instead of some ridiculous time the suite now runs within 20 seconds on 1000 Lambdas ;)
Great article Nader
Thanks!!
You should mention that events are not available if you use Aurora Serverless, which is very annoying because if you need them, your backend becomes more complex and harder to program.