
Discussion on: Should Your Next Backend Be Serverless?: A Perspective From Someone in the Trenches

xnoɹǝʃ uɐıɹq

The solution to cold starts is to author small, single-responsibility cloud functions. This is the best-practice advice from AWS for Lambda. Cold start time is directly correlated with function payload size, so to avoid it, write smaller functions. We've found anything under 5 MB loads in under a second, usually around 150ms cold. (Aside: pinging/Lambda warmers DO NOT fix cold starts. They hide them. If you get 2 concurrent requests, you will still cold start 1 of them; pinging only keeps 1 Lambda warm.)
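
For illustration, a minimal sketch of what such a small, single-responsibility function might look like in Node/TypeScript. The API Gateway event shape, the DynamoDB client, and the "orders" table are assumptions for the example, not something from the comment:

```ts
// Minimal single-responsibility handler: one route, one dependency.
// Importing only the DynamoDB client (AWS SDK v3) keeps the bundle small,
// which is what keeps the cold start short.
import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";

// Created outside the handler so warm invocations reuse the connection.
const db = new DynamoDBClient({});

export const handler = async (event: { pathParameters?: { id?: string } }) => {
  const id = event.pathParameters?.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ error: "missing id" }) };
  }

  const result = await db.send(
    new GetItemCommand({ TableName: "orders", Key: { id: { S: id } } })
  );

  return result.Item
    ? { statusCode: 200, body: JSON.stringify(result.Item) }
    : { statusCode: 404, body: JSON.stringify({ error: "not found" }) };
};
```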

Riaz Virani

Thanks for the feedback. Is this specific to Lambda, or have you found it to be the case across all the major function-as-a-service tools? On Vercel I found cold starts were always multiple seconds, even with single-line functions. Also, how difficult is it to stay under the 5 MB size given that a single npm package can be bigger than that (thinking of some of the Node database clients)?
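
(One common way to stay under that limit even when an npm dependency's full install is large is to bundle and tree-shake each handler so only the code it actually imports ships. A rough sketch, assuming esbuild and a hypothetical src/get-order.ts entry point:)

```ts
// build.ts -- bundle one handler per output file so each Lambda ships only
// the code it actually imports (file names are hypothetical).
import { build } from "esbuild";

build({
  entryPoints: ["src/get-order.ts"], // one handler per bundle
  bundle: true,                      // inline only what is imported
  platform: "node",
  target: "node18",
  minify: true,
  outfile: "dist/get-order.js",
}).catch(() => process.exit(1));
```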