I did a lot of digging, and one solution I found was raising the "number of open files" limit on Linux using the ulimit command. By default the soft limit is typically 1024 file descriptors per process, but it can be raised with ulimit.
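For reference, here's a minimal sketch of checking and raising that limit in a shell session (the specific value 65536 is just an example; raising the limit above the hard limit requires root or a persistent change, e.g. in /etc/security/limits.conf on many distros):

```shell
# Show the current soft limit on open file descriptors for this shell
ulimit -n

# Raise the soft limit for this shell session and its child processes
# (fails without privileges if 65536 exceeds the hard limit, `ulimit -Hn`)
ulimit -n 65536
```

Note this only affects the current shell and its children; services started by systemd would instead need something like `LimitNOFILE=` in their unit file.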
I'm not actually seeing this error in real life; these APIs are for small private businesses and their employees, so about 50-100 connections per service, with about 200-500 req/sec. I know premature optimization is the root of all evil, but I wanted to make it scalable and future-proof :]