NOTE: This post was originally published at dev.to
Intro
Recently, there was a hackathon at my workplace, Wonderkind, and together with one of my colleagues I built an intelligent doorbell using AWS Serverless services and a Raspberry Pi.
Whenever someone presses the button of the 'doorbell', it captures an image and checks it against an Amazon Rekognition face collection to see whether the faces in the image have already been indexed.
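For context, that face lookup boils down to a single Rekognition call. Here is a minimal boto3 sketch of the idea; the collection ID, image path, and match threshold are hypothetical placeholders, not the exact values from our hackathon project.

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical collection name and captured image path
COLLECTION_ID = "doorbell-visitors"

with open("capture.jpg", "rb") as image_file:
    image_bytes = image_file.read()

# Search the face collection for faces similar to the one in the captured image
response = rekognition.search_faces_by_image(
    CollectionId=COLLECTION_ID,
    Image={"Bytes": image_bytes},
    FaceMatchThreshold=90,  # only return reasonably confident matches
    MaxFaces=1,
)

if response["FaceMatches"]:
    match = response["FaceMatches"][0]
    # ExternalImageId is only present if it was supplied when the face was indexed
    print("Known visitor:", match["Face"].get("ExternalImageId", match["Face"]["FaceId"]))
else:
    print("Unknown visitor")
```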
Recently, one of my friends/clients came to me with the scenario below.
Scenario: There is a Lambda function which calls an external API and fetches some data. This external API only accepts incoming requests from pre-configured whitelisted IPs. As of today, AWS doesn't support Elastic IPs for Lambda functions directly. However, there is a simple workaround.
Solution: The simple solution is to create the Lambda function within a VPC and route its outbound traffic to the third-party API through a NAT gateway, whose Elastic IP gives you a static address that can be whitelisted.
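To illustrate, attaching an existing function to a VPC is a single API call. Below is a minimal boto3 sketch; the function name, subnet IDs, and security group ID are hypothetical placeholders, and the subnets are assumed to be private subnets whose route tables send outbound traffic to the NAT gateway.

```python
import boto3

lambda_client = boto3.client("lambda")

# Hypothetical identifiers: private subnets whose route tables point
# 0.0.0.0/0 at the NAT gateway, plus a security group that allows
# outbound HTTPS to the external API.
lambda_client.update_function_configuration(
    FunctionName="fetch-external-data",
    VpcConfig={
        "SubnetIds": ["subnet-0aaa1111bbb22222c", "subnet-0ddd3333eee44444f"],
        "SecurityGroupIds": ["sg-0123456789abcdef0"],
    },
)
```

From then on, every outbound request from the function leaves the VPC through the NAT gateway, so the external API only ever sees the NAT gateway's Elastic IP.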
Lambda is highly scalable by nature. However, there are some limitations you need to consider when a lot of Lambda functions run simultaneously.
Please note: this is not applicable to every scenario, but it matters for a system with high throughput.
Account Level Concurrent Execution Limit
As of now, Lambda has a soft limit of 1,000 concurrent executions per region. This means that, at any given moment, the sum of the concurrent executions of all of your Lambda functions in a single region must be less than 1,000.
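You can inspect this limit for your own account programmatically. Below is a small boto3 sketch, assuming default credentials for the region in question; the function name used when reserving concurrency is a hypothetical placeholder.

```python
import boto3

lambda_client = boto3.client("lambda")

# Read the account-level concurrency limits for the current region
settings = lambda_client.get_account_settings()
limits = settings["AccountLimit"]
print("Total concurrent executions:", limits["ConcurrentExecutions"])
print("Unreserved concurrent executions:", limits["UnreservedConcurrentExecutions"])

# Optionally, carve out a guaranteed slice of that pool for one critical
# function so it can neither starve nor be starved by the other functions.
lambda_client.put_function_concurrency(
    FunctionName="fetch-external-data",  # hypothetical function name
    ReservedConcurrentExecutions=100,
)
```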