So, out of curiosity, and because I hit a wall on a personal project, I started taking apart the AWS Lambda runtime.

Now, Lambda is presented as "serverless", which is a lie. What it really provides, under the hood, is a fast, isolated execution environment for your code. When you upload a Lambda function, AWS packages it and deploys it onto Firecracker, their microVM platform. In effect, AWS dumps your code into a container-like VM and executes it.

Anyway, to test a Lambda function locally, AWS provides the Lambda Runtime Interface Emulator (RIE). They also provide pre-built container images for various runtimes, and then fail to document what those images actually do. Which means local testing amounts to building the container, copying your code in, and running it.

So, what happens if the RIE isn't present?

Well, at least in Go, the function bails out due to missing environment variables. Fine: add those, test again, hit it with the usual curl POST and a JSON payload. But because the custom container doesn't set itself up correctly, there's likely some other behavior missing too.

Unlike most other runtimes, if you invoke the RIE yourself on the Go Lambda image, the application fails because the port is already occupied, and requests to the handler fail. Which means that, inside the image, AWS is doing something they don't document.

Which means the provided images already invoke the RIE, or some other wrapper, and that wrapper sets the variables the Lambda execution depends on.

So, I wrote a simple Go app that dumps the environment variables as an event payload, spun up the container, and pulled stuff out.

With those variables in hand, the Go binary should be able to run completely natively, provided the environment is set up correctly.

Anyway, as I continue down this path, the takeaway so far is: if I'm going to execute Lambdas, the correct response is to optimize for execution speed and concurrency while reducing the memory footprint as much as possible.

Except, because of how the local Lambda runtime works, the default memory available to the container is going to far exceed what the running Lambda actually gets in AWS.

So, if you're going to properly emulate the running Lambda, the container's AWS_LAMBDA_FUNCTION_MEMORY_SIZE environment variable has to be set to the lowest amount of RAM the function can get by on, and the function needs to be as efficient as possible about memory usage and run as concurrently as possible.

TL;DR: write efficient code, use stuff that doesn't need a bunch of RAM.

#AWS #lambdafunctions #TheHorrorsPersist