The AWS Lambda Setting I Wish I’d Changed Earlier

I've been using AWS Lambda for quite a few years, and like many people, I left the architecture on its default setting, i.e. x86 (without thinking twice). It worked really well and it scaled as needed, but the truth unfolded when the invoices showed up. End of story.

On closer examination, I paid more attention to execution time & cost, and decided to experiment with Arm64, i.e. AWS Graviton. What followed was one of the rare cloud optimisations that actually delivered. 🎇 A VISIBLE WIN 🎇

Now, the main question:

What led me to try Arm64?

The answer's as simple as it gets: a few Lambdas handling API requests & background jobs were being called millions of times per month, and frankly, each invocation looked quite cheap individually. But bunched together, they burned a steamy hole in my pocket every month.

Now, being a lazy_developer who uses snake_case while programming, I said NO to refactoring the logic or the code. I just wanted a clean performance win. And that's when... Arm64 looked like a low-risk bet to me.

Time taken to make the CHANGE!!!

Now, when I say I moved from x86 to Arm64, the first thing to pop into your head would be, "Nah! It's gonna take forever to change." Right? WRONG. I changed the Lambda architecture from x86 to Arm64, redeployed and tested it within a matter of hours (and that's only because I chose NOT to read the documentation first 🥲. It could have been quicker). That was it. No code was changed, no dependencies were broken, and no surprises. This alone was... unexpected.
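
For reference, the switch itself is just the architecture radio button in the console, or the architectures setting in whatever IaC tool you use. If you'd rather script it, here's a minimal boto3 sketch, assuming a zip-packaged function; the function name and zip path are hypothetical placeholders, and the architecture flag rides along with the code upload:

```python
import boto3

# Hypothetical names; point these at your own function and deployment package.
FUNCTION_NAME = "my-api-handler"
ZIP_PATH = "function.zip"

lambda_client = boto3.client("lambda")

# The instruction set is tied to the deployed package, so we re-upload the
# existing zip while switching the function over to arm64.
with open(ZIP_PATH, "rb") as f:
    response = lambda_client.update_function_code(
        FunctionName=FUNCTION_NAME,
        ZipFile=f.read(),
        Architectures=["arm64"],
    )

print(response["Architectures"])  # should print ['arm64']
```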

What were my ACTUAL observations?

The moment I started moving even a few Lambda functions to Arm64, the execution time dropped noticeably, especially for CPU-bound logic. And guess what... cold starts felt snappier on lightweight Node.js & Python functions. In the end, the monthly Lambda cost dipped without my touching memory allocation, which felt like a cherry on top.

So, nothing dramatic on any single invocation, but at scale, the difference was truly impossible to ignore.

Reality Check: The Cost

Lambda billing isn't rocket science. It's simple maths:

Execution time * Memory * Invocations

If your function runs just 20-30 milliseconds faster and it's executed millions of times, that tiny change in execution speed turns into a new Apple Watch (yes, no kidneys needed 😜). This was the moment Arm64 stopped feeling like an optimisation experiment and started feeling like a worthy replacement for x86 as the default choice.
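
To make that concrete, here's a back-of-the-envelope calculation. The workload numbers are made up for illustration, and the per-GB-second rates are roughly the published us-east-1 on-demand prices (arm64 duration is billed about 20% lower than x86), so check the pricing page for your region before trusting any of this:

```python
# Illustrative workload: 10M invocations/month, 512 MB, 120 ms on x86 vs 90 ms on arm64.
invocations = 10_000_000
memory_gb = 0.5

x86_seconds, arm_seconds = 0.120, 0.090
x86_price, arm_price = 0.0000166667, 0.0000133334  # USD per GB-second (assumed us-east-1 rates)

x86_cost = invocations * x86_seconds * memory_gb * x86_price
arm_cost = invocations * arm_seconds * memory_gb * arm_price

print(f"x86:    ${x86_cost:,.2f}/month")         # ~ $10.00
print(f"arm64:  ${arm_cost:,.2f}/month")         # ~ $6.00
print(f"saving: {1 - arm_cost / x86_cost:.0%}")  # ~ 40%
```

Per invocation that's a rounding error; across a fleet of busy functions, it's real money.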

Why did Arm64 work best for ME?

In my small but meaningful experience, Arm64 shone brighter than the star Sirius when I used it for API Gateway-backed Lambdas, event-driven pipelines, background jobs, schedulers, data processing & automation, or even lightweight ML inference. In short, I opted for Arm64 wherever I was running modern serverless workloads on up-to-date runtimes.

Where would I never use Arm64?

It's not like I'm completely against x86 now. I still use it in my projects for some scenarios. For example:

  • I kept x86 for older functions with legacy native binaries
  • Or for rare dependencies that weren't Arm-ready yet (see the quick check after this list)
  • Or for code that I didn't want to touch while being that close to a deadline (Programmers, IYKYK). For those who don't: we don't want the project to go boom 💥
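
About that Arm-readiness check: the simplest thing I can suggest is a throwaway test handler that logs the machine architecture, so you can see what a function (and its native dependencies) is actually running on. A minimal sketch:

```python
import platform

def lambda_handler(event, context):
    # Reports 'aarch64' on an arm64 (Graviton) Lambda and 'x86_64' on the default architecture.
    arch = platform.machine()
    print(f"Running on: {arch}")
    return {"architecture": arch}
```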

What changes in the future?

In the future, if I am starting a new serverless project (or today, who knows):

  • I will definitely go for Arm64 by default
  • I'm only gonna move back to x86 if something explicitly breaks (or my client doesn't love their 💵)
  • I will benchmark early instead of assuming (a lazy way to do this is sketched right below)
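
For the benchmarking bullet, my lazy version is to compare the average Duration metric in CloudWatch before and after the switch. A minimal boto3 sketch, with a placeholder function name and an assumed one-week window on each side of the change:

```python
from datetime import datetime, timedelta, timezone

import boto3

FUNCTION_NAME = "my-api-handler"  # placeholder

cloudwatch = boto3.client("cloudwatch")

def avg_duration_ms(start, end):
    """Average Lambda duration (ms) over a time window, from CloudWatch metrics."""
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/Lambda",
        MetricName="Duration",
        Dimensions=[{"Name": "FunctionName", "Value": FUNCTION_NAME}],
        StartTime=start,
        EndTime=end,
        Period=86400,          # one datapoint per day
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else None

now = datetime.now(timezone.utc)
# e.g. the week before the switch (x86) vs. the week after (arm64); adjust to your dates
print("before:", avg_duration_ms(now - timedelta(days=14), now - timedelta(days=7)))
print("after: ", avg_duration_ms(now - timedelta(days=7), now))
```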

Conclusion

Arm64 on AWS Lambda isn't some future feature; it's a teeny tiny, quiet upgrade that's already paying off in real-life systems. It's not theoretical. IT'S PRACTICAL. So, if you care about performance, cost & efficiency and you're running modern, heavy serverless workloads, listen to me (or read this carefully): TRY IT ONCE. 9/10 chances? Like me, you won't turn back 😉.

Once again, this was my personal experience.

Happy New Year!

HappyCoding
