Highlights from AWS re:Invent 2024

Author Andrew Krug

Published: December 9, 2024

Whether or not you made the journey to this year’s AWS re:Invent, plenty of great announcements always get lost amid an action-packed week of keynotes, breakouts, expo hall demos, and networking sessions. No need to worry: we’re always happy to be a big part of the re:Invent experience and to share our observations with you.

Datadog at re:Invent 2024

You can also join us on December 17, 2024, for a re:Invent re:Cap livestream by registering here.

This year, re:Invent 2024 leaned heavily into the following key areas:

  • Generative AI
  • Security and identity
  • Developer tools
  • Computing and operations
  • Cloud costs

In addition to announcements in these areas, there is always a torrent of “pre:Invent” announcements that quietly roll out in the weeks leading up to the event, without the pomp and circumstance of a keynote. Some of this year’s most impactful were:

  • Resource control policies (RCPs): For the first time since permissions boundaries, AWS has changed the evaluation model for access decisions. The resource policy system has historically been challenging for engineers and often resulted in unintended access. RCPs are a long-awaited feature that allows teams to describe the maximum potential access for resources at the account level (see the sketch after this list). Read more here.
  • AssumeRoot: Earlier this year, one of Datadog’s own researchers noticed new AssumeRoot APIs popping up in the AWS console. We had hoped this would be a control for securing and ultimately eliminating root account access. (See Staff Security Researcher Nick Frichette’s talk foreshadowing the release of AssumeRoot at fwd:cloudsec EU earlier in 2024.) We’re glad to see that our dreams came true this re:Invent season with the rollout of AssumeRoot, the beginning of the end of dangerous root account credentials. Read more here.
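
On the RCP front, here is a minimal sketch of what a policy might look like in practice, written in Python with boto3 and following the deny-only RCP model to restrict S3 access to identities inside your own organization. The organization ID, policy name, and target ID are placeholders, and this assumes RCPs are already enabled for the organization and the call runs from the management account; treat it as an illustration rather than a recommended production policy.

```python
import json
import boto3

# Sketch of a resource control policy (RCP) that denies S3 access to any
# principal outside the organization. RCPs are deny-only guardrails, so the
# single statement uses Effect: Deny with Principal: "*".
rcp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "EnforceOrgIdentities",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "*",
            "Condition": {
                "StringNotEqualsIfExists": {
                    "aws:PrincipalOrgID": "o-exampleorgid"  # placeholder org ID
                },
                "BoolIfExists": {
                    "aws:PrincipalIsAWSService": "false"  # don't block AWS service principals
                },
            },
        }
    ],
}

# RCPs are managed through AWS Organizations, like SCPs.
org = boto3.client("organizations")
policy = org.create_policy(
    Name="restrict-s3-to-org-identities",
    Description="Deny S3 access to principals outside the organization",
    Type="RESOURCE_CONTROL_POLICY",
    Content=json.dumps(rcp_document),
)
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="r-examplerootid",  # placeholder: root, OU, or account ID
)
```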

Generative AI

It’s no surprise that generative AI took center stage, and rightfully so, as teams experiment with baking it into every part of their business. The theme was so prominent that most other categories of announcements included generative AI at least indirectly. The big surprises were Amazon Nova and AWS Trainium.

Nova makes cost-efficient foundation models broadly available, ensuring that engineering teams are building at the right level for performance, content, and safety. Building on years of learnings from operating Amazon Bedrock, this new family of models is going to be an accelerator for teams new to AI.
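
Because the Nova models are served through Bedrock, teams can call them with the same Converse API they may already use for other models. Here is a minimal sketch; the model identifier below is an assumption (Nova is typically invoked through a cross-region inference profile), so check the Bedrock console for the exact ID available in your region.

```python
import boto3

# Sketch: call an Amazon Nova model through the Bedrock Converse API.
# The model ID is an assumption; verify the identifier/inference profile
# for your account and region before relying on it.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="us.amazon.nova-lite-v1:0",  # assumed inference profile ID
    messages=[
        {"role": "user", "content": [{"text": "Summarize our re:Invent notes in three bullets."}]}
    ],
    inferenceConfig={"maxTokens": 300, "temperature": 0.2},
)

# The assistant's reply is returned as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```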

Responsible AI was front and center across the talk tracks as well. Bedrock has had Guardrails for some time, but that hasn’t stopped many customers from building their own controls and systems for the ethical use of AI. Datadog Staff Advocate Jason Yee joined Joe Croney, CTO of Arc XP at The Washington Post, to discuss how they ensure their models provide high-integrity results across Washington Post web properties. See the session here.
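
Teams building their own controls can still lean on Bedrock Guardrails as one layer: the ApplyGuardrail API lets you evaluate arbitrary text against an existing guardrail outside of a model call. The sketch below assumes a guardrail has already been created; the guardrail ID and version are placeholders.

```python
import boto3

# Sketch: check user input against an existing Amazon Bedrock guardrail
# using the ApplyGuardrail API. Guardrail ID/version are placeholders.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

result = runtime.apply_guardrail(
    guardrailIdentifier="gr-example1234",  # hypothetical guardrail ID
    guardrailVersion="1",
    source="INPUT",  # evaluate input before it ever reaches the model
    content=[{"text": {"text": "Tell me how to bypass the content policy."}}],
)

if result["action"] == "GUARDRAIL_INTERVENED":
    # The guardrail blocked or masked the content; outputs holds the messaging.
    print("Blocked:", result["outputs"][0]["text"])
else:
    print("Input passed the guardrail checks.")
```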

Security and identity

While most of the announcements around security and identity were part of pre:Invent, there was still great security content throughout the week. S3 Storage Lens was a major step forward in observing S3 buckets when it launched, and it allows you to find data “hot spots,” such as frequently accessed object storage, inside your environment so you can identify where to optimize.

In generative AI applications, the most foundational security concern is certainly the data: datasets for AI applications contain loads of sensitive information. That’s why the announcement of queryable object metadata for S3 buckets is a game changer for enhancing the security posture of these large and risky datasets. For the first time, AWS customers can create labels that identify data based on attributes like classification, regulatory compliance, and application, all inside the boundary of a single bucket. Combined with a data security posture management (DSPM) tool and the newly launched resource control policies, this will be a force multiplier for teams looking to see just what’s going on with all their data.
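
As a rough illustration, labels like these start life as user-defined object metadata written at upload time, which the new queryable metadata capability then lets you search across a bucket instead of inspecting objects one at a time. The bucket name, key, and label keys below are hypothetical.

```python
import boto3

# Sketch: attach classification and compliance labels to an object as
# user-defined metadata at write time. Bucket, key, and label names are
# hypothetical; the new S3 metadata capability makes attributes like these
# queryable in bulk rather than requiring a HEAD request per object.
s3 = boto3.client("s3")

with open("2024-12.parquet", "rb") as body:
    s3.put_object(
        Bucket="example-training-data",
        Key="datasets/customers/2024-12.parquet",
        Body=body,
        Metadata={
            "classification": "confidential",
            "regulation": "gdpr",
            "application": "recommendation-model",
        },
    )

# Individual objects can still be inspected directly.
head = s3.head_object(
    Bucket="example-training-data",
    Key="datasets/customers/2024-12.parquet",
)
print(head["Metadata"])
```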

Continuing the generative AI theme, there were talks like “Generative AI for security in the real world,” which showed how AWS uses generative AI internally to support teams in tasks like incident response and threat hunting. During the session, the speaker acknowledged that this carries significant risk, but it was still an interesting demonstration of how sweeping AI’s applicability has become. Another prominent theme was the emphasis on strong engineering cultures and “making security simple”: simple systems are knowable and therefore securable.

For other great talks, check out:

Another big topic was software supply chain security, from the IDE to production. We know builders and operators are struggling to keep up with the amount of triage and patching required to stay safe on a daily basis. Datadog launched our open source tool, Software Supply Chain Firewall, to help engineering teams stop malicious packages from making it into production, beginning at the endpoint. Check out the session, given by Datadog Security Research’s Andrew Krug and Zack Allen, here.

Securing the software supply chain session

Developer tools

There were a good number of flagship announcements around developer tools this year, with a strong focus on making engineering simpler with AI. One of the most impactful was the launch of the Amazon Q Developer agent. Q is a code-productivity tool that enables developers to use AI for tasks such as generating documentation, performing code reviews, building unit tests, and more. Q popped up as a common theme, and while teams are excited about it, it likely won’t replace your documentation or testing teams anytime soon.

Computing and operations

AWS has continued to win over engineering teams with a diverse set of compute offerings that help teams “right size” and scale resources as needed. This year we saw continued maturity in those offerings, with an emphasis on automation. Werner Vogels championed the principle of “simplexity” in the day-three keynote. Simplexity focuses on adaptability: it incentivizes engineering teams to design for evolution at the outset and to heavily embrace automation that reduces toil. “What do we automate?” Vogels said. “That’s the wrong question. The right question is: What don’t we automate?”

One exciting announcement was the launch of EKS Auto Mode. Auto Mode helps engineers scale their EKS infrastructure up and down as needed, based on fine-grained criteria gathered from observability tooling. Companies are generally great at scaling out for demand, but scaling those systems back down after a period of high activity is often forgotten, resulting in waste. Datadog customer Cambia explored this and more in the session “How Cambia supercharged their Amazon EKS-based platform.”
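
For a sense of what enabling Auto Mode looks like, here is a minimal sketch using boto3 and the compute, storage, and load balancing fields added to the EKS CreateCluster API for Auto Mode. All ARNs and subnet IDs are placeholders, and the exact prerequisites (IAM roles, access entries) are simplified; many teams will drive this through eksctl or Terraform instead.

```python
import boto3

# Sketch: create an EKS cluster with Auto Mode enabled so that compute,
# block storage, and load balancing are managed automatically. Role ARNs
# and subnet IDs are placeholders for illustration only.
eks = boto3.client("eks")

eks.create_cluster(
    name="demo-auto-mode",
    roleArn="arn:aws:iam::123456789012:role/eks-cluster-role",  # placeholder
    resourcesVpcConfig={"subnetIds": ["subnet-aaaa1111", "subnet-bbbb2222"]},
    accessConfig={"authenticationMode": "API"},  # Auto Mode relies on access entries
    computeConfig={
        "enabled": True,
        "nodePools": ["general-purpose", "system"],
        "nodeRoleArn": "arn:aws:iam::123456789012:role/eks-auto-node-role",  # placeholder
    },
    storageConfig={"blockStorage": {"enabled": True}},
    kubernetesNetworkConfig={"elasticLoadBalancing": {"enabled": True}},
)
```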

Cloud costs

As their environments evolve, teams are going to continue to leverage automation and observability to optimize costs and retire long-running AWS cloud resources. Billing and cost management is often viewed as one of the most tiring jobs in the cloud. The rise of FinOps has changed the conversation around incentivizing fiscal responsibility in cloud computing and has made cost-consciousness a norm in many organizations. Datadog Senior Advocate Ajuna Kyaruzi and Deeja Cruz, FinOps analyst at Datadog, explored this and more in the session “Driving cost optimization at scale.”

Optimizing for efficiency was a major thread running through all of these announcements. Cloud users at companies of all sizes are looking to stay lean and ensure they are investing where it matters most. Unused resources are a security risk, a drain on the wallet, and a contributor to environments that are difficult to maintain.

See you next year

This is just a short summary of the things that excited us at this year’s re:Invent. For more announcements from AWS re:Invent 2024, visit the AWS News Blog. And be sure to look out for more recaps, summaries, and great content from our team on how you can build on the AWS Cloud. We hope to see you at next year’s re:Invent.