Key Highlights from AWS re:Invent 2024: CEO Keynote with Matt Garman

by Gary Porter, Senior Solution Architect, Professional Services, Rackspace Technology

Watching Matt Garman take the stage at AWS re:Invent 2024 for his inaugural keynote as AWS CEO, I was struck by the significance of the moment. Addressing more than 60,000 attendees in Las Vegas and 400,000 online, Garman shared an ambitious vision for empowering customers — spanning foundational services like compute and storage to groundbreaking advancements in AI. His keynote underscored AWS’s mission to equip organizations with the tools they need to solve complex challenges and scale operations while shaping a brighter future. At the heart of this mission lies a steadfast commitment to innovation and transformation.

“We invent so you can reinvent”

Garman reflected on AWS’s origins and its enduring commitment to innovation. He highlighted the company’s unique approach of working backward from customer needs, a strategy that has consistently delivered impactful solutions. Garman also announced $1 billion in new global funding for startups in 2025, reaffirming AWS’s dedication to fostering innovation and empowering the next generation of builders.

Building blocks for the future with compute, storage and databases

AWS’s leadership in compute and security took center stage, with Garman describing these areas as the foundation for driving innovation. AWS continues to push the boundaries of compute with its custom-built processors: the fourth-generation Graviton processor delivers 45% better performance on Java workloads while using 60% less energy. Pinterest’s transition to Graviton, for example, cut its compute costs by 47% and its carbon emissions by 62%. Garman also emphasized that security is “built in” at every layer, helping AWS services remain secure by design.
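
For teams evaluating a move like Pinterest’s, the switch is largely a matter of choosing an arm64 AMI and a Graviton-based instance type. The sketch below shows the idea with boto3; the SSM parameter path and the m8g instance type are illustrative assumptions, not details from the keynote.

```python
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")

# Resolve the latest Amazon Linux 2023 arm64 AMI from the public SSM parameter.
ami_id = ssm.get_parameter(
    Name="/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-arm64"
)["Parameter"]["Value"]

# Launch a Graviton4-based instance (m8g family assumed here for illustration).
instances = ec2.run_instances(
    ImageId=ami_id,
    InstanceType="m8g.large",
    MinCount=1,
    MaxCount=1,
)
print(instances["Instances"][0]["InstanceId"])
```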

The next frontier, according to Garman, is generative AI workloads. Most of these workloads rely on GPUs, and AWS’s collaboration with NVIDIA has resulted in the P6 instances featuring Blackwell chips, which are 2.5 times faster than previous generations. AWS’s own Trainium processors for AI training continue to advance with the introduction of Trainium2 UltraServers (Trn2), designed to expand what large AI models can achieve within a single massive instance. Garman also announced the next-generation Trainium3 processor, slated for release in 2025, further strengthening AWS’s position as a leader in AI infrastructure.

Shifting gears to storage and databases

Following his discussion on compute, Garman turned his attention to AWS’s storage and database services. He highlighted how Amazon S3, now storing over 400 trillion objects, continues to evolve with features like S3 Intelligent-Tiering, which has saved customers over $4 billion.
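
As a rough illustration of how those savings accrue, objects can be written straight into the Intelligent-Tiering storage class so S3 moves them between access tiers automatically. The bucket and key below are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Store the object in S3 Intelligent-Tiering; S3 shifts it to cheaper access
# tiers automatically as access patterns change.
s3.put_object(
    Bucket="example-bucket",        # placeholder bucket name
    Key="logs/2024/12/app.log",     # placeholder object key
    Body=b"example payload",
    StorageClass="INTELLIGENT_TIERING",
)
```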

But as storage grows, so does its complexity. To address the increasing use of tabular data formats like Apache Parquet, Garman introduced Amazon S3 Tables, a new bucket type purpose-built for tabular data that stores tables in the Apache Iceberg format for faster analytics queries. Metadata management is also getting an upgrade with the announcement of Amazon S3 Metadata, now available in preview, which automatically captures object metadata and makes it queryable, further simplifying data management.
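
A minimal sketch of what working with S3 Tables might look like from boto3 is shown below, assuming a recent SDK version that includes the s3tables client; the call names and parameter shapes are best-effort assumptions rather than details from the keynote.

```python
import boto3

s3tables = boto3.client("s3tables", region_name="us-east-1")

# A table bucket is the new S3 bucket type purpose-built for tabular data.
bucket = s3tables.create_table_bucket(name="analytics-tables")
bucket_arn = bucket["arn"]

# Namespaces group related tables, much like schemas in a database.
s3tables.create_namespace(tableBucketARN=bucket_arn, namespace=["sales"])

# Tables are stored in the Apache Iceberg format and can then be queried from
# engines such as Athena, EMR or Redshift.
s3tables.create_table(
    tableBucketARN=bucket_arn,
    namespace="sales",
    name="orders",
    format="ICEBERG",
)
```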

Amazon Aurora, now celebrating 10 years as one of AWS’s fastest-growing services, continues to innovate at a rapid pace. Tackling the challenge of multi-region consistency, Garman explained how AWS revisited database fundamentals. By leveraging advancements in the Amazon Time Sync Service and streamlining the transaction commit process, Aurora has reduced time-to-consistency from seconds to microseconds. These advancements are now previewed in the new Amazon Aurora DSQL service, which is designed to enable low-latency, multi-region consistency for modern applications. Garman also noted that the same approach has been applied to NoSQL databases, enhancing Amazon DynamoDB Global Tables for faster and more consistent performance.
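
To make the DynamoDB side of that concrete, extending an existing table into a second Region is a single API call once the table meets the global tables prerequisites. The table and Region names below are placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Adding a replica converts the table into a multi-Region global table.
# The table must meet the global tables prerequisites (for example,
# DynamoDB Streams enabled with new and old images).
dynamodb.update_table(
    TableName="orders",  # placeholder table name
    ReplicaUpdates=[
        {"Create": {"RegionName": "eu-west-1"}},
    ],
)
```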

AI at the core of tomorrow’s applications

Generative AI is transforming industries, and during his keynote, Garman illustrated its potential with several compelling examples. Amazon Bedrock, AWS’s managed service for building and scaling generative AI applications with foundation models, is at the center of this transformation. Customers like Genentech leverage Amazon Bedrock for complex applications such as drug discovery, reducing processes that once took years to mere minutes.

One of the latest advancements is Amazon Bedrock Model Distillation, which transfers the capabilities of a large model into a smaller, faster model tuned for a specific use case. Additionally, Amazon Bedrock Guardrails help improve accuracy by applying safety policies and checks that reduce hallucinations in generative AI outputs.
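
In practice, attaching a guardrail to a model call is a small addition to a standard Bedrock request. The sketch below uses the boto3 Converse API; the model ID, guardrail ID and version are placeholder assumptions you would swap for your own resources.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed model ID for illustration
    messages=[
        {"role": "user", "content": [{"text": "Summarize our Q3 results."}]}
    ],
    # The guardrail screens both the prompt and the response against your policies.
    guardrailConfig={
        "guardrailIdentifier": "gr-example123",  # placeholder guardrail ID
        "guardrailVersion": "1",
    },
)
print(response["output"]["message"]["content"][0]["text"])
```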

Expanding capabilities further, Amazon Bedrock Agents enable multi-agent collaboration and workflows, allowing customers to automate complex tasks using natural language instructions. Moody’s used these tools during beta testing and found substantial time savings in its analytics workflows.
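
Invoking an agent built this way is similarly lightweight: you hand it a natural-language instruction and read back the streamed response. The agent and alias IDs below are placeholders, and multi-agent collaboration is configured on the agent itself rather than in this call.

```python
import boto3

agents = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agents.invoke_agent(
    agentId="AGENT123456",     # placeholder agent ID
    agentAliasId="ALIAS1234",  # placeholder agent alias ID
    sessionId="demo-session-1",
    inputText="Compile a risk summary for the latest portfolio review.",
)

# The agent streams its answer back as a series of chunks.
for event in response["completion"]:
    if "chunk" in event:
        print(event["chunk"]["bytes"].decode("utf-8"), end="")
```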

AWS also introduced Amazon Nova, a suite of multi-modal AI models optimized for diverse applications. Amazon Nova Lite benchmarks on par with or better than popular models like Llama and Gemini, while Amazon Nova Canvas and Amazon Nova Reel handle image and video generation, respectively. Garman teased even more ambitious advancements, including models that can accept and generate any combination of modalities, expected to launch in mid-2025.

Finally, Amazon Q, AWS’s AI-powered assistant for development and operations, took the spotlight for its ability to accelerate application modernization. By automating tasks such as unit testing, documentation and code reviews, Amazon Q reduces time-to-code and simplifies legacy platform migrations, enabling seamless upgrades for Windows and VMware applications.

Simplifying data and AI workflows

Simplifying workflows and unifying data access were central themes in Garman’s keynote. With Amazon Q Business, AWS offers businesses a way to consolidate data across formats and leverage powerful indexing capabilities. These features enable organizations to streamline workflows, automate processes and maintain robust security.

Garman also highlighted advancements in Amazon SageMaker, AWS’s central platform for data, analytics and AI model development. Recognizing how cumbersome it can be to jump between multiple consoles, AWS introduced Amazon SageMaker Unified Studio, a single interface where teams can manage all of their data, analytics and AI work in one place. To further simplify data access, AWS also unveiled Amazon SageMaker Lakehouse, which unifies access to data across Amazon S3, Amazon Redshift, SaaS applications and federated sources.

Bringing innovation to life through customer success

Over the course of Garman’s address, AWS’s ability to innovate came to life through compelling customer success stories. Pinterest’s adoption of Graviton processors led to significant cost savings and a 62% reduction in carbon emissions, while JPMorgan Chase demonstrated AWS’s capability to handle complex, high-performance workloads. Genentech showcased how Amazon Bedrock accelerates drug development, shrinking processes that once took years down to minutes. Moody’s highlighted the power of Amazon Bedrock Agents, which helped automate advanced analytics workflows and achieve substantial efficiency gains.

Garman concluded his keynote by reaffirming AWS’s commitment to providing customers with choice and control over their technology. From compute to AI and beyond, AWS is helping customers achieve their business goals while driving innovation across the cloud landscape.

At Rackspace Technology, we’re proud to partner with AWS to deliver these innovations to our customers. Visit our AWS Marketplace profile to explore AWS services available to your organization.

Learn more about how we can help you build the future on AWS