The Gist

  • Amazon enters the generative AI race. With the release of Amazon Bedrock, developers can integrate artificial intelligence (AI) systems into their software using foundation models (FMs) from various labs and companies.
  • Availability of new infrastructure. Built for machine learning and optimized for large-scale generative AI applications.
  • Code suggestions in real time. Amazon announced the general availability of AWS’s CodeWhisperer.

Amazon Bedrock, a cloud service that lets developers incorporate artificial intelligence systems with capabilities comparable to OpenAI's ChatGPT into their software, is among the latest round of generative AI development tools for Amazon Web Services (AWS) that the company unveiled Thursday.

OpenAI. Microsoft. Google. Baidu. Databricks. Now Amazon. Brace yourself for a long race for the top spot in generative AI. And if you're a marketer or customer experience professional, this means you'll have choices — lots — in terms of where to put your content, marketing campaigns and customer data analysis programs.

Andy Jassy, Amazon’s president and CEO, acknowledged that generative AI applications like ChatGPT have captured widespread attention. He added, “We believe most customer experiences and applications will be reinvented with generative AI” but noted that Amazon has been extensively utilizing machine learning for the last 25 years across various areas such as personalized ecommerce recommendations, fulfillment center pick paths, Prime Air drones and Alexa.

“We have been working on our own LLMs for a while now, believe it will transform and improve virtually every customer experience, and will continue to invest substantially in these models across all of our consumer, seller, brand, and creator experiences,” Jassy said. “Additionally, as we’ve done for years in AWS, we’re democratizing this technology so companies of all sizes can leverage Generative AI ... Let’s just say that LLMs and Generative AI are going to be a big deal for customers, our shareholders, and Amazon.”

Scaling Generative AI-Based Applications Using Foundation Models

According to company officials, Amazon Bedrock will make it easier to access high-performing foundation models through API access to FMs from various labs and companies, including AI21 Labs, Anthropic and Stability AI. In the coming months, broad access will also be available to Amazon’s two new Titan FMs: a generative large language model (LLM) for tasks such as summarization and text generation, and an embeddings LLM that translates text inputs into numerical representations.
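
Bedrock is only in limited preview, so the developer-facing API may still change; the sketch below is a rough illustration of the kind of single-API-call access the service is meant to provide. It assumes a boto3-style “bedrock-runtime” client, a Titan text model identifier and a particular request schema, all of which are assumptions rather than details confirmed in the announcement.

```python
import json
import boto3

# Hypothetical sketch: the client name, model ID and request schema are assumptions,
# since Bedrock is only in limited preview at the time of writing.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan text-generation model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Summarize the key benefits of managed foundation model APIs.",
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    }),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```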

Swami Sivasubramanian, VP of data and machine learning at AWS, shared Amazon’s AI news in a blog post.

“Bedrock is the easiest way for customers to build and scale generative AI-based applications using FMs, democratizing access for all builders,” Sivasubramanian said. “Bedrock will offer the ability to access a range of powerful FMs for text and images — including Amazon’s Titan FMs, which consist of two new LLMs we’re also announcing today — through a scalable, reliable, and secure AWS managed service.”

With the new tech, Sivasubramanian said, customers will have the flexibility to find a model that meets their requirements, customize FMs with their own data, and integrate and deploy them in their applications using AWS tools and integrations.

“Bedrock makes the power of FMs accessible to companies of all sizes so that they can accelerate the use of ML across their organizations and build their own generative AI applications because it will be easy for all developers,” Sivasubramanian said. “None of the customer’s data is used to train the underlying models, and since all data is encrypted and does not leave a customer’s Virtual Private Cloud (VPC), customers can trust that their data will remain private and confidential.”

Bedrock is now available in limited preview.

Amazon Enters Generative AI Race — and It's Not Done

In addition to Bedrock, Amazon announced the general availability of new infrastructure built for machine learning, including Amazon EC2 Trn1n instances powered by AWS Trainium with 1,600 Gbps of network bandwidth, and Amazon EC2 Inf2 instances powered by AWS Inferentia2 and optimized for large-scale generative AI applications.
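
For teams that want to run generative AI inference on the new silicon themselves, provisioning looks like any other EC2 launch. The snippet below is a minimal sketch that assumes an Inf2 instance type and uses a placeholder AMI ID; in practice you would choose an image with the AWS Neuron SDK for Inferentia2 preinstalled.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Minimal sketch: the AMI ID below is a placeholder, not a real image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder; substitute a Neuron-ready AMI
    InstanceType="inf2.xlarge",       # Inferentia2-powered instance for inference
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```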

Further, building on last year’s limited preview, Amazon announced the general availability of AWS’s CodeWhisperer, which generates code suggestions in real time for Python, Java, JavaScript, TypeScript and C#, as well as 10 new languages.
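
To give a sense of how that works in an editor, the example below is illustrative rather than captured CodeWhisperer output: a developer types a comment or function signature, and the assistant proposes an inline completion along the lines of the body shown here.

```python
# Developer types this comment in the editor:
# upload a local file to an S3 bucket and return the object URL

# An inline suggestion similar to the following might then be offered
# (illustrative only, not actual CodeWhisperer output):
import boto3


def upload_file_to_s3(file_path: str, bucket: str, key: str) -> str:
    """Upload a local file to S3 and return the object's URL."""
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"
```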

In a letter to investors published on April 13, Amazon's Jassy confirmed the company will be investing heavily in large language models (LLMs) and generative AI. While noting that machine learning has held “high promise for several decades,” Jassy said it’s only within the last five to 10 years that companies have begun adopting it widely, spurred by “access to higher volumes of compute capacity at lower prices than was ever available.”
