Generative AI Highlights AWS Summit New York City 2023

Perennial cloud computing leader Amazon Web Services (AWS), perhaps feeling the generative AI heat from fellow cloud giants Microsoft and Google, highlighted several of the company's advancements in that space at the recent AWS Summit New York City 2023.

During a keynote presentation last Thursday, AWS exec Swami Sivasubramanian quickly made it clear that Gen AI was a focal point for both the company and the conference.

"Generative AI has captured our imaginations for its ability to create images and videos, write stories and even generate code," said Sivasubramanian, VP of Databases, Analytics and ML at AWS. "I believe it will transform every application, industry and business."

Sivasubramanian isn't alone in that thinking, though so far Microsoft and Google have made more news than AWS amid the Gen AI tsunami set off last year by ChatGPT, the chatbot from Microsoft partner OpenAI.

However, while AWS might have seemed a step behind its cloud rivals in this important new technology, it has been making moves to catch up, such as last month's announcement of a $100 million investment in a new program, as detailed in the AWS Insider article, "AWS Invests $100 Million in New Generative AI Innovation Center."

Among seven AI-related highlights from the summit summarized by AWS, two concerned Amazon Bedrock, a fully managed service that makes foundation models (FMs) from leading AI startups and Amazon available via an API. Foundation models are large machine learning models trained on a vast quantity of data at scale, making them suitable for a wide range of downstream tasks, according to Wikipedia.
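For readers curious what "available via an API" looks like in practice, here is a minimal sketch that simply lists the foundation models exposed to an account. It assumes a recent boto3 release that includes the Bedrock client and a Region where Bedrock access has been enabled.

    # Minimal sketch: listing the foundation models exposed through Amazon Bedrock.
    # Assumes a recent boto3 with the "bedrock" client and an AWS Region where
    # Bedrock access has been enabled for the account.
    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")

    response = bedrock.list_foundation_models()

    for model in response["modelSummaries"]:
        print(model["modelId"], "-", model.get("providerName", "unknown"))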

AWS announced a Bedrock expansion that adds additional FMs, a new model provider, and an advanced capability to help customers build Gen AI applications. Cohere is the new FM provider, and the latest FMs come from Anthropic and Stability AI. AWS also showcased a capability for creating fully managed agents in just a few clicks, which it described as "a game-changing feature" that can help builders regardless of their ML expertise.

"Agents for Amazon Bedrock is a new, fully managed capability that makes it easier for developers to create generative-AI based applications that can complete complex tasks for a wide range of use cases and deliver up-to-date answers based on proprietary knowledge sources," AWS said in a July 26 news release. "With just a few clicks, agents for Amazon Bedrock automatically break down tasks and create an orchestration plan -- without any manual coding. The agent securely connects to company data through a simple API, automatically converting data into a machine-readable format, and augmenting the request with relevant information to generate the most accurate response."

As for the new FMs, they include Claude 2 from Anthropic -- a large language model (LLM) that can generate text for several purposes, including writing code -- and Stable Diffusion XL 1.0 from Stability AI, a text-to-image model capable of generating realistic visuals for films, television, music and instructional videos.
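As a hedged example of what calling one of these models might look like, the snippet below asks Claude 2 on Bedrock to write a small piece of code. It assumes model access has been granted in the account; the request body follows Anthropic's Human/Assistant prompt format, and the model ID shown is the one Bedrock lists for Claude 2.

    # Sketch: asking Claude 2 on Amazon Bedrock to generate a short code snippet.
    # Assumes boto3 with the "bedrock-runtime" client and access granted to
    # the "anthropic.claude-v2" model in the account.
    import json
    import boto3

    runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

    body = json.dumps({
        # Anthropic's prompt format wraps input in Human/Assistant turns.
        "prompt": "\n\nHuman: Write a Python function that reverses a string.\n\nAssistant:",
        "max_tokens_to_sample": 300,
        "temperature": 0.5,
    })

    response = runtime.invoke_model(
        modelId="anthropic.claude-v2",
        contentType="application/json",
        accept="application/json",
        body=body,
    )

    print(json.loads(response["body"].read())["completion"])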

"These FMs join AWS's existing offerings on Amazon Bedrock, including models from AI21 Labs and Amazon, to help meet customers where they are on their machine learning journey, with a broad and deep set of AI and ML resources for builders of all levels of expertise," AWS said in a post reviewing Gen AI innovations discussed at last week's event.

In addition to the Bedrock announcements, AWS highlighted:

  • Amazon OpenSearch Serverless now supports a vector engine, which makes it easier for customers to use vectors for search and generative AI applications without managing a vector database infrastructure (a query sketch follows this list). Vector embeddings enable machines to understand relationships across text, images, audio and video content in an ML-digestible format.
  • Amazon QuickSight introduced generative BI, which allows customers to generate business intelligence using natural language questions and share insights with visualizations in seconds.
  • AWS HealthScribe is a new service that uses speech recognition and generative AI to create transcripts and clinical notes from patient-clinician conversations, helping health care professionals save time and focus on patients.
  • Amazon EC2 P5 instances are now available, powered by NVIDIA H100 GPUs, which are optimized for training and running large language models and generative AI applications faster than ever.
  • AWS offers on-demand skills training for different audiences who want to learn about, implement and use generative AI, including developers, engineers, data scientists, executives and AWS Partners.
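To make the vector engine item above more concrete, here is a rough sketch of the kind of k-NN similarity query it enables against an OpenSearch Serverless collection. The endpoint, index name, field name and embedding values are placeholders; in a real application the query embedding would come from an embedding model, for example one served through Bedrock.

    # Sketch: k-NN similarity search against a vector field in an OpenSearch
    # Serverless collection. Endpoint, index and field names are placeholders.
    import boto3
    from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

    credentials = boto3.Session().get_credentials()
    auth = AWSV4SignerAuth(credentials, "us-east-1", "aoss")  # "aoss" = OpenSearch Serverless

    client = OpenSearch(
        hosts=[{"host": "YOUR-COLLECTION-ID.us-east-1.aoss.amazonaws.com", "port": 443}],
        http_auth=auth,
        use_ssl=True,
        connection_class=RequestsHttpConnection,
    )

    # Placeholder embedding; real embeddings typically have hundreds of dimensions.
    query_vector = [0.12, -0.48, 0.33]

    results = client.search(
        index="product-descriptions",
        body={
            "size": 3,
            "query": {"knn": {"embedding": {"vector": query_vector, "k": 3}}},
        },
    )

    for hit in results["hits"]["hits"]:
        print(hit["_score"], hit["_source"].get("title"))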

About the Author

David Ramel is an editor and writer for Converge360.
