AWS Invests $100 Million in New Generative AI Innovation Center
To help customers build and deploy generative AI solutions, Amazon Web Services (AWS) today launched the Generative AI Innovation Center, a new program that connects AWS AI and machine learning experts with customers around the world.
AWS is investing $100 million in the program, which offers workshops, engagements and training to help customers design and launch new generative AI products, services and processes.
For anyone needing a reminder in this age of advanced machine learning constructs like ChatGPT -- which is based on the GPT series of large language models (LLMs) from Microsoft partner OpenAI -- generative AI is a type of AI that can create new content and ideas, such as conversations, stories, images, videos and music.
It is powered by LLMs that are pre-trained on vast amounts of data and commonly referred to as foundation models (FMs). With generative AI, customers can reinvent their applications, create entirely new customer experiences, drive unprecedented levels of productivity and transform their business, industry sources say.
ChatGPT started the generative AI craze late last year, giving Microsoft a lead in the cloud AI space thanks to its huge investment in OpenAI. Google, quickly sensing the danger to its flagship search functionality, soon declared a "code red" initiative to catch up, resulting in the experimental Bard AI chatbot, which is based on the company's own LLM advancements and which could be viewed as the company's answer to Microsoft's "new Bing."
That seemed to leave AWS in a distant third place among cloud giants leveraging advanced AI, so the company has made some moves of its own, the latest being today's announcement of the Generative AI Innovation Center.
The program helps customers use AWS generative AI services, such as:
- Amazon CodeWhisperer, an AI-powered coding companion
- Amazon Bedrock, a fully managed service that makes foundation models (FMs) from AI21 Labs, Anthropic and Stability AI accessible via an API
- Amazon Titan, Amazon's own family of machine learning FMs, also accessible through the Bedrock API
These services enable customers to generate realistic and artistic images, code suggestions, summaries, translations and more with just a few lines of code or natural language prompts, said AWS. The company also noted that the program helps customers train and run their own models using high-performance infrastructure powered by AWS-designed ML chips and NVIDIA GPUs. Additionally, the program provides guidance on best practices for applying generative AI responsibly and optimizing machine learning operations to reduce costs.
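To illustrate the "few lines of code" claim, here is a minimal sketch of what calling a text model through Amazon Bedrock can look like. The model ID and the prompt-body fields are assumptions (each model family defines its own request schema, so check the Bedrock documentation for the model you actually use); the live `boto3` call is commented out because it requires AWS credentials and model access.

```python
import json

def build_request(prompt: str, max_tokens: int = 200) -> dict:
    """Build a JSON-serializable request body for a text-generation model.

    The field names below follow an Anthropic-style schema and are an
    assumption for illustration; other Bedrock model families expect
    different keys.
    """
    return {
        "prompt": prompt,
        "max_tokens_to_sample": max_tokens,
    }

# Serialize the request body the way the Bedrock runtime API expects it.
body = json.dumps(build_request("Summarize this support ticket: ..."))

# The actual invocation (shown for shape only, not executed here):
# import boto3
# client = boto3.client("bedrock-runtime")
# resp = client.invoke_model(modelId="anthropic.claude-v2", body=body)
# print(json.loads(resp["body"].read()))
```

The point of the sketch is simply that the application code reduces to building a prompt payload and making one API call, with AWS managing the model hosting behind the endpoint.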
"The center's team of strategists, data scientists, engineers, and solutions architects, as well as experts from the AWS Partner Network, will work step-by-step with customers to build bespoke solutions that harness generative AI," said AWS exec Uwem Ukpong in a June 22 post titled "What generative AI means for businesses and how AWS can help."
Ukpong continued: "For example, health care and life sciences companies can pursue ways to accelerate drug research and discovery. Manufacturers can build solutions to reinvent industrial design and processes. And financial services companies can develop ways to provide customers with more personalized information and advice."
Other moves AWS has made that might be seen as catch-up attempts are detailed in the Virtualization & Cloud Review articles "AWS Makes Its Own Generative AI Moves with 'Bedrock' Dev Service, New LLMs" and "While Some Call for AI Pause, AWS Launches Generative AI Accelerator."
David Ramel is an editor and writer for Converge360.