Amazon Bedrock Now Ready for Prime Time; QuickSight Gets More Intuitive

AWS solidifies its place in the generative AI horse race with a flurry of announcements.

Amazon Web Services, which has found itself in the unusual position of trailing rivals Microsoft and Google in the breakneck AI arms race, this week announced a series of generative AI product milestones.

Foremost is the general availability of Bedrock, which comes roughly five months after its debut. AWS' answer to Microsoft's Azure OpenAI Service, Bedrock gives organizations access to foundation models (large, general-purpose models, of which large language models are one variety) via an API so they can build generative AI-enabled applications with relatively little overhead.

Users of Bedrock, which is HIPAA-eligible and GDPR-compliant, "can easily experiment with a variety of top FMs and customize them privately with their proprietary data," according to AWS. "Additionally, Amazon Bedrock offers differentiated capabilities like creating managed agents that execute complex business tasks -- from booking travel and processing insurance claims to creating ad campaigns and managing inventory -- without writing any code." 

Currently, Bedrock carries foundation models from AWS partners including AI21 Labs, Cohere, Stability AI and Anthropic. Notably, Amazon announced a $4 billion investment in Anthropic just days ago, giving it a minority stake in the company and making it one of the most formidable backers of Anthropic's Claude AI chatbot. Claude competes with ChatGPT, which is made by OpenAI and backed by AWS rival Microsoft.
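Bedrock's API-first design means invoking one of these partner models takes only a few lines of SDK code. Below is a minimal sketch of calling Claude through the Bedrock runtime endpoint using the boto3 SDK; the model ID, prompt format, and parameter names reflect Bedrock's conventions around its launch and should be treated as assumptions to verify against current AWS documentation, not a definitive example.

```python
# Minimal sketch: invoking Anthropic's Claude through the Amazon Bedrock runtime API.
# Model ID, prompt format, and parameter names are assumptions based on Bedrock's
# documented conventions at launch; verify against current AWS documentation.
import json

import boto3

# The "bedrock-runtime" client handles model invocation ("bedrock" handles management).
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude (circa Bedrock's GA) expects a Human/Assistant-style prompt and a
# max_tokens_to_sample cap in the request body.
payload = {
    "prompt": "\n\nHuman: Summarize the benefits of managed foundation models.\n\nAssistant:",
    "max_tokens_to_sample": 300,
    "temperature": 0.5,
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # assumed model ID; depends on what your account can access
    body=json.dumps(payload),
    contentType="application/json",
    accept="application/json",
)

# The response body is a stream; Claude returns its text under the "completion" key.
result = json.loads(response["body"].read())
print(result["completion"])
```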

Also accessible on Bedrock is AWS' own Titan Embeddings model, which is now generally available. Part of the Titan family of general-purpose FMs, Titan Embeddings "converts text into numerical representations (known as embeddings) to power RAG use cases," explained AWS machine learning VP Swami Sivasubramanian in a blog post. RAG, short for retrieval-augmented generation, is "a popular model-customization technique where an FM connects to a knowledge source that it can reference to augment its responses." 
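To make that description concrete, here is a minimal sketch of the RAG pattern Sivasubramanian describes: Titan Embeddings converts both a small document set and a user question into vectors, the closest document is retrieved by similarity, and that text is spliced into the prompt an FM would ultimately answer. The model ID and the request/response field names are assumptions based on Bedrock's documented conventions, and a real system would use a vector database rather than an in-memory list.

```python
# Minimal RAG sketch using Titan Embeddings on Bedrock: embed documents, retrieve the
# closest match to a question by cosine similarity, and prepend it to a prompt.
# The model ID ("amazon.titan-embed-text-v1") and the "inputText"/"embedding" fields
# are assumptions based on Bedrock's documented request/response shapes.
import json
import math

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def embed(text: str) -> list[float]:
    """Convert text into a numerical vector using Titan Embeddings."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
        contentType="application/json",
        accept="application/json",
    )
    return json.loads(response["body"].read())["embedding"]


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))


# A toy "knowledge source" -- in practice this would be a vector database.
documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm Eastern, Monday through Friday.",
]
doc_vectors = [embed(d) for d in documents]

question = "When can customers get their money back?"
q_vector = embed(question)

# Retrieve the most relevant document and use it to augment the prompt sent to an FM.
best_doc = max(zip(documents, doc_vectors), key=lambda pair: cosine(q_vector, pair[1]))[0]
augmented_prompt = f"Answer using this context:\n{best_doc}\n\nQuestion: {question}"
print(augmented_prompt)
```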

In other Bedrock news, the service will be the first to carry Meta's Llama 2 model when it becomes available "in the next few weeks," according to AWS. And, to familiarize users with Bedrock, AWS has launched a one-hour online training course for the service.       

Other Gen AI News
AWS' BI and data visualization service, Amazon QuickSight, is also getting a generative AI boost. As of this week, a new "generative BI authoring" capability lets QuickSight users describe the visualizations they want in natural language, even when their prompts are vague or incomplete.

QuickSight is already able to answer natural-language queries via the QuickSight Q feature that debuted in 2021. The generative BI authoring capability announced this week makes QuickSight's comprehension of natural-language queries even more intuitive. As Sivasubramanian explained, the update "extend[s] the natural-language querying of QuickSight Q beyond answering well-structured questions (e.g., 'what are the top 10 products sold in California?') to help analysts quickly create customizable visuals from question fragments (e.g., 'top 10 products'), clarify the intent of a query by asking follow-up questions, refine visualizations, and complete complex calculations."       

Finally, AWS' AI-powered coding assistant CodeWhisperer will soon let users tailor its code recommendations to include "their organization's internal, private code base (e.g., internal APIs, libraries, packages, and classes)."

While CodeWhisperer has been trained on "billions of lines of publicly available code," per Sivasubramanian, it could not previously be customized with an individual organization's private code, which limited the relevance of its recommendations. An upcoming customization capability (currently in preview) will remedy this by letting organizations connect CodeWhisperer to their proprietary repositories, enabling the service to recommend private code alongside public code.

"The CodeWhisperer customization capability can save developers hours spent searching and modifying sparsely documented code," said Sivasubramanian, "and helps onboard developers who are new to the company faster."

About the Author

Gladys Rama (@GladysRama3) is the editorial director of Converge360.
