Amazon Committing Millions to Develop 'Olympus' LLM: Reports
Amazon is reportedly harnessing its considerable financial and research resources to develop a large language model (LLM) of its own.
Media reports this week indicate that Amazon is spending millions of dollars to train a high-powered LLM codenamed "Olympus."
Amazon already has its own family of generative AI models under the "Titan" brand, but those have attracted nothing approaching the buzz or adoption of OpenAI's various GPT versions.
Amazon cloud rival Microsoft, which holds a minority stake in OpenAI, has benefitted greatly from the latter's success. If Amazon's reported Olympus effort comes to fruition, it could disrupt Microsoft's dominance of the generative AI space.
In an article Monday, Reuters, citing two anonymous sources familiar with Amazon's plans, said Olympus has 2 trillion parameters, nearly double GPT-4's estimated count. While a higher parameter count doesn't necessarily make an LLM more accurate, it can contribute to the nuance, complexity and relevance of its output.
A separate report by The Information, which revealed the "Olympus" moniker, says Amazon plans to sell the resulting technology to corporate enterprises, as well as to use it in its online retail business, its Amazon Web Services cloud and its Alexa natural-language assistant.
So far, Amazon has not confirmed these reports, though it could announce the project publicly in December, per The Information's source. That timing would coincide with Amazon's annual cloud conference, AWS re:Invent, which runs Nov. 27 through Dec. 1 this year.