The US-based e-commerce giant Amazon is reportedly making a significant investment in training an ambitious large language model (LLM) codenamed ‘Olympus.’ The model is expected to compete with OpenAI’s ChatGPT and Google’s Bard.
According to a report from Reuters, the Olympus model has two trillion parameters, which could make it one of the largest models currently being trained.
OpenAI’s GPT-4, one of the best models available, is reported to have one trillion parameters. Sources told Reuters that the team is spearheaded by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy.
As Amazon’s head scientist of artificial general intelligence (AGI), Prasad brought together researchers who had been working on Alexa AI and members of the Amazon science team to work on training the models.
“Amazon has already trained smaller models such as Titan. It has also partnered with AI model startups such as Anthropic and AI21 Labs, offering them to Amazon Web Services (AWS) users,” the report said.
As per Reuters, Amazon is investing in large language models (LLMs) to make its cloud computing platform, AWS, more appealing to enterprise customers. LLMs are the technology behind AI tools that can generate human-like responses after being trained on massive amounts of data.
However, training larger artificial intelligence models is more expensive due to the significant computing power needed.
In an earnings call last April, Amazon executives indicated that the company would boost its investment in LLMs and generative AI while reducing spending on fulfillment and transportation in its retail operations, the report said.
Meanwhile, Amazon rival OpenAI has announced that it is rolling out custom versions of its AI chatbot ChatGPT that people can create for specific purposes. Called GPTs, these tailored models offer a new way for anyone to build a version of ChatGPT that is more helpful in daily life, at specific tasks, at work, or at home, and then share that creation with others.
(With inputs from Reuters)