Explore how Amazon's additional $8B investment in Anthropic drives generative AI model innovation, featuring AWS Trainium chips and cloud infrastructure.
Andrej Karpathy sees LLMs reading books along with humans. A recent Amazon job listing suggests the company may already be working on this.
Speaking at Fortune’s Brainstorm AI conference in San Francisco, Amazon SVP and head of Artificial General Intelligence Rohit Prasad dismissed concerns about the industry’s feared “AI wall.”
Amazon says that it's establishing a new R&D lab in San Francisco, the Amazon AGI SF Lab, to focus on building AI agents.
The company also announced that it has made significant progress with Trainium, its custom chip. In fact, AWS CEO Matt Garman revealed that Apple has been an early adopter and longtime beta tester of the company’s Trainium chips, indicating a close relationship between the two technology giants.
Streaming services like Netflix have long used machine learning algorithms to spit out recommendations based on viewing history. But with this announcement, it seems Prime Video wants to go a step further.
Ever hit a dead end when looking for the next thing to watch on Prime Video? The recommendation algorithm can be irritating, often failing to surface what you actually want to watch next. Amazon is trying to solve this problem, and ‘AI Topics’ is its most recent attempt.
Amazon's Rufus AI shopping assistant offers an early glimpse into how artificial intelligence could reshape product discovery and purchase behavior online.
At AWS re:Invent, Amazon tips a slew of upgrades in everything from storage and databases to new computing chips and various AI tools, mostly aimed at reducing cost and complexity.
Amazon executives have been unveiling crucial pieces of their AI strategy at their flagship AWS re:Invent conference in Las Vegas this week. One key element is a new portfolio of in-house foundation models, dubbed Nova, that can handle text, image, and video queries.
At its re:Invent conference on Tuesday, Amazon Web Services (AWS), Amazon’s cloud computing division, announced a new family of multimodal generative AI models it calls Nova.
AWS CEO Matt Garman said he sees AI inference as a fourth building block for AWS, joining cloud computing, storage, and database services. Amazon’s Trainium, Inferentia, and Graviton processors were a big focus, continuing to evolve as alternatives to chips from Nvidia and other companies.