According to some industry analysts, AMD urgently needs large customers in the field of computational acceleration. It is therefore no surprise that the event at which the new MI300A and MI300X accelerators were demonstrated included an on-stage appearance by AWS, Amazon’s cloud division. The company is even considering deploying the new AMD accelerators, AWS told Reuters.
As AMD CEO Lisa Su explained in an interview with Reuters, her company is trying to win over major cloud providers by supplying all the components needed to build systems that can run services like the ChatGPT AI chatbot, while leaving customers ample freedom in configuration and relying on industry-standard protocols and interfaces.
Dave Brown, Amazon’s vice president of elastic cloud computing, told Reuters that AWS is considering using AMD’s MI300 series accelerators in its infrastructure. According to him, negotiations with AMD are still ongoing and nothing has been finally decided, but the company sees a clear advantage in being able to integrate these accelerators into its existing systems.
AWS declined to work with NVIDIA when that company offered its ready-made DGX Cloud platform. Brown said NVIDIA’s proposed business model does not make much sense for AWS, which prefers to design its systems from the ground up rather than rely on turnkey solutions. Since March of this year, however, AWS has been using NVIDIA H100 accelerators, albeit within systems of its own design.