Advanced Micro Devices estimated the market for data center artificial intelligence processors at $45 billion this year as it launched a new generation of AI chips on Wednesday. That’s up from the chipmaker’s $30 billion estimate in June.
The semiconductor company also expects its MI300 AI accelerator to be the fastest product in AMD’s history to reach $1 billion in sales. It is positioned as an alternative to Nvidia’s H100 GPU, which has become famous for processing AI tasks and powering chatbots and services such as Microsoft’s $30-a-month Copilot.
AMD says the MI300 accelerator processes AI operations up to 1.6 times faster than the H100. It’s aimed at cloud service providers and hyperscalers that use large amounts of computing for data analytics, machine learning, deep neural networks, and more. The total addressable market for the MI300 and a second chip geared toward supercomputers is expected to be more than $37 billion this year, up from $23 billion in 2024.
A key differentiator is that the MI300 can be paired with CPUs in servers to boost performance on jobs that demand more memory and computing capability. AMD said that a server with eight MI300X accelerators and 1.5 terabytes of HBM3 memory would deliver up to 7.5 petaflops for specific AI applications. The processors also support a fast networking technology called InfiniBand that allows them to communicate with each other at up to 10 gigabits per second.
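As a rough sanity check on those server-level numbers, the short sketch below simply divides the quoted totals evenly across the eight accelerators. The even per-GPU split is an assumption made for illustration, not a published AMD specification.

```python
# Back-of-the-envelope breakdown of the eight-accelerator server AMD described.
# Uses only the figures quoted above; the even per-GPU split is an assumption
# for illustration, not an AMD spec.

ACCELERATORS_PER_SERVER = 8
TOTAL_HBM3_TB = 1.5          # terabytes of HBM3 across the server
TOTAL_COMPUTE_PFLOPS = 7.5   # petaflops quoted for specific AI applications

memory_per_gpu_gb = TOTAL_HBM3_TB * 1024 / ACCELERATORS_PER_SERVER
compute_per_gpu_pflops = TOTAL_COMPUTE_PFLOPS / ACCELERATORS_PER_SERVER

print(f"HBM3 per accelerator:    {memory_per_gpu_gb:.0f} GB")       # 192 GB
print(f"Compute per accelerator: {compute_per_gpu_pflops:.2f} PF")  # ~0.94 PF
```

Under that assumption, each accelerator would carry roughly 192 GB of HBM3 and just under one petaflop of the quoted throughput.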
Another version of the MI300 processor is aimed at training generative AI applications, such as those used to create computer models for autonomous vehicles and virtual assistants. This type of AI is more demanding on the system because the model must be trained on large batches of images at once and then run through various scenarios to learn what works best. AMD said that the version of the chip for generative AI will be available next month and deliver 43.2 teraflops of FP32 floating-point performance and 51.2 trillion operations per second (TOPS) on integer workloads.
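To put those throughput figures in perspective, the sketch below converts the quoted peak rates into a rough time-to-complete for a fixed amount of work; the one-exa-operation workload is a made-up example, not an AMD benchmark, and real-world efficiency would fall short of peak.

```python
# Illustrative unit conversion only: how long a hypothetical fixed workload
# would take at the quoted peak rates, ignoring real-world utilization.

FP32_TFLOPS = 43.2   # quoted FP32 floating-point throughput
INT_TOPS = 51.2      # quoted integer throughput

WORK_OPS = 1e18      # hypothetical workload: one exa-operation

fp32_hours = WORK_OPS / (FP32_TFLOPS * 1e12) / 3600
int_hours = WORK_OPS / (INT_TOPS * 1e12) / 3600

print(f"FP32:    {fp32_hours:.1f} hours")  # ~6.4 hours
print(f"Integer: {int_hours:.1f} hours")   # ~5.4 hours
```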
In addition to the MI300 and the new supercomputer processor, AMD is introducing a software tool for building AI systems that is designed to improve the accuracy of machine learning algorithms. The software can be applied to image and audio data to help find patterns in complex structures such as trees, faces, or bodies, and it can also be used to recognize speech and text.
However, AMD has yet to see the same revenue liftoff that Nvidia is experiencing because other segments of its business are down. These include gaming as well as its communications and industrial businesses, the latter stemming mainly from the early-2022 acquisition of Xilinx.
The company is counting on the AI market to counteract a slowdown in other parts of its business. It will likely have to wait a while for its bet on the AI chip market to pay off, but it hopes to build a more significant presence in the space than its rivals. In the meantime, the company’s shares are up more than 70% this year.