Advanced Micro Devices Inc. gave Wall Street a bold projection for revenue from its first AI chips for next year, and while the outlook may look like a high hurdle, it’s quite feasible that Chief Executive Lisa Su can deliver on her promise.
Su told analysts on AMD’s earnings call Tuesday that the company could see data-center revenue from its graphics processing units (GPUs) of about $400 million in the fourth quarter, while that total could exceed $2 billion in 2024 as revenue continues to ramp.
“This growth would make MI300 the fastest product to ramp to a billion dollars in sales in AMD history,” Su said.
Su’s forecast was bolder than what some on Wall Street had been predicting. Wells Fargo analyst Aaron Rakers said in a note late Tuesday that Su’s forecast was “solidly ahead” of his recently raised $1.7 billion estimate for AMD’s data-center GPU revenue.
The upbeat AI talk helped offset a less rosy overall outlook for the fourth quarter, driven by drags from AMD’s gaming and embedded-chip businesses. That forecast initially sent AMD shares down about 5% after hours following the release of results, but the stock finished the extended session off less than 1% as the AI commentary appeared to soothe investors.
Su made a compelling case that AMD will be able to hit its optimistic forecast. She talked about the company’s engagements with customers for its new chip family called the MI300, the strong demand for chips that are designed for artificial intelligence and the “significant” progress AMD has seen so far with the MI300.
“I think we’re very happy with how the technical milestones look, and then also we’ve made significant progress from a customer side,” she said, calling out strong engagement with customers ranging from hyperscalers to enterprise customers to new AI startups. The company has not yet named any of those potential customers.
AI chips could help fuel growth in AMD’s data-center segment, which came in flat at $1.6 billion during the third quarter. AMD has made huge inroads with corporate customers since its return to the server market years ago under Su, and those enterprise customers should be much more willing to try new chips from AMD now.
Su said AMD’s new graphics-processor chips designed to accelerate AI will be targeted at workloads for both inference and training, but she emphasized their performance in inference workloads.
The training area of AI is currently dominated by Nvidia Corp., which has already seen huge increases in revenue from companies looking to run AI in data centers. But Nvidia’s supply has been constrained, and the CEO of d-Matrix, a chip startup, told MarketWatch recently that customers are looking around for alternatives.
In recent days, investors have gotten nervous that the latest U.S. ban on certain advanced chips to China could affect billions of dollars in future revenue for Nvidia, based on a Wall Street Journal report.
The ban on China sales could potentially affect AMD as well, and it’s not the only risk for the company. Bernstein Research analyst Stacy Rasgon pointed out some other risks to AMD’s AI ambitions in a note prior to AMD’s earnings on Tuesday. “While we do suspect incremental revenues will begin sooner (and expect the company to talk up new customer wins) we do not expect real volume until the [second half], (which could somewhat limit the growth runway for it next year and also leaves them open to Nvidia’s Blackwell launch around the same time),” Rasgon said earlier this week.
So far, though, Su has not been one to make projections lightly, and it appears that she has high confidence in AMD’s products and the company’s relationships with its customers. AMD is counting on being a big part of the growing AI boom, and management’s willingness to put a number on the opportunity should make investors feel better about the company’s ability to capitalize on the new market.