We previously covered Advanced Micro Devices, Inc. (NASDAQ:AMD) here in a post-FQ2’22-earnings article in August 2022. We discussed its aggressive R&D efforts then, which triggered the company’s x86 market share gains. These efforts should ensure the company’s continued leadership in the highly competitive semiconductor market, even though they may impact its short-term profitability.
For this article, we will focus on AMD’s prospects in the data center and AI applications end-market, which may trigger massive tailwinds in its top/bottom-line growth and stock recovery in the short term. This is partly attributed to ChatGPT’s success thus far, which justifies the massive cloud and AI investments by multiple hyperscalers, such as Meta (META), Alphabet (GOOG)(GOOGL), and Microsoft (MSFT).
The Data Center/AI Investment Thesis Remains Robust
The world’s leading foundry, Taiwan Semiconductor Manufacturing Company Limited (NYSE:TSM), reported that shipments of 5-nm technology expanded tremendously by approximately 12.5% QoQ, accounting for 32% of total wafer revenues in FQ4’22. In addition, the HPC end segment grew by 10% QoQ in the latest quarter, against the 4% reported in FQ3’22, suggesting an acceleration in data center demand.
Now, why does this matter? It is no coincidence that AMD officially announced the availability of its 5-nm 4th Gen EPYC 9004 Series, also known as Genoa, in November 2022. We speculate that TSM had indeed been ramping up production ahead of the new data center chip’s launch. Investors may also be interested to know that the platform has been widely adopted by many partners, such as Dell Technologies (DELL), Google Cloud, Hewlett Packard Enterprise (HPE), Lenovo (OTCPK:LNVGY), Microsoft Azure, Oracle Cloud Infrastructure (ORCL), Supermicro, and VMware (VMW).
As a result, AMD may deliver a surprisingly excellent FQ4’22 earnings call by the end of January 2023, particularly attributed to the robust demand for data center products. In addition, the recent hype around ChatGPT may provide tremendous tailwinds to the company’s recovery, due to the launch of its next-generation chips for HPC and AI applications.
Data centers inherently play a critical role in the training of AI models, especially one as widely used as ChatGPT. They provide the physical infrastructure required to store and process the immense amounts of data and run the algorithms used for AI training, such as speech recognition, machine translation, and text generation. This allows the platform to understand, learn, and respond in a way similar to how a human would. The following is its response when asked, “tell me more about ChatGPT” (on the WebChatGPT platform):
(Screenshot: ChatGPT’s response)
Notably, the chips used in these settings must also deliver next-level performance and adaptive computing while remaining cost- and power-efficient. Lisa Su, CEO of AMD, said:
AI has been around for quite some time, but we’re at an inflection point, touching all of our technologies, from chips for consumer devices up to the largest chips we build for data centers. You need AI capability in every one of those devices. (Barrons)
The trend is also evident among many other players, such as Meta, which has guided robust capital expenditures of up to $39B in 2023, primarily attributed to servers and network infrastructure. Dave Wehner, CFO of Meta, said:
There is some increased capital intensity that comes with moving more of our infrastructure to AI. It requires more expensive servers and networking equipment, and we are building new data centers specifically equipped to support next-generation AI hardware… Second, we are making ongoing investments in our data center footprint… We believe the additional data center capacity will provide us greater flexibility with the types of servers we purchase and allow us to use them for longer, which we expect to generate greater cost efficiencies over time. (Seeking Alpha)
With Alphabet similarly scrambling to release AI prototypes and products, reportedly by mid-2023, it is not overly bullish to expect expanded investment in data centers, AI training, and language models in the short term. Interested investors may refer here for our previous coverage of Google’s LaMDA Vs. ChatGPT.
Naturally, this comes on top of its Google Cloud offering, which commands 11% of the global cloud provider market share as of Q3’22. Therefore, it is unsurprising that the company reported an elevated capital expenditure of $30.2B over the last twelve months [LTM], growing by 27.4% sequentially, primarily attributed to server infrastructure. C.C. Wei, CEO of TSM, has also guided the following:
We forecast the semiconductor cycle to bottom sometime in first half and see a recovery in second half 2023, … with the rebound boosted by new product launches such as artificial intelligence-enabled goods. (Reuters)
MSFT is also taking a large bite out of this intensely competitive space, with a reported $10B investment in OpenAI and a new version of the Bing search engine based on ChatGPT, to be launched by March 2023. Notably, the popular AI platform also runs on MSFT’s Azure, which commands 21% of the cloud market share in the latest quarter. The company also grew its capital expenditure by 13.2% sequentially, to $21.5B over the LTM.
Ultimately, AMD may be the biggest winner, supplying these companies with cutting-edge AI and data center chips, namely the AMD Instinct MI300 accelerators due by H2’23. The MI300 is designed for power-efficient AI training performance and HPC workloads, combining its CDNA 3 GPU, “Zen 4” CPU, and HBM chiplets.
It is unknown which chip platform OpenAI used to train its AI model, ChatGPT. However, in her recent keynote at CES 2023, Lisa Su guided that the newly launched MI300 will be able to train such language models more efficiently, in weeks rather than the months, thousands of GPUs, and millions of dollars of electricity required thus far. Furthermore, the CEO noted that the new chip will be able to support larger and more advanced AI applications, due to the tremendous 8x improvement in performance and 5x improvement in power efficiency compared to the previous AMD Instinct MI250X accelerators.
Lastly, we must also highlight that the world’s fastest supercomputer, Frontier, an HPE Cray EX system run by the US Department of Energy, relies on 3rd-Gen AMD EPYC™ CPUs and MI250X accelerators. With a benchmark performance of 1.102 EFLOP/s as of November 2022, the exascale supercomputer delivers an impressive 2.5x the speed of the second-ranked system, which is based on Fujitsu’s A64FX microprocessor.
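As a quick sanity check on the ~2.5x figure, the ratio can be computed from the published HPL benchmark scores. Note that the second-ranked system’s 442.01 PFLOP/s score is taken from the November 2022 TOP500 list, not from the article itself:

```python
# Frontier vs. the second-ranked system, by HPL benchmark score.
# Frontier's 1.102 EFLOP/s is quoted in the article; the 442.01 PFLOP/s
# figure for the runner-up comes from the Nov 2022 TOP500 list.
frontier_eflops = 1.102
runner_up_eflops = 0.44201  # 442.01 PFLOP/s expressed in EFLOP/s

speedup = frontier_eflops / runner_up_eflops
print(f"Frontier speedup: {speedup:.2f}x")  # ≈ 2.49x, i.e. roughly 2.5x
```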
Through these endeavors, we can surmise Lisa Su’s ambition to conquer the data center/AI market, significantly boosted by its recent acquisitions of Xilinx for adaptive computing solutions and Pensando for data center technology. Its focus on delivering cutting-edge technology is one of many reasons we continue to hold AMD as one of our core technology stocks.
So, Is AMD Stock A Buy, Sell, or Hold?
(Chart: AMD 1Y EV/Revenue and P/E Valuations)
AMD is currently trading at an EV/NTM Revenue of 4.70x and an NTM P/E of 22.36x, lower than its 3Y means of 7.00x and 38.28x, respectively. Consensus estimates are pessimistic as well, with a price target of $84.40 suggesting a minimal upside potential of 19.2% from current levels.
On the other hand, we are more bullish, since the PC demand destruction is only temporary. This is due to the GPU replacement cycle of roughly every five years and the PC CPU cycle of every six years. The next cycle will likely happen from H1’24 onwards, as a result of the hyper-pandemic demand for corporate and personal devices back in 2020. It may also coincide with a macroeconomic recovery, as the Fed’s recent minutes suggest a pivot from 2024 onwards, potentially lifting market sentiment then.
While it is uncertain if AMD may hit its hyper-pandemic height of $153.76 again, we reckon that the low $100s are not overly ambitious, based on the projected FY2024 EPS of $4.76 and current P/E valuations. Those numbers suggest an excellent upside potential of 49.7%, despite the recent 29.7% recovery from the October 2022 bottom.
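For readers who want to retrace the arithmetic, a minimal sketch follows. The implied target is simply the projected EPS held at today’s P/E multiple; note that the “current price” here is backed out from the stated upside figures rather than quoted independently, so treat it as an approximation:

```python
# Sanity check of the valuation arithmetic above.
# All inputs are figures quoted in the article; the current price is
# inferred from the stated consensus upside, not an independent quote.

fy2024_eps = 4.76         # projected FY2024 EPS
ntm_pe = 22.36            # current NTM P/E multiple
consensus_target = 84.40  # consensus price target
consensus_upside = 0.192  # stated upside to the consensus target

# Bull-case target: projected EPS at the current multiple ("low $100s").
implied_target = fy2024_eps * ntm_pe

# Current price implied by the consensus target and its stated upside.
implied_price = consensus_target / (1 + consensus_upside)

# Upside from that price to the EPS-based target (~50%, vs. 49.7% stated,
# the small gap being rounding in the quoted inputs).
implied_upside = implied_target / implied_price - 1

print(f"EPS-based target:      ${implied_target:.2f}")
print(f"Implied current price: ${implied_price:.2f}")
print(f"Implied upside:        {implied_upside:.1%}")
```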
Combined with its highly promising data center/AI offerings, we are re-rating the AMD stock as a Buy now.