
The Ultimate Artificial Intelligence (AI) Stock to Buy Hand Over Fist Right Now (Hint: It’s Not Nvidia)

This chip stock is not only cheaper than Nvidia, but is also set to grow at a faster rate than the AI pioneer.

The proliferation of artificial intelligence (AI) has proven to be a huge catalyst for the semiconductor industry. That’s not surprising: training AI models and moving them into production is only possible with chips deployed in large numbers in data centers.

That is why Nvidia (NVDA -0.59%) has witnessed a massive increase in demand for its graphics processing units (GPUs). The ability of GPUs to process huge amounts of data simultaneously has made them the default choice for cloud service providers to train large language models (LLMs). This explains why Nvidia’s data center business has grown at a phenomenal rate in recent quarters and is expected to grow even larger in the future.

However, GPUs aren’t the only type of chip that’s in high demand due to the rise in AI adoption. Micron Technology (MU -0.45%), a memory chip maker, has profited greatly from the growing adoption of AI. This article takes a closer look at how AI has driven Micron’s business and examines why the memory specialist looks like the better AI stock to buy right now compared to Nvidia.

Micron Technology is benefiting from the adoption of AI in several areas

Micron Technology released its fiscal fourth quarter 2024 results (for the three months ended August 29) on September 25. The company’s revenue rose 93% year over year to $7.75 billion, beating the consensus estimate of $7.65 billion. Additionally, Micron posted a profit of $1.18 per share in the latest quarter, compared to a loss of $1.07 per share in the same period last year. Analysts were expecting the company to post $1.11 per share in earnings.

It was no surprise to see Micron smashing Wall Street estimates, as memory market dynamics have improved thanks to artificial intelligence. For example, demand for Micron’s data center memory chips is outstripping supply, and that’s not surprising since these chips are used by the likes of Nvidia in their GPUs. More specifically, AI-centric GPUs are equipped with high-bandwidth memory (HBM) chips due to their ability to quickly process huge amounts of data.

That’s why Micron expects the HBM market to generate $25 billion in annual revenue in 2025, up from just $4 billion last year. The company also said that it sold “several hundred million dollars” worth of HBM last year and has already sold out its HBM capacity for next year.

Meanwhile, the adoption of AI is also driving improved demand for solid-state drives (SSDs) used in data centers. Micron’s data center SSD revenue tripled in fiscal 2024. It won’t be surprising to see Micron maintain impressive growth in this segment going forward, as the deployment of AI servers is expected to drive average annual growth of 60% in data center SSD demand in the coming years, according to market research firm TrendForce.

However, that’s not where Micron’s AI-related catalysts end. The adoption of AI-enabled PCs will be another solid growth driver for the company, leading to impressive volume growth in both memory and storage chips. On the most recent earnings conference call, Micron CEO Sanjay Mehrotra noted:

AI PCs require more memory and storage capacity. For example, major PC manufacturers have recently announced AI-enabled PCs with a minimum of 16GB of DRAM for the value segment and between 32GB and 64GB for the midrange and premium segments, versus an average content across all PCs of around 12GB last year.

A similar scenario is playing out in the smartphone market, where Micron points out that Android original equipment manufacturers (OEMs) are launching AI smartphones with 12 gigabytes (GB) to 16 GB of dynamic random access memory (DRAM), up from the average capacity of 8 GB seen in 2023 flagship smartphones.

IDC estimates that the generative AI smartphone market could grow at an annual rate of 78% through 2028. Meanwhile, Gartner estimates that AI PC shipments could grow by 165% next year. The staggering growth in these end markets presents a bright multi-year opportunity for Micron to grow its sales and earnings, thanks to the increased memory consumption of AI-enabled high-end devices such as smartphones and PCs.

All of this points to Micron being a more diversified AI stock than Nvidia, as it stands to gain from the adoption of this technology in more areas beyond data centers. Even better, Micron’s guidance suggests it’s on track to grow faster than its illustrious peer.

Micron’s guidance and valuation make it a top AI stock to buy right now

Micron is coming off a tremendous fiscal quarter with remarkable growth in its top and bottom lines. The company expects the momentum to continue in the first quarter of fiscal 2025: it has guided for revenue of $8.7 billion in the current quarter, along with adjusted earnings of $1.74 per share, at the midpoint of its guidance ranges.

That top-line guidance points to 85% year-over-year growth, while the bottom line would be a significant improvement over the year-ago period’s non-GAAP loss of $0.95 per share. By comparison, Nvidia expects its top line to grow 80% year over year in the current quarter. That was one of the reasons investors hit the panic button after its latest results, as the chipmaker had consistently posted triple-digit growth in recent quarters.

Additionally, Micron’s valuation means investors looking to add an AI stock to their portfolios right now would do well to buy it over Nvidia. Micron’s price-to-earnings ratio of 11 is significantly lower than Nvidia’s forward earnings multiple of 44, making the former an attractive investment right now given its stunning growth.

Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Nvidia. The Motley Fool recommends Gartner. The Motley Fool has a disclosure policy.
