NVIDIA Corp (NASDAQ:NVDA) is set to release its first-quarter earnings, and the company is already making waves in the AI sector. The tech giant’s AI infrastructure is in high demand, as evidenced by the increased capital expenditures of its major clients.
What Happened: Mega-cap tech companies are making substantial investments in Nvidia’s AI technology. According to an analysis from Business Insider, Nvidia’s H100 GPU chip, priced at over $40,000, plays a crucial role in powering AI products like OpenAI’s ChatGPT and Anthropic’s Claude.
Notably, Elon Musk announced on Tesla’s earnings call that the company will more than double its fleet of H100 GPUs by the end of the year. This expansion will further enhance Tesla’s Full Self-Driving software.
“We’ve installed and commissioned, meaning they’re actually working, 35,000 H100 computers or GPUs,” Musk said last month. “Roughly 35,000 H100s are active, and we expect that to be probably 85,000 or thereabouts by the end of this year.”
Meta Platforms has also increased its capex forecast for 2024, citing “infrastructure investments to support our AI roadmap.” The company has already purchased 850,000 H100 GPUs from Nvidia, with a retail value of approximately $30 billion.
Microsoft and Alphabet have similar plans, with Microsoft aiming to amass 1.8 million GPUs by the end of 2024, according to an internal document. Alphabet’s first-quarter capex doubled from the prior year, driven primarily by investment in technical infrastructure.
Amazon didn’t provide specifics about its capital expenditure plans, but it did mention an anticipated increase in spending. “We anticipate our overall capital expenditures to meaningfully increase year-over-year in 2024, primarily driven by higher infrastructure capex to support growth in AWS, including generative AI,” said Amazon CFO Brian Olsavsky.
In total, Microsoft, Alphabet, Meta, and Amazon are projected to allocate $205 billion toward capital expenditures this year, a 40% rise from 2023, according to UBS. A significant portion of these funds is expected to be directed toward Nvidia for its H100 and Blackwell AI chips.
Recent earnings reports from Nvidia’s competitor, Advanced Micro Devices, Inc (NASDAQ:AMD), indicate that Nvidia is capturing the majority of this market, far outpacing its rivals. AMD projected that its MI300 AI chip would generate around $4 billion in revenue for 2024, significantly less than Nvidia’s anticipated revenue of over $100 billion this year.
In a similar vein, Intel Corp (NASDAQ:INTC) has introduced its Gaudi 3 AI chip as a competitor to Nvidia, but it forecasts sales of only $500 million for this year.
Investors will need to wait until after the market closes on May 22 to find out Nvidia’s actual earnings results.
Why It Matters: Ahead of the earnings report, Nvidia’s stock has drawn contrasting opinions from analysts and Reddit users. While analysts maintain a bullish stance, predicting 31.5% potential upside, some Reddit users foresee a drop to $800.
Amid this, Nvidia has been enhancing its experimental ChatRTX chatbot by adding more AI models for RTX GPU owners. This update includes Google’s Gemma, ChatGLM3, and OpenAI’s CLIP model, expanding the chatbot’s capabilities.
Furthermore, Nvidia’s supplier SK Hynix is experiencing a surge in demand for its high-bandwidth memory chips, crucial for AI chipsets. The company’s HBM chips are already sold out for 2024 and are nearly sold out for 2025, reflecting the booming demand for AI services.
Despite these positive developments, Nvidia’s stock has been on a roller-coaster ride through early 2024: a 96% rally peaked at $974 and was followed by a 13% pullback. The company is currently in a consolidation phase, with its stock price oscillating between $756 and $974.
Engineered by Benzinga Neuro, Edited by Kaustubh Bagalkote