Technology Update: NVIDIA Misses ‘Whisper Number’ But Beats Consensus Estimates!
By Daniel Morgan, Synovus Trust Senior Portfolio Manager
Synovus Trust Company, N.A.
Nvidia reported record third-quarter revenue of $35.1 billion for the quarter ended October 27, 2024 (3Q25), up 17% from the previous quarter and up 94% from a year ago. For the quarter, GAAP earnings per diluted share were $0.78, up 16% from the previous quarter and up 111% from a year ago.
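The growth rates above imply the prior-period figures. As a quick back-of-the-envelope check (a sketch using only the reported 3Q25 numbers; the derived values are rounded approximations, not NVIDIA's reported prior-period figures, and the function name is illustrative):

```python
def implied_prior(current: float, growth_pct: float) -> float:
    """Infer the prior-period value from the current value and % growth."""
    return current / (1 + growth_pct / 100)

revenue_q3 = 35.1  # $B, reported 3Q25 revenue
print(round(implied_prior(revenue_q3, 17), 1))  # implied prior quarter: ~30.0 ($B)
print(round(implied_prior(revenue_q3, 94), 1))  # implied year-ago quarter: ~18.1 ($B)

eps_q3 = 0.78  # reported GAAP diluted EPS
print(round(implied_prior(eps_q3, 111), 2))     # implied year-ago EPS: ~0.37
```

The implied year-ago revenue of roughly $18 billion underscores how steep the 94% year-over-year ramp has been.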
“The age of AI is in full steam, propelling a global shift to NVIDIA computing,” said Jensen Huang, founder and CEO of NVIDIA, in a press release. “Demand for Hopper and anticipation for Blackwell — in full production — are incredible as foundation model makers scale pretraining, post-training and inference.” NVIDIA’s outlook for the upcoming fourth quarter of fiscal 2025 (4Q25) calls for revenue of $37.5 billion, plus or minus 2%. That was slightly above the consensus revenue estimate of $37.0 billion, but many analysts were looking for NVIDIA to guide 4Q25 revenue above the $39 billion “whisper number.”
During the third quarter, the company announced the availability of NVIDIA Hopper H200-powered instances in several cloud services, including AWS, CoreWeave and Microsoft Azure, with Google Cloud and Oracle Cloud Infrastructure coming soon. NVIDIA also continues to build out its sovereign AI business: cloud leaders in India, Japan and Indonesia are building AI infrastructure with NVIDIA-accelerated computing, while consulting leaders are helping speed AI adoption across industries with NVIDIA AI Enterprise software. On a cautionary note, management commented that “both Hopper and Blackwell systems have certain supply constraints, and the demand for Blackwell is expected to exceed supply for several quarters in fiscal 2026.” In other words, near-term revenue and profit growth are constrained by supply, not by a lack of demand. The report led to NVIDIA shares see-sawing in trading as investors digested the results.
Putting aside the ebbs and flows of investors’ near-term perceptions of the 3Q25 results, the future for NVIDIA looks bright. The top tech players (Amazon, Google, Meta, Microsoft) are projected to spend collectively up to $185 billion on AI CapEx in 2024, and a large portion of that spending will be on AI in the datacenter. Hyperscaler CapEx, a large percentage of which is dedicated to compute, is set to rise further in 2024 and 2025. As more generative AI applications are rolled out and the need for accelerated compute expands, hyperscalers will be forced to continuously deploy new compute stacks and expand their data centers. Enterprise and consumer internet companies are also driving demand. The data center supply chain continues to ramp very rapidly, and NVIDIA has been extremely responsive to the surging demand, growing its datacenter business nearly fourfold over the past four quarters. NVIDIA’s Data Center segment now generates more revenue than Intel’s and Advanced Micro Devices’ datacenter units combined! NVIDIA’s datacenter business is expected to more than double its revenue (+132% year over year (YoY)) in FY2025 to $110.7 billion, from $47.5 billion in FY2024, with AI as the major catalyst!
The competition is still trying to play catch-up! NVIDIA’s management does not seem overly concerned about competition given the company’s strong performance lead. NVIDIA today accounts for more than 70% of AI semiconductor sales, even as Google (TPU v6), Amazon (Trainium 3/Inferentia), Microsoft (Cobalt 100) and Meta (MTIA v1) all produce their own AI chips. NVIDIA continues to dominate performance benchmarks, and the company’s ecosystem advantage would be hard to match. Management recently highlighted that NVIDIA GPUs already co-exist with custom silicon solutions such as TPUs. While Advanced Micro Devices’ upcoming MI325X looks promising for AI inferencing workloads due to its large amount of integrated memory (192GB), NVIDIA management pointed out that the company’s Grace Hopper chip supports nearly 600GB of memory. The GPU-only version of the MI325X is designed to go toe to toe with NVIDIA’s H100 Hopper AI accelerator and is optimized for large language models and generative AI. Further, Advanced Micro Devices is working directly with Microsoft to develop additional AI chips under the code name Athena. Broadcom’s Jericho3-AI is designed to connect supercomputers and features a high-performance fabric for AI environments. Marvell is expected to be a major beneficiary of its cloud customers’ aggressive spending on generative AI and is on track to double its AI-based revenue to $400 million in FY24, driven by strong demand for Marvell’s 800G PAM4 DSP chipsets and 400ZR DCI solutions.
NVIDIA’s product roadmap looks strong as the company migrates from the Hopper to the new Blackwell family of AI chips. The recently announced Blackwell series will include the B100, B200 and GB200, a substantial upgrade from the H100/H200. NVIDIA remains on track to ship its next-generation Blackwell GPU platform in high-volume production in 4Q25 as yields continue to improve, and the Blackwell (B200) will command a significant price premium over earlier chips. Analysts have been concerned about product transitions, but Hopper and Blackwell chips are software compatible and backward compatible, making upgrades less disruptive. In addition, each new generation brings both higher performance and better power efficiency, making it economically attractive to keep migrating to the latest technology. Any new Blackwell orders placed now will not ship until late next year, as production is booked out roughly 12 months, which keeps Hopper demand strong through the year.
Stepping away from NVIDIA for a moment, the overall semiconductor market continues to be fairly valued, as measured by the Philadelphia Semiconductor Index’s (SOX) current versus historical price/book ratio. The price/book ratio (as opposed to sales or earnings growth metrics) is a better measure of the true value of semiconductor companies. First, a clear indicator that the SOX, or chip sector, is vastly overvalued and poised for a correction is a P/B ratio trading in the 8x range. In both instances when the P/B traded with an 8x handle — the summer of 2000 (peak P/B 8.8x) and April of 2021 (peak P/B 8.2x) — the SOX Index eventually dropped precipitously. Second, during a mild recession or minor disruption, the P/B ratio for the SOX will trade as low as 3.5x to 4x, as it did during minor setbacks like the China/U.S. trade sanctions (P/B fell to 3.5x) and the COVID-19 pandemic (P/B fell to 4x). So, with the current P/B ratio of the SOX Index at 5.3x, what conclusions can we draw? My gut feeling is that we are in the beginning stages of a broad-based rebound in the chip space, led by AI, PC computing and smartphones, with auto and industrial lagging. The P/B ratio is nowhere close to previous periods of “irrational enthusiasm” like the pre-dot-com-bust peak in the summer of 2000 (8.8x) and the COVID-19 supply-shock peak in April of 2021 (8.2x)!
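The P/B framework above can be sketched as a simple rule of thumb. The thresholds (an 8x handle as bubble territory, 3.5x to 4x as recession-level lows) are the historical SOX observations cited above; the function name and regime labels are illustrative, not a standard industry classification:

```python
def sox_pb_regime(pb: float) -> str:
    """Classify a SOX price/book reading against the article's historical benchmarks."""
    if pb >= 8.0:
        return "overvalued / correction risk"  # e.g., 8.8x in summer 2000, 8.2x in April 2021
    if pb <= 4.0:
        return "recession-level trough"        # e.g., 3.5x trade sanctions, 4x COVID-19
    return "mid-range"

print(sox_pb_regime(5.3))  # the current reading cited above -> "mid-range"
print(sox_pb_regime(8.8))  # summer 2000 peak -> "overvalued / correction risk"
```

At 5.3x, the current reading sits in the middle band, well below both prior peaks, which is the basis for the rebound thesis.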
Important disclosure information
Asset allocation and diversification do not ensure against loss. This content is general in nature and does not constitute legal, tax, accounting, financial or investment advice. You are encouraged to consult with competent legal, tax, accounting, financial or investment professionals based on your specific circumstances. We do not make any warranties as to accuracy or completeness of this information, do not endorse any third-party companies, products, or services described here, and take no liability for your use of this information.