From AI Factories to GaN Powerhouses: Why Nvidia, Navitas & Lam Are Shaping the Next Semiconductor Wave
Welcome, AI & Semiconductor Investors,
Nvidia’s Ian Buck laid out how Blackwell’s 20 petaflops of FP4 inference performance will supercharge reasoning-based models worldwide. Meanwhile, Navitas set the stage for 800V GaN/SiC data center infrastructure, and Lam’s CFO mapped a $40 billion NAND upgrade opportunity while steering gross margins toward 50%. — Let’s Chip In
What The Chip Happened?
🚀 Nvidia’s AI Factory Frenzy: Buck Envisions a Global GPU Boom
⚡ Navitas Supercharges Data Centers with GaN and SiC
🔧 NAND Overhauls and Nearly 50% Margins: Lam’s CFO Talks Big Opportunities
[UiPath Delivers Mixed Growth Amid Agentic Automation Launch]
Read time: 7 minutes
Get 15% OFF Finchat — MY FAVORITE STOCK MARKET DATA PLATFORM — Transcripts of these events are available there.
Nvidia (NASDAQ: NVDA)
🚀 Nvidia’s AI Factory Frenzy: Buck Envisions a Global GPU Boom
What The Chip: On June 4, 2025, Nvidia’s Head of Accelerated Computing, Ian Buck, spoke at Bank of America’s Global Technology Conference. He discussed the explosion of generative AI demand, highlighted new “AI factories” worldwide, and explained how large, reasoning-based models are driving unprecedented growth in data center GPUs.
The Situation Explained:
🤖 Reasoning Models Steal the Show: Buck cited DeepSeek as an inflection point for AI. Large language models now “think out loud,” generating far more tokens and spiking demand for compute. Buck noted, “We’re seeing an explosion in the number of tokens generated…13x more tokens.”
🚀 Blackwell GPU Advances: Nvidia’s upcoming Blackwell GPU offers 20 petaflops of FP4 performance per chip, drastically cutting inference costs through advanced numerical precision (a rough sketch of that precision math follows this list). Buck emphasized that “Nvidia thrives at things that are hard,” underscoring the company’s engineering edge.
🌐 Sovereign AI Factories: Governments globally are funding national AI supercomputers (so-called “AI factories”), each valued at around $1 billion. Buck mentioned 100 such AI factories are in progress, reflecting how countries view AI as strategic infrastructure.
💧 Data Center Overhaul: Multi-GPU, multi-node inference demands massive power and cooling investments. Liquid-cooled systems allow GPUs to cluster closely. Buck explained that achieving strong returns requires “bringing them all together in one small space” for fast communication.
📉 Competition & ASICs: While hyperscalers are building custom ASICs, Buck said the sheer variety of AI workloads favors Nvidia’s flexible GPU platform. Still, he acknowledged competition is real, pointing out that each cloud provider is “defining its niche” to optimize costs.
💻 Continuous Model Upgrades: The AI research community constantly retrains large models to add knowledge and refine reasoning. Each iteration boosts token usage—and compute needs—driving strong ongoing demand for GPU hardware.
⚖️ Enterprise Uptake: Buck highlighted how enterprises require flexible hardware able to handle both current and future AI breakthroughs. He stated, “AI factories are going up in the right curve… People want their $1 billion CapEx to keep paying off for 5 years.”
🏭 AI & HPC Convergence: Nvidia also sees synergy in HPC (high-performance computing) and AI. Buck referenced the new NERSC supercomputer partnership, merging simulation with AI techniques for scientific discoveries and pharma research.
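To make the FP4 point above concrete, here is a minimal sketch in Python, assuming a hypothetical 70-billion-parameter model (that size, and the weights-only framing, are my assumptions for illustration, not figures from the talk): lower-precision weights shrink the memory a model occupies, which in turn reduces how many GPUs an inference deployment needs.

```python
# Rough, back-of-the-envelope sketch (assumption: a hypothetical 70B-parameter
# model; illustrative numbers only, not Nvidia figures) of why lower numerical
# precision such as FP4 cuts inference cost: weight memory shrinks roughly in
# proportion to bits per parameter.

PARAMS = 70e9  # hypothetical parameter count

BITS_PER_PARAM = {"FP16": 16, "FP8": 8, "FP4": 4}

for fmt, bits in BITS_PER_PARAM.items():
    gigabytes = PARAMS * bits / 8 / 1e9  # bits -> bytes -> GB (decimal)
    print(f"{fmt}: ~{gigabytes:.0f} GB of weights")

# Expected output:
#   FP16: ~140 GB of weights
#   FP8: ~70 GB of weights
#   FP4: ~35 GB of weights
# Fewer bytes per parameter means the model fits on fewer GPUs (or serves more
# concurrent requests per GPU), which is the cost lever FP4 targets.
```

Real deployments also hinge on activation memory, KV-cache size, and accuracy-recovery techniques, so treat this purely as directional arithmetic rather than a statement of Blackwell’s actual serving economics.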
Why AI/Semiconductor Investors Should Care: Nvidia’s conference remarks show how AI’s shift toward reasoning-based models is fueling exponential token growth, pushing data centers to expand GPU deployments. Large enterprises, supercomputing labs, and entire nations are racing to build AI factories—each representing significant capital expenditure. For investors, this underscores continued demand for advanced compute solutions but also highlights looming challenges such as fierce competition, rising power requirements, and escalating infrastructure costs.
Get 15% OFF Finchat — MY FAVORITE STOCK MARKET DATA PLATFORM — Transcripts of these events are available there.
Navitas Semiconductor Corporation (NASDAQ: NVTS)
🔋 Navitas Supercharges Data Centers with GaN and SiC
What The Chip: Navitas just presented at Baird’s 2025 Global Consumer, Technology & Services Conference and highlighted a massive opportunity for its gallium nitride (GaN) and silicon carbide (SiC) power chips. On June 3, CEO Gene Sheridan spoke about how new AI-driven data centers are driving high-voltage, high-efficiency demands—right in Navitas’ sweet spot.
The Situation Explained:
⚡ GaN + SiC = Future of Power: Sheridan explained, “We’re the only company on the planet that has GaN and SiC, without the distraction of legacy silicon.” GaN handles fast switching at lower voltages; SiC covers ultra-high-voltage applications, like connecting directly to power grids.
♻️ AI Data Center Demand: Heavy-duty AI chips—like NVIDIA’s Hopper, Blackwell, and future Rubin processors—draw enormous amounts of power. Navitas aims to replace older silicon-based components with GaN and SiC solutions that slash power losses and minimize heat.
🔌 Moving to 800 Volts: NVIDIA’s push for an 800V data center platform (instead of 48V or 12V) means less current is needed to power racks. That translates to far lower transmission losses—and far more GaN/SiC chips per server rack (a rough worked example follows this list).
📈 Early Success, Bigger Pipeline: Navitas has 70+ customer projects in the data center space, with more to ramp in 2026–2027. They’ve already announced a collaboration with NVIDIA—an important signal for future design wins with hyperscalers and server OEMs.
🤝 Cross-Licensing with Infineon: In a strategic move, Navitas cross-licensed GaN patents with Infineon. “We’re proactively setting up common specs to make dual sourcing easy,” Sheridan said. This could expand adoption across the board as big customers want multiple suppliers.
📉 Inventory & Market Headwinds: While mobile GaN chargers ramped quickly, Navitas saw headwinds in 2023 from weaker EV, solar, and industrial segments. However, management expects a strong rebound in late 2025 into 2026 as inventories rebalance.
⚙️ Potential for Modules: Navitas sells controllers (the “brains”) as well as GaN/SiC transistors (the “muscle”). Bundling both into modules could boost revenue per system and simplify design for power supply manufacturers.
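To put rough numbers on the 800V point above, here is a minimal worked example, assuming a hypothetical 120 kW rack and identical conductor resistance R in both cases (both assumptions are mine, not Navitas or NVIDIA figures): for a fixed power draw, current falls inversely with voltage, and resistive loss falls with the square of the current.

```latex
% Illustration only: the 120 kW rack and equal conductor resistance R are assumptions.
P = VI \;\Rightarrow\; I = \frac{P}{V}, \qquad P_{\mathrm{loss}} = I^{2}R
I_{48\,\mathrm{V}} = \frac{120\,000\,\mathrm{W}}{48\,\mathrm{V}} = 2500\,\mathrm{A}, \qquad
I_{800\,\mathrm{V}} = \frac{120\,000\,\mathrm{W}}{800\,\mathrm{V}} = 150\,\mathrm{A}
\frac{P_{\mathrm{loss},\,48\,\mathrm{V}}}{P_{\mathrm{loss},\,800\,\mathrm{V}}} = \left(\frac{2500}{150}\right)^{2} \approx 278
```

In practice the savings are smaller once conversion stages and cable gauges are re-optimized, but this square-law relationship is why distribution voltages are moving up and why each rack then needs more high-voltage GaN/SiC conversion silicon.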
Why AI/Semiconductor Investors Should Care: Massive AI workloads need super-efficient power solutions. Data centers are shifting from silicon to GaN and SiC to reduce wasted electricity and shrink server footprints. Navitas is well-positioned with dual GaN/SiC technology at a time when the whole industry demands higher voltages and lower losses. If management executes on design wins with hyperscalers, it could mean significant growth—and a much bigger slice of the AI hardware market.
Lam Research Corporation (NASDAQ: LRCX)
🔧 NAND Overhauls and Nearly 50% Margins: Lam’s CFO Talks Big Opportunities
What The Chip: On June 3, 2025, Lam’s CFO Douglas Bettinger spoke at Bank of America’s Global Technology Conference about the company’s plans for a multi-year NAND upgrade cycle, improving margins, and robust DRAM demand fueled by AI. He highlighted how “close to customer” manufacturing and advanced product lines like Akara and Halo bolster Lam’s competitive edge.
The Situation Explained:
🔎 NAND Upgrade Boom: Bettinger cited a $40 billion industry-wide spending opportunity over the next few years as NAND transitions past 200 layers to support QLC (quad-level cell) and enterprise SSDs. “We’re only in the first inning,” he said about these high-value upgrades.
💹 Margin Expansion: Gross margin climbed from 46% in late 2022 to 49.5% in recent guidance. Bettinger attributed the jump to a stronger product lineup and Lam’s “close to customer” strategy, which lowers logistical costs and increases operational efficiency.
⚙️ DRAM Demand Rises: High Bandwidth Memory (HBM) for AI accelerators drives DRAM spending. Lam owns critical “drill and fill” steps in Through-Silicon Via (TSV) processes, a key enabler for stacking more memory die to handle data-intensive tasks.
🏭 China Factor: While China’s share of Lam’s revenue dipped this year to around 30%, Bettinger believes the region’s customer set will eventually blend into the broader global market, stating, “Five years from now, we may not be having this conversation.”
🏷️ Tariff & Trade Watch: U.S. export restrictions and new tariffs (including Section 232 steel and aluminum measures) add near-term cost uncertainty. However, Lam expects stable operations once the rules are fully clarified.
🚀 Services & Recurring Revenue: Lam’s large installed base generates more revenue over a tool’s lifetime—through spare parts, upgrades, and support—than the initial tool sale itself. This stabilizes free cash flow, which Bettinger noted remains a strategic strength.
💰 Financial Firepower: The company targets returning at least 85% of free cash flow to investors and raises dividends annually. Bettinger underscored that even when core markets slowed in 2023, Lam posted record free cash flow.
Why AI/Semiconductor Investors Should Care: Lam’s advanced etch and deposition technologies underpin the push toward 3D device architectures and higher memory densities, essential for AI’s rising compute demand. As NAND migrates beyond 200 layers, and DRAM stacks get taller with HBM, Lam’s tools become more critical, fueling sustainable revenue and margin expansion. For investors, Lam’s combination of product leadership, growing free cash flow, and shareholder returns positions the company favorably in an increasingly AI-driven semiconductor market.
Youtube Channel - Jose Najarro Stocks
X Account - @_Josenajarro
Get 15% OFF Finchat — MY FAVORITE STOCK MARKET DATA PLATFORM
Disclaimer: This article is intended for educational and informational purposes only and should not be construed as investment advice. Always conduct your own research and consult with a qualified financial advisor before making any investment decisions.
The overview above provides key insights every investor should know, but subscribing to the premium tier unlocks deeper analysis to support your Semiconductor, AI, and Software journey. Behind the paywall, you’ll gain access to in-depth breakdowns of earnings reports, keynotes, and investor conferences across semiconductor, AI, and software companies. With multiple deep dives published weekly, it’s the ultimate resource for staying ahead in the market. Support the newsletter and elevate your investing expertise—subscribe today!
[Paid Subscribers] UiPath Delivers Mixed Growth Amid Agentic Automation Launch
Date of Event: Thursday, May 29, 2025
Executive Summary
*Reminder: We do not discuss valuations, just an analysis of the earnings/conferences
UiPath, Inc. (NYSE: PATH), a global leader in robotic process automation (RPA) and agentic automation, reported its first-quarter fiscal 2026 results on May 29, 2025. The company posted revenue of $357 million, an increase of 6% year-over-year, while Annualized Renewal Run-rate (ARR) reached $1.693 billion, up 12% compared to the same period last year. According to UiPath Founder and Chief Executive Officer Daniel Dines, this quarter was a milestone in product evolution, driven by the rollout of UiPath’s next-generation agentic automation platform.
Key financial metrics underscored progress and ongoing challenges. Net new ARR for the quarter was $27 million—one of its weaker first-quarter intakes—and the dollar-based net retention rate slid to 108% from the 119–121% range observed two years ago. Meanwhile, operating efficiency improved as non-GAAP operating income stood at $70 million, up from $47 million in the same period last year, helping to narrow the company’s GAAP operating loss to $16 million.
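For readers less familiar with the metric, here is a hedged sketch of how dollar-based net retention is typically computed (the $100 million starting cohort below is an invented figure for illustration, not a UiPath disclosure, and UiPath’s exact definition may differ in detail):

```latex
% Illustration only: the $100M starting cohort is a made-up figure, not UiPath data.
\mathrm{DBNRR} = \frac{\text{ARR today from customers who were already customers 12 months ago}}{\text{ARR from that same cohort 12 months ago}}
\text{Example: } \frac{\$108\,\mathrm{M}}{\$100\,\mathrm{M}} = 108\%
```

Read this way, 108% still means the existing customer base is expanding net of churn and downgrades, just far more slowly than the roughly 19–21% net expansion implied two years ago.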
Notable management comments focused on the new agentic automation offerings and the shift toward more advanced, AI-powered workflows. “We’re encouraged by the early response from customers, partners, and the broader ecosystem,” stated Daniel Dines, highlighting that the company sees agentic automation as the next phase of enterprise-scale AI and automation capabilities. Chief Operating Officer and Chief Financial Officer Ashim Gupta reinforced that UiPath remains dedicated to maintaining “operational discipline to drive sustainable growth and profitability” through fiscal 2026.
Growth Opportunities
One central theme emerging from UiPath’s earnings call and shareholder materials is its focus on scaling the recently launched agentic automation platform. This platform, which integrates advanced AI agents, RPA robots, and human workflows, is designed to unify automation capabilities under a single enterprise-grade system called Maestro. Management noted that early customer trials involve thousands of AI agent prototypes and more than 250,000 cumulative agent runs.