What The Chip Happened?

šŸ¤” Could This Chinese AI Lab Shake Up Nvidia and the AI Market?

Jose Najarro
Jan 25, 2025 āˆ™ Paid

Welcome, AI & Semiconductor Investors,
Could a low-cost, cutting-edge language model out of China rewrite the rules for AI development—and challenge Nvidia’s dominance? DeepSeek’s surprising efficiency and distillation techniques have investors speculating whether this marks a new chapter in AI innovation.

Also, discover how OpenAI’s new Operator could revolutionize online interactions and why SK Hynix’s record-breaking AI memory revenue cements its role in the AI infrastructure boom. Let’s dive in!

What The Chip Happened?

šŸ¤” Could This Chinese AI Lab Shake Up Nvidia and the AI Market?
šŸ¤– Operator Launch: Browser-Based AI Agent Powers Everyday Tasks
šŸ† SK Hynix: Sustaining the AI Memory Boom!

Read time: 7 minutes


DeepSeek (Private) Nvidia (NASDAQ: NVDA)
šŸ¤” Could This Chinese AI Lab Shake Up Nvidia and the AI Market?

[Header image: a futuristic Chinese AI lab with a glowing whale logo, with Nvidia-branded servers dimmed in the background to suggest a competitive challenge.]

What The Chip: DeepSeek, a little-known AI lab out of China, claims it built cutting-edge large language models at a fraction of the usual cost—potentially upending the AI arms race. Investors are now asking whether these surprising developments spell trouble for companies like Nvidia and the tech giants betting billions on AI.

Details:

šŸ¤” Surprising Efficiency: DeepSeek’s recent open-source language model reportedly took only about two months and $6 million to build, using Nvidia’s export-compliant H800 chips rather than the top-shelf H100. That raised eyebrows because other industry players are spending far more on similar projects.

šŸ”‘ Analyst’s Take: According to Dylan Patel, Lead Analyst at SemiAnalysis, DeepSeek actually has access to over 50,000 Hopper GPUs, contradicting the notion that it is running everything on meager resources. Patel says people should ā€œstop acting like they only have that 10k A100 cluster.ā€

šŸ’” Distillation Magic: Experts like Benchmark’s Chetan Puttagunta point to model distillation, in which a large ā€œteacherā€ model’s outputs are used to train a smaller ā€œstudent,ā€ as a cost-cutting secret (see the short sketch after this list). This could make AI more accessible and potentially disruptive to the established players.

šŸ‡ŗšŸ‡ø American Competition: Meanwhile, U.S. giants like Microsoft, Meta, and others continue to invest heavily in AI, and the newly announced ā€œStargateā€ initiative alone is reported to target up to $500 billion in AI infrastructure.

šŸ“‰ Potential Nvidia Headwinds?: If DeepSeek’s approach is both real and replicable, it may reduce the need for the most expensive GPUs, potentially dampening demand for Nvidia’s flagship products. However, skeptics wonder if these claims are overhyped. The flip side is that cheaper models could accelerate AI adoption, driving even more demand for AI compute.

ā“ Unanswered Questions: Skepticism abounds about how DeepSeek’s approach remains under the radar. Many wonder why no engineer at Google, Microsoft, Meta, or other AI powerhouses came forward with a similar low-cost solution.

šŸ’° Earnings Watch: Upcoming earnings calls for AI-heavy players like Microsoft, Meta, and Tesla could spark tough questions: ā€œAre we overspending on AI infrastructure, or is DeepSeek an outlier?ā€
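
For readers curious what ā€œusing large models to train smaller onesā€ looks like mechanically, below is a minimal, purely illustrative PyTorch sketch of knowledge distillation: a frozen ā€œteacherā€ produces softened output distributions that a much smaller ā€œstudentā€ learns to imitate alongside the usual hard labels. The model sizes, temperature, and loss weighting are assumptions for illustration, not details of DeepSeek’s actual pipeline.

```python
# Minimal knowledge-distillation sketch (PyTorch). Illustrative only --
# model sizes and hyperparameters are arbitrary assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))  # "large" model
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))    # "small" model
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

temperature, alpha = 2.0, 0.5  # soften logits; blend soft and hard losses

def distill_step(x, hard_labels):
    with torch.no_grad():              # the teacher is frozen
        teacher_logits = teacher(x)
    student_logits = student(x)

    # Soft targets: the student matches the teacher's softened distribution.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Hard targets: ordinary cross-entropy on ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, hard_labels)

    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: one training step on random data.
x = torch.randn(32, 128)
y = torch.randint(0, 10, (32,))
print(distill_step(x, y))
```

The appeal is that the student can be far cheaper to train and serve than the teacher while retaining much of its behavior, which is exactly why distillation gets cited as a cost-cutting lever.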

Why AI/Semiconductor Investors Should Care: If DeepSeek’s purported breakthroughs are genuine, they could reshape how—and how much—companies spend on AI research and development. At the same time, the hype may fade if deeper analysis shows these claims aren’t as groundbreaking as advertised. Either way, the conversation around cost-effective AI is ramping up, and investors would be wise to keep their eyes on both technical evidence and big tech’s next moves.


Moore Semiconductor Investing
šŸ“— [NEW!!] Unlock Q4 Semiconductor Earnings --- 60% OFF
(NEW EARNINGS)

What The Chip: Get a front-row seat to the financials shaping the semiconductor industry. This continuously updated e-book by Jose Najarro distills the latest Q4 quarterly insights—from wafer production trends to AI chip breakthroughs—into a single comprehensive resource.

Details:

šŸ”µ Dynamic Updates: Start with giants like TSMC and ASML, then expand to 30+ companies as their Q4 2024 earnings roll in. Earnings are restarting!!

šŸ”µ More Than Half Off: For a limited time, the e-book is discounted from $49.07 to $19.99, offering a robust market guide at a significant discount.

šŸ”µ Broad Coverage: From traditional chipmakers to cutting-edge AI semiconductor players, get the full picture as it emerges.

Why AI/Semiconductor Investors Should Care: This evolving earnings handbook gives you a strategic edge. Understanding quarterly earnings data is crucial for gauging industry health, discovering new growth leaders, and aligning your investment approach with emerging technological waves.

Disclaimer: For educational and informational purposes only. Not financial advice. Consult a qualified professional before making any investment decisions. Updates cover the current earnings quarter only.


OpenAI (Private) Microsoft (NASDAQ: MSFT)
šŸ¤– Operator Launch: Browser-Based AI Agent Powers Everyday Tasks

[Header image: OpenAI’s Operator agent managing multiple holographic browser windows and tasks in a futuristic workspace.]

What The Chip: OpenAI has just unveiled Operator, an AI agent that can use its own browser to perform tasks for users—from ordering groceries to creating memes. Currently a research preview limited to Pro subscribers in the U.S., Operator could soon redefine how people and businesses interact online.

Details:

šŸ¤– Browser Superpowers: Operator is powered by OpenAI’s new Computer-Using Agent (CUA) model, trained to interact with graphical user interfaces and execute tasks like filling out online forms or clicking buttons, with no additional APIs needed (a conceptual sketch of this loop appears after this list).

šŸ† Early Achievements: Despite being in its early stages, it already boasts state-of-the-art results in benchmarks like WebArena and WebVoyager.

āš™ļø Safety & Privacy Focus: Features like ā€œtakeover modeā€ and ā€œwatch modeā€ prompt users to confirm actions before submitting sensitive info, and a dedicated ā€œmonitor modelā€ can step in if suspicious actions arise.

šŸ›’ Real-World Partners: Companies like DoorDash, Instacart, and Uber are testing ways Operator can speed up customer transactions. ā€œOpenAI's Operator is a technological breakthrough that makes processes like ordering groceries incredibly easy,ā€ says Instacart’s Chief Product Officer Daniel Danker.

🌐 Civic Collaboration: City of Stockton’s IT Director, Jamil Niazi, says they’re exploring Operator to simplify city service enrollment. ā€œAs we learn more about Operator during its research preview, we'll be better equipped to identify ways that AI can make civic engagement even easier for our residents.ā€

šŸ” Data Controls: Users can opt out of training data collection, delete browsing history, and log out of all sites with a single click.

ā³ Future Plans: Operator capabilities will expand to ChatGPT Plus, Team, and Enterprise users, and the underlying CUA model may soon be made available via API to let developers build their own browser-based agents.

Why AI/Semiconductor Investors Should Care: OpenAI’s foray into a browser-based AI assistant underscores the growing appetite for AI that can handle practical, everyday tasks without specialized integrations. This opens new demand for cutting-edge chips to support advanced reasoning and vision models, and it positions AI as a critical driver of innovation in e-commerce, service automation, and beyond. Operators like this could become the next major wave of consumer-facing AI, broadening addressable markets and fueling the ongoing AI hardware boom.


SK Hynix (000660.KS)
šŸ† SK Hynix: Sustaining the AI Memory Boom!

[Header image: SK Hynix HBM stacks feeding advanced AI processors, highlighting HBM’s role in AI chips.]

What The Chip: SK Hynix just posted its best-ever quarterly and annual financial results, driven by booming AI memory product demand—particularly for its HBM (High Bandwidth Memory) and enterprise SSDs. Investors see the company leveraging these record-high figures to maintain momentum in the fast-growing AI server market.

Details:

šŸ“ˆ Record Financials: 2024 revenues hit 66.1930 trillion won (up over 21 trillion won from the previous record), with operating profit at 23.4673 trillion won. The company credits robust AI memory shipments for these milestones.

šŸ’¾ HBM Dominance: HBM accounted for more than 40% of total DRAM revenue in the fourth quarter, underlining SK Hynix’s leadership in the high-performance memory crucial for AI workloads.

šŸ”§ Profitability-Focused Strategy: ā€œWhile maintaining the profitability-first commitment, the company will make flexible investment decisions in line with the market situation,ā€ said CFO Kim Woohyun, highlighting a disciplined approach to capacity expansion.

šŸ—ļø Infrastructure & Investment: They’re expanding HBM3E supply, developing HBM4, and constructing new fabs in Cheong-ju (M15X) and Yong-in to meet future demand.

šŸ“‰ Bearish Spot: Conventional memory pricing (DDR4, LPDDR4) faces pressure from sluggish consumer demand and Chinese competitors, with SK Hynix planning to maintain tight supply and shift focus to higher-margin products.

šŸ“ Dividend Hike: Annual fixed dividend is raised 25% to 1,500 won/share as the company rewards shareholders amid booming AI memory business.

šŸ’” Stable Financial Position: By year-end 2024, cash and cash equivalents grew to 14.2 trillion won while debt fell to 22.7 trillion won, significantly improving the debt ratio to 31%.

šŸŒŽ Global AI Growth: Big tech’s large-scale AI server investments are expected to keep pushing demand for advanced memory technology like HBM and high-density server DRAM.

Why AI/Semiconductor Investors Should Care: SK Hynix’s strong results underscore the importance of advanced memory solutions for AI data centers, now and in the years ahead. Their leading HBM and DRAM technologies position them as a key supplier in the AI revolution, potentially offering sustained returns if AI-driven demand holds strong—even as consumer segments remain uncertain.


YouTube Channel - Jose Najarro Stocks
[NEW] Semiconductor Q4 Earnings Book — 60% OFF
X Account - @_Josenajarro

Disclaimer: This article is intended for educational and informational purposes only and should not be construed as investment advice. Always conduct your own research and consult with a qualified financial advisor before making any investment decisions.


The overview above provides key insights every investor should know, but subscribing to the premium tier unlocks deeper analysis to support your Semiconductor, AI, and Software journey. Behind the paywall, you’ll gain access to in-depth breakdowns of earnings reports, keynotes, and investor conferences across semiconductor, AI, and software companies. With multiple deep dives published weekly, it’s the ultimate resource for staying ahead in the market. Support the newsletter and elevate your investing expertise—subscribe today!

[Paid Subscribers] SK Hynix Achieves Record Earnings Driven by AI Memory Momentum

Executive Summary

SK Hynix Inc., a Korea-based semiconductor supplier with core products in Dynamic Random Access Memory (DRAM), NAND flash memory, and CMOS Image Sensors (CIS), reported robust results for its fourth quarter (Q4) and full-year 2024 on January 22, 2025. Notable for its high-performance memory solutions—especially High-Bandwidth Memory (HBM) chips—SK Hynix recorded its best-ever quarterly and annual financial performance, driven by a prolonged surge in artificial intelligence (AI) memory demand and a shift toward advanced DRAM technologies. In Q4 2024, the company’s KRW 19.7670 trillion in revenue marked a 12% quarter-over-quarter (QoQ) increase and a 75% year-over-year (YoY) jump. Meanwhile, full-year 2024 revenue soared to KRW 66.1930 trillion, more than doubling from 2023.

Management attributed these results to three main factors: sustained AI-driven market expansion, a profitability-oriented operational strategy, and a consistent leadership position in HBM technology. The company’s operating margin reached 41% in Q4 2024—up from 40% in Q3—cementing SK Hynix’s status as a premium supplier of high-value memory solutions. Summarizing this momentum, Kim Woohyun, the Vice President and Chief Financial Officer (CFO), noted:

ā€œWith significantly increased portion of high value-added products, SK hynix has built fundamental to achieve sustainable revenues and profits even in times of market correction.ā€

Looking ahead, management remains focused on HBM development, more advanced DRAM (DDR5, LPDDR5) adoption, and a deliberate pivot away from legacy commodity DRAM. However, the company also cited risks in the form of geopolitical uncertainties, protective trade policies, and soft demand in PC and smartphone segments. Despite near-term challenges, SK hynix’s transformation toward a specialized AI memory portfolio appears to position it for continued strong performance as high-bandwidth solutions become standard across data center and enterprise markets.

Growth Opportunities

AI Memory Expansion

This post is for paid subscribers
