Micron Will Be More Profitable Than Tesla. Nobody's Prepared For It.
- Rebellionaire Staff
- 2 hours ago
- 6 min read
We're going to say something that will probably raise a few eyebrows, but we genuinely believe it: Micron is going to earn more net income in its next quarter alone than Tesla will in any single quarter through 2030, even if Robotaxi and Optimus are flying high and everything goes perfectly.
Before you slam the tab shut: we're Tesla enthusiasts. We love Robotaxi, we think Optimus has huge potential, and we're 100% convinced that Tesla is going to blast off to even greater heights. This isn't a hit piece on Tesla at all - it's just an attempt to put Micron's numbers into perspective, and those numbers are genuinely shocking.
The Commodity That's About to Become Anything But
For as long as anyone can remember, memory has been a commodity that's just as brutal as it is cyclical. There are only three players in the game - SK Hynix, Samsung, and Micron - and they supply almost all the world's memory. When the good times roll, margins go through the roof. But when the bad times hit, demand dries up, margins plummet, and those expensive factories become more of a liability than an asset.
Investors are all too familiar with this pattern. That's why Micron currently trades at a forward PE ratio of about 4. Which is a fancy way of saying: if Micron stays on its current earnings track and pays out every penny of profit as a dividend, you'd get your entire investment back in four years and still own the company.
Which is pretty incredible — and says a lot about investor expectations. They're clearly not convinced the good times will roll on.
We think they're wrong. Here's why.
The Real Shortage — and Just How Deep It Is
This isn't a minor shortage, and it isn't theoretical. The evidence is mounting.
Elon Musk recently and very publicly said he'd buy every bit of memory Micron could possibly supply. OpenAI reportedly tried to corner the market, buying up as much capacity as it could from one of the major producers. A Google exec got the boot, reportedly for failing to secure enough memory for the company's TPU v7 AI processors. Microsoft is struggling to meet its own memory needs, and that's limiting its ability to scale compute.
And then there's Micron's own earnings call, where they said customers are only getting two-thirds of the memory they're asking for. That's the reality right now — before Vera Rubin has even scaled, and before AI5 chips start showing up in Tesla vehicles and Optimus robots.
The Memory Conundrum — and Why Nobody Saw It Coming
There's something that's been flying under the radar — the way the game is changing when it comes to memory.
It's not just that we're deploying more compute. It's that the amount of memory needed per unit of compute is skyrocketing at the same time.
Look at the data center trend. The H100, the chip that triggered Nvidia's historic run in 2023, had about 0.6 terabytes of memory. The H200 doubled that. The B200 pushed to 1.44. The Vera Rubin? 20.7 terabytes. That's more than 30 times the memory of the H100 in just a few years.
Or put another way: the H100 needed about 62 terabytes of memory per megawatt of data center capacity. Vera Rubin needs 401. Energy capacity is the main constraint on data center buildout right now — and for every unit of that constrained energy, you need more than six times as much memory as you did with the previous generation.
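A quick back-of-the-envelope check using only the figures quoted above. Treat these as the article's claims, not verified hardware specs:

```python
# Memory scaling across chip generations, per the figures in the text.
mem_tb = {            # total memory per system, in terabytes
    "H100": 0.6,
    "H200": 1.2,
    "B200": 1.44,
    "Vera Rubin": 20.7,
}

# Generation-over-generation growth vs. the H100 baseline
for chip, tb in mem_tb.items():
    print(f"{chip:>10}: {tb:5.2f} TB  ({tb / mem_tb['H100']:.1f}x the H100)")

# Memory intensity per megawatt of data-center capacity (TB/MW),
# again using the article's own figures.
h100_tb_per_mw = 62
rubin_tb_per_mw = 401
print(f"Memory per MW: {rubin_tb_per_mw / h100_tb_per_mw:.1f}x the H100 generation")
```

The second ratio is the one that matters: energy is the binding constraint, so memory demand per megawatt is the number that compounds.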
Same story at the edge. Tesla's Hardware 3 chip had 8 gigabytes of RAM — not enough, which is part of why early FSD would lose track of a pedestrian who walked behind a car. Hardware 4 doubled it to 16. AI5, which will power every future Tesla and every Optimus robot, jumps to 144 gigabytes. Nine times the memory of the previous generation.
Scale that across millions of vehicles and robots and the memory demand is staggering. Which is exactly why Elon made that comment about buying everything Micron can make.
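To put a rough number on "staggering," here's a sketch using the chip figures above and the article's own fleet assumptions (roughly 4 million vehicles a year from the Tesla model later in this piece, plus the 6.4 million annualized Optimus units). Purely illustrative:

```python
# Back-of-the-envelope edge memory demand. Fleet sizes come from the
# article's Tesla model and are assumptions, not company guidance.
ai5_ram_gb = 144          # per AI5 chip, from the text
hw4_ram_gb = 16           # Hardware 4, from the text

print(f"AI5 vs HW4: {ai5_ram_gb / hw4_ram_gb:.0f}x the memory")  # 9x

# Hypothetical annual fleet: ~4M vehicles + 6.4M Optimus units,
# one AI5 chip each.
units_per_year = 4_000_000 + 6_400_000
total_pb = units_per_year * ai5_ram_gb / 1_000_000  # GB -> PB
print(f"Annual edge memory demand: ~{total_pb:,.0f} petabytes")
```

Roughly 1,500 petabytes a year of new edge memory, under those assumptions, on top of the data center buildout.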
The Earnings That Nobody Believed
Micron's most recent quarter was one of the most remarkable we've ever seen from any company.
A year ago: $8 billion in quarterly revenue, $1.41 in earnings per share. This quarter: $24 billion in revenue, gross margins more than doubled to 74.4%, EPS of $12. Then they guided next quarter for $33.5 billion in revenue, 81% gross margins, and around $19 in earnings per share.
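Those guidance numbers hang together. A quick cross-check, where the diluted share count (~1.2 billion) is our own ballpark assumption, not a company figure:

```python
# Cross-check Micron's guided quarter using the article's figures.
revenue_b = 33.5        # guided revenue, $B
gross_margin = 0.81     # guided gross margin
eps = 19.0              # guided earnings per share, $
shares_b = 1.2          # ASSUMED diluted share count, billions

gross_profit_b = revenue_b * gross_margin
net_income_b = eps * shares_b

print(f"Gross profit: ~${gross_profit_b:.1f}B")
print(f"Implied net income: ~${net_income_b:.1f}B")  # ~$23B
```

That implied net income figure is the one the Tesla comparison below turns on.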
Those 81% gross margins would put Micron ahead of Nvidia — a company the market treats as one of the great tech businesses of our era. And yet the market is treating Micron like a broken commodity cyclical. It's jarring.
Analyst after analyst raised their price targets. Earnings estimates went up across the board. The stock dropped over $100 in the following two weeks. That reaction is the market screaming "we don't believe it."
The Tesla Comparison That Left Us Speechless
We built a financial model for Tesla out to Q4 2030, with what we thought were reasonably optimistic assumptions: just under a million vehicles delivered per quarter, 3.8 million Robotaxis generating $20 billion in revenue at 80% gross margins, and 1.6 million Optimus units per quarter at $30,000 a pop.
That's a heck of a lot of execution. TerraFab has to work. Optimus has to ramp to 6.4 million units annualized. Two entirely new industries have to exist and scale — because TSMC and Samsung aren't going to hand Tesla 10 million chips a year. It's an exciting vision. One we believe in.
Under those assumptions, Tesla generates $17.3 billion in net income in Q4 2030.
Micron is guiding for $23 billion in net income next quarter. No new industries. No factories that haven't been announced. You can see for miles on this one.
Why the Bear Case Falls Flat
The immediate counter is: it's memory, it's cyclical, it's all going to end in tears. High margins are the top of the cycle, not some new baseline. And you can trace that history all the way back through every previous memory boom.
If you're reasoning from history, that makes sense. Every memory cycle has ended in a crash. Why should this one be different?
Because when you look at the underlying demand rather than just following the historical pattern, things look very different.
Where does AI inference demand go from here? In the US, AI inference usage grew 5x last year. Data centers are still being built everywhere. The memory required per unit of compute keeps climbing with every new chip generation. And the factories to supply that memory can't keep up; they take years to build.
Even Micron's CEO said the shortage isn't going away anytime soon. When Google published research showing a big reduction in the memory needed for a given AI task, the market panicked, the same way the DeepSeek scare briefly cratered Nvidia. But that's not how AI demand works. Efficiency gains mean you can do more, not less. That's Jevons paradox: inference got dramatically more efficient last year, and usage still grew 5x. The market is showing you that in real time.
The Numbers — If You're Willing to Believe Them
Conservative assumptions: margins come back down, growth slows to a more normal pace. We still get to around $100 in earnings per share by 2027. At a PE of 10, which is low for any company on this kind of growth trajectory, that's a share price of roughly $1,000, against $327 today. That's roughly a 3x return in a couple of years, and it only asks the multiple to re-rate from 4 to a still-unremarkable 10.
The PEG ratio for Micron right now is around 0.05. For a growth stock you're usually looking at 1 to 1.5. Micron is at a twentieth of that.
The base case gets you to 3x. The bull case — if the market eventually re-rates this as a secular growth story rather than a commodity cycle — gets you to $2,000 or higher.
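For the skeptical, here's the arithmetic above spelled out. The growth rate used in the PEG line is the one implied by the article's own figures, not a forecast of ours:

```python
# Valuation arithmetic from the section above. All inputs are the
# article's figures; none of this is a prediction.
price_today = 327.0
eps_2027 = 100.0        # article's conservative 2027 EPS
pe_assumed = 10.0

target = eps_2027 * pe_assumed
print(f"Implied price: ${target:,.0f}  ({target / price_today:.1f}x today)")

# PEG = PE / earnings growth rate (in %). A forward PE of 4 paired
# with ~80% expected growth gives the article's PEG of about 0.05.
forward_pe = 4.0
growth_pct = 80.0       # growth rate implied by the article's PEG figure
peg = forward_pe / growth_pct
print(f"PEG: {peg:.2f}")  # 0.05
```

The point of the PEG line: even if growth comes in at a fraction of that 80%, the ratio still lands far below the 1 to 1.5 range typical of growth stocks.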
One Last Thought
If you're following this channel, odds are you think like we do. You believe in the future Elon Musk has laid out. Robotaxis. Humanoids. An AI-abundant world.
If that future happens, someone has to supply the memory to run it. Right now, that someone is trading at 4x earnings.
We own Micron for clients. Not financial advice. Do your own research. There may be mistakes in here. Trade at your own risk — for entertainment purposes only.

