Tesla’s stock (TSLA) is crashing by as much as 8% today, just as CEO Elon Musk predicted it would if the company didn’t show profit growth.
As we reported earlier this month, if Tesla stock doesn’t crash this quarter, Tesla will likely be trading at a 500+ P/E after reporting Q1 2025 earnings. The last time Tesla traded at these levels, Musk warned Tesla employees that the stock would get crushed “like a soufflé being smashed by a sledgehammer” if it didn’t show profit growth.
It looks like the market is finally catching up as Tesla’s stock crashed 8% today:
The automaker’s valuation has now dipped back below $1 trillion.
On the positive side, Tesla launched a new FSD update in China today. The automaker will likely use that to justify recognizing some deferred revenue, but it’s not all positive, as the update has received mixed reviews.
Electrek’s Take
I think the main factor weighing on Tesla’s stock is the anticipation of lower earnings. Even with today’s 8% crash, Tesla’s stock is still trading at a price-to-earnings ratio of around 150, and that’s with the Bitcoin gain last quarter.
If Tesla’s stock doesn’t fall further this quarter, with earnings expected to drop in Q1 due to much lower deliveries, the P/E would likely shoot back up to 300+.
In comparison, an automaker like Toyota trades at a P/E of 7, and a technology company like Meta trades at a P/E of 40.
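To make the P/E math concrete, here is a minimal sketch of how the ratio moves when earnings fall; the share price and EPS inputs are illustrative placeholders, not actual TSLA financials.

```python
# Minimal price-to-earnings arithmetic using ballpark figures from the
# article. The share price and EPS below are illustrative, not TSLA's
# actual financials.

def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    """P/E = price per share divided by trailing earnings per share."""
    return share_price / earnings_per_share

price = 250.0          # illustrative share price
eps = price / 150      # back out an EPS consistent with a ~150 P/E

print(f"Current P/E: ~{pe_ratio(price, eps):.0f}")                        # ~150
# If earnings fall by half while the share price holds, the P/E doubles.
print(f"P/E after a 50% earnings drop: ~{pe_ratio(price, eps / 2):.0f}")  # ~300
```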
Price-to-earnings ratios this extreme basically never hold, and they certainly don’t hold when earnings are falling, which is what is happening with Tesla:
As you can see from this chart, the stock seems to only be starting to realize that it’s disconnected from its earnings, and it still has quite a bit of catching up to do.
I never thought I’d find myself cheering for Tesla’s stock to continue crashing, but I feel like it’s the only way to save the company now, as the board and shareholders don’t care about anything else. Tesla’s stock crashing is the only way to get them to care about removing Elon Musk.
I expect the stock to continue to crash in the coming weeks as analysts adjust their delivery expectations and then their earnings expectations for Q1. The consensus appears to still be over 400,000 deliveries in Q1, but it looks like it could come in below that.
Shareholders are hoping that Tesla’s planned launch of a robotaxi fleet in Austin in June will turn things around for the stock, but as I previously reported, that’s Elon moving the goalposts – although it’s likely that large parts of the market don’t realize it.
Metro Detroit is about to get a big boost of fast EV chargers, with more than 40 new ChargePoint ports set to come online across multiple sites owned by the Dabaja Brothers Development Group.
The first ultra-fast charging site just opened in Canton, Michigan. It’s owned and operated by Dabaja Brothers, who plan to follow it with additional ChargePoint-equipped locations in Dearborn and Livonia.
“We started this project because we saw a gap in our community – there was almost nowhere to charge an EV in Canton, and a similar lack of charging across metro Detroit,” said Yousef Dabaja, owner/operator at Dabaja Brothers.
Each metro Detroit site will feature ChargePoint Express Plus fast charging stations, which can deliver up to 500 kW to a single port, can fast-charge two vehicles at the same time, and are compatible with all EVs. The stations feature a proprietary cooling system that sustains peak charging speeds throughout a session.
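ChargePoint doesn’t publish how the Express Plus cabinets allocate power when both ports are in use, but as a rough sketch of the trade-off, here’s a naive even-split model; the 500 kW shared cabinet budget and the split logic are my assumptions for illustration.

```python
# Naive model of how a dual-port DC fast-charging cabinet might share power.
# The article confirms up to 500 kW to a single port and two-vehicle
# charging; the 500 kW shared cabinet budget and the even split are
# assumptions for illustration, not ChargePoint's published behavior.

CABINET_BUDGET_KW = 500
PORTS = 2

def per_vehicle_kw(vehicles_charging: int) -> float:
    """Power each vehicle receives under a simple even split."""
    if vehicles_charging <= 0:
        return 0.0
    return CABINET_BUDGET_KW / min(vehicles_charging, PORTS)

print(per_vehicle_kw(1))  # 500.0 -> one vehicle can take the full output
print(per_vehicle_kw(2))  # 250.0 -> two vehicles split the cabinet budget
```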
The stations operate on the new ChargePoint Platform, which enables operators to monitor performance, adjust pricing, troubleshoot issues, and gain real-time insights to keep chargers running smoothly.
Rick Wilmer, CEO at ChargePoint, said, “This initiative will rapidly infill the ‘fast charging deserts’ across the Detroit area, allowing drivers to quickly recharge their vehicles when and where they need to.”
Mercedes-Benz High-Power Charging and Starbucks have officially opened their first DC fast charging hub together, off the I-5 in Red Bluff, California.
The 400 kW Mercedes-Benz chargers are capable of adding up to 300 miles in 10 minutes, depending on the EV, and every stall has both NACS and CCS cables – they’re fully open DC fast chargers.
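As a quick back-of-envelope check on that claim, assume a vehicle that can hold the full 400 kW for the entire session and an efficiency of roughly 4.5 miles per kWh; both assumptions are optimistic, since real charge curves taper off.

```python
# Back-of-envelope check of "up to 300 miles in 10 minutes" at 400 kW.
# Assumes the vehicle sustains the full 400 kW for the whole session and
# achieves ~4.5 mi/kWh; both assumptions are optimistic in practice.

power_kw = 400
session_minutes = 10
efficiency_mi_per_kwh = 4.5   # assumed efficiency for a very efficient EV

energy_added_kwh = power_kw * session_minutes / 60
range_added_mi = energy_added_kwh * efficiency_mi_per_kwh

print(f"Energy added: ~{energy_added_kwh:.1f} kWh")   # ~66.7 kWh
print(f"Range added:  ~{range_added_mi:.0f} miles")   # ~300 miles
```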
Mercedes-Benz HPC North America, a joint venture between subsidiaries of Mercedes-Benz Group and renewable energy producer MN8 Energy, first announced in July 2024 that it would install DC fast chargers at Starbucks stores along Interstate 5, the main 1,400-mile north-south interstate highway on the US West Coast from Canada to Mexico. Ultimately, Mercedes plans to install fast chargers at 100 Starbucks stores across the US.
Mercedes-Benz HPC opened its first North American charging site at Mercedes-Benz USA’s headquarters in Sandy Springs, Georgia, in November 2023 as part of an initial $1 billion charging network investment. As of the end of 2024, Mercedes had deployed over 150 operational fast chargers in the US, but it hasn’t disclosed an official number of how many chargers are currently online.
Andrew Cornelia, CEO of Mercedes-Benz HPC North America, is leaving the company at the end of the month to become global head of electrification & sustainability at Uber.
The race for autonomous driving has three fronts: software, hardware, and regulatory. For years, we’ve watched Tesla try to brute-force its way to “Full Self-Driving (FSD)” with its own custom hardware, while the rest of the automotive industry is increasingly lining up behind NVIDIA.
Here’s a table comparing the two chips, using the best specs I could find; greentheonly’s teardown was particularly useful. If you spot anything inaccurate, please don’t hesitate to reach out:
| Feature / Specification | Tesla AI4 (Hardware 4.0) | NVIDIA Drive Thor (AGX / Jetson) |
| --- | --- | --- |
| Developer / Architect | Tesla (in-house) | NVIDIA |
| Manufacturing Process | Samsung 7nm (7LPP class) | TSMC 4N (custom 5nm class) |
| Release Status | In production (shipping since 2023) | In production since 2025 |
| CPU Architecture | ARM Cortex-A72 (legacy) | ARM Neoverse V3AE (server-grade) |
| CPU Core Count | 20 cores (5× clusters of 4 cores) | 14 cores (Jetson T5000 configuration) |
| AI Performance (INT8) | ~100–150 TOPS (dual-SoC system) | 1,000 TOPS (per chip) |
| AI Performance (FP4) | Not supported / not disclosed | 2,000 TFLOPS (per chip) |
| Neural Processing Unit | 3× custom NPU cores per SoC | Blackwell Tensor Cores + Transformer Engine |
| Memory Type | GDDR6 | LPDDR5X |
| Memory Bus Width | 256-bit | 256-bit |
| Memory Bandwidth | ~384 GB/s | ~273 GB/s |
| Memory Capacity | ~16 GB typical system | Up to 128 GB (Jetson Thor) |
| Power Consumption | Est. 80–100 W (system) | 40 W – 130 W (configurable) |
| Camera Support | 5 MP proprietary Tesla cameras | Scalable, supports 8MP+ and GMSL3 |
| Special Features | Dual-SoC redundancy on one board | Native Transformer Engine, NVLink-C2C |
The most striking difference right off the bat is the manufacturing process. NVIDIA is throwing everything at Drive Thor, using TSMC’s cutting-edge 4N process (a custom 5nm-class node). This allows them to pack in the new Blackwell architecture, which is essentially the same tech powering the world’s most advanced AI data centers.
Tesla, on the other hand, pulled a move that might surprise spec-sheet warriors. Teardowns confirm that AI4 is built on Samsung’s 7nm process. This is mature, reliable, and much cheaper than TSMC’s bleeding-edge nodes.
When you look at the compute power, NVIDIA claims a staggering 2,000 TFLOPS for Thor. But there’s a catch. That number uses FP4 (4-bit floating point) precision, a new format designed specifically for the Transformer models used in generative AI.
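Because the headline numbers use different precisions, a rough normalization helps. The sketch below compares both chips at INT8 using the table’s figures; the assumption that FP4 roughly doubles per-core throughput over INT8 is inferred from NVIDIA’s own published numbers (2,000 FP4 TFLOPS vs. 1,000 INT8 TOPS), not independently verified.

```python
# Rough apples-to-apples comparison of the headline compute figures.
# Numbers come from the table above; treat them as vendor claims and
# estimates, not benchmarks.

thor_fp4_tflops = 2000   # NVIDIA's headline figure at FP4 precision
thor_int8_tops = 1000    # NVIDIA's figure at INT8 precision
ai4_int8_tops = 150      # upper end of the estimate for Tesla's dual-SoC board

# Same precision (INT8) on both sides: roughly a 7x gap.
print(f"INT8 vs. INT8: ~{thor_int8_tops / ai4_int8_tops:.0f}x")

# Mixing precisions (Thor FP4 vs. AI4 INT8) inflates the gap to ~13x,
# because FP4 roughly doubles per-core throughput on Blackwell.
print(f"FP4 vs. INT8 (not comparable): ~{thor_fp4_tflops / ai4_int8_tops:.0f}x")
```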
Tesla’s AI4 is estimated to hit around 100–150 TOPS (INT8) across its dual-SoC redundant system. On paper, it looks like a slaughter, but Tesla made a very specific engineering trade-off that tells us exactly what was bottlenecking its software: memory bandwidth.
Tesla switched from LPDDR4 in HW3 to GDDR6 in HW4, the same power-hungry memory you find in gaming graphics cards (GPUs). This gives AI4 a massive memory bandwidth of approximately 384 GB/s, compared to Thor’s 273 GB/s (on the single-chip Jetson config) using LPDDR5X.
This suggests Tesla’s vision-only approach, which ingests massive amounts of raw video from high-res cameras, was starving for data.
Based on Elon Musk’s comments that Tesla’s AI5 chip will have 5x the memory bandwidth, it sounds like memory bandwidth might still be Tesla’s bottleneck.
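The HW4 bandwidth figure checks out from the bus width if you assume a mainstream GDDR6 pin rate; the 12 Gbps speed grade and the AI5 projection below are assumptions, not confirmed specs.

```python
# Sanity check of the HW4 memory bandwidth figure from the table.
# The 256-bit bus is from the teardown; the 12 Gbps GDDR6 pin rate is
# an assumed (typical) speed grade, not a confirmed spec.

bus_width_bits = 256
gddr6_pin_rate_gbps = 12      # assumed per-pin data rate

hw4_bandwidth_gb_s = bus_width_bits * gddr6_pin_rate_gbps / 8
print(f"HW4 GDDR6 bandwidth: ~{hw4_bandwidth_gb_s:.0f} GB/s")    # ~384 GB/s

# If AI5 really delivers 5x the memory bandwidth, as Musk has suggested:
ai5_bandwidth_tb_s = 5 * hw4_bandwidth_gb_s / 1000
print(f"Implied AI5 bandwidth: ~{ai5_bandwidth_tb_s:.1f} TB/s")  # ~1.9 TB/s
```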
Here is where Tesla’s cost-cutting really shows. AI4 is still running on ARM Cortex-A72 cores, an architecture that is nearly a decade old. They bumped the core count to 20, but it’s still old tech.
NVIDIA Thor, meanwhile, uses the ARM Neoverse V3AE, a server-grade CPU explicitly designed for the modern software-defined vehicle. This allows Thor to run not just the autonomous driving stack, but the entire infotainment system, dashboard, and potentially even an in-car AI assistant, all on one chip.
Thor has found many takers among Tesla’s EV competitors, including BYD, Zeekr, Lucid, Xiaomi, and others.
Electrek’s Take
There’s one thing that is not in the table: price. I would assume Tesla wins on that front, and that was a big part of the point of the project: Tesla needed a chip that didn’t exist, so it built one.
It was an impressive feat, but it doesn’t make Tesla a clear leader in silicon for self-driving.
Tesla is already maxing out AI4. FSD now uses both SoCs for compute, making it less likely to achieve the redundancy levels needed to deliver Level 4-5 autonomy.
Meanwhile, there is still no solution for HW3, and AI5 apparently isn’t coming to save the day until 2027.
By then, there will likely be millions of vehicles on the road with NVIDIA Thor processors.