

A jury has found Tesla not at fault in a lawsuit over a 2019 wrongful death which alleged that Autopilot caused a crash, killing one passenger and seriously injuring two.

In question was the death of 37-year-old Micah Lee, who was driving a Model 3 in 2019 in Menifee, CA (in the Inland Empire to the east of Los Angeles), and hit a palm tree at approximately 65 miles per hour, causing his death and the injury of two passengers, including an 8-year-old boy. The lawsuit was brought by the passengers.

The lawsuit alleged that Tesla knowingly marketed unsafe experimental software to the public, and that safety defects within the system led to the crash (in particular, a specific steering issue that was known by Tesla). Tesla responded that the driver had consumed alcohol (the driver’s blood alcohol level was at .05%, below California’s .08% legal limit) and that the driver is still responsible for driving when Autopilot is turned on.

A survivor in the vehicle at the time of the accident claimed that Autopilot was turned on at the time of the crash.

Tesla disputed this, saying it was unclear whether Autopilot was turned on – a departure from its typical modus operandi, which involves pulling vehicle logs and stating definitively whether and when Autopilot was on or off. Though such definitive claims have sometimes been made even when Autopilot disengaged just moments before a crash, when avoidance was no longer possible for the driver.

After four days of deliberations, the jury decided in Tesla’s favor, with a 9-3 decision that Tesla was not culpable.

While Tesla won an Autopilot injury lawsuit before, in April of this year, this is the first resolved lawsuit involving a death. That earlier lawsuit turned on the same reasoning – that drivers are still responsible for what happens behind the wheel while Autopilot or Full Self-Driving is engaged (despite the name of the latter system suggesting otherwise). Full Self-Driving was not publicly available at the time of Lee’s crash, though he had purchased the system for $6,000 expecting it to become available in the future.

Both of Tesla’s driver-assist systems are “level 2” on the SAE’s driving automation scale, like most other new systems on the market these days. Although Autopilot is intended for highway use, Tesla’s FSD system can be activated in more situations than most competing systems. But at no point does the car assume responsibility for driving – that responsibility always lies with the driver.

After the trial began last month, Tesla CEO Elon Musk made a notable comment during his disastrous appearance on Tesla’s Q3 conference call. He was asked whether and when Tesla would accept legal liability for autonomous drive systems, as Mercedes has just started doing with its Level 3 DRIVE PILOT system, the first of its kind in the US (read about our test drive of it in LA). Musk responded saying:

Well, there’s a lot of people that assume we have legal liability judging by the lawsuits. We’re certainly not being let that off the hook on that front, whether we’d like to or wouldn’t like to.

Elon Musk, CEO, Tesla

Later in the answer, Musk called Tesla’s AI systems “baby AGI.” AGI is an acronym for “artificial general intelligence,” a theorized technology in which computers become good enough at all tasks to replace a human in essentially any situation, not just specialized ones. In short, it’s not what Tesla has, and it has nothing to do with the question.

Tesla is indeed currently facing several lawsuits over injuries and deaths that have happened in its vehicles, many alleging that Autopilot or FSD are responsible. In one, Tesla tried to argue in court that Musk’s recorded statements on self-driving “might have been deep fakes.”

We also learned recently, at the release of Musk’s biography, that he wanted to use Tesla’s in-car camera to spy on drivers and win Autopilot lawsuits. Though that was apparently not necessary in this case.

Electrek’s Take

Questions like the one asked in this trial are interesting and difficult to answer, because they pit legal liability against marketing materials and public perception.

Tesla is quite clear in official communications, like in operating manuals, in the car’s software itself, and so on, that drivers are still responsible for the vehicle when using Autopilot. Drivers accept agreements as such when first turning on the system.

Or at least, I think they do, since the first time I accepted it was so long ago. And that is the rub. People are also used to accepting long agreements whenever they turn on any system or use any piece of technology, and nobody reads those. Sometimes, these terms even include legally unenforceable provisions, depending on the venue in question.

And then, in terms of public perception, marketing, and in how Tesla has deliberately named the system, there is a view that Tesla’s cars really can drive themselves. Here’s Tesla explicitly saying “the car is driving itself” in 2016.

We here at Electrek, and our readership, know the difference between all of these concepts. We know that “Full Self-Driving” was (supposedly) named that way so that people can buy it ahead of time and eventually get access to the system when it finally reaches full self-driving capability (which should happen, uh, “next year”… in any given year). We know that “Autopilot” is meant to be a reference to how it works in airplanes, where a pilot is still required in the seat to take care of tasks other than cruising steadily. We know that Tesla only has a level 2 system, and that drivers still accept legal responsibility.

But when the general public gets a hold of technology, they tend to do things that you didn’t expect. That’s why caution is generally favorable when releasing experimental things to the public (and, early on, Tesla used to do this – giving early access to new Autopilot/FSD features to trusted beta testers, before wide release).

Despite being told before activating the software, and reminded often while the software is on, that the driver must keep their hands on the wheel, we all know that drivers don’t do that – and that drivers pay less attention when the system is activated than when it isn’t. Studies have shown as much.

And so, while the jury found (probably correctly) that Tesla is not liable here, and while this is a good reminder to all Tesla drivers to keep paying attention to the road while Autopilot/FSD is on (you are still driving, so act like it), we still think there is room for discussion about Tesla doing a better job of ensuring attention. For example, it just rolled out a driver attention monitoring feature using the cabin camera – six years after it started including those cameras in the Model 3.

FTC: We use income earning auto affiliate links. More.


Elon Musk admits other automakers don’t want to license Tesla’s ‘Full Self-Driving’


After years of teasing that other automakers would license Tesla’s Full Self-Driving (FSD) system, Elon Musk has now admitted that no other automakers want to license it.

“They don’t want it!” he says.

For years, the bull case for Tesla (TSLA) has relied heavily on the idea that the company isn’t just an automaker, but an “AI and robotics company”, with its first robot product being an autonomous car.

CEO Elon Musk pushed the theory further, arguing that Tesla’s lead in autonomy was so great that legacy automakers would eventually have no choice but to license Full Self-Driving (FSD) to survive.


Back in early 2021, during the Q4 2020 earnings call, Musk first claimed that Tesla had “preliminary discussions” with other automakers about licensing the software. He reiterated this “openness” frequently, famously tweeting in June 2023 that Tesla was “happy to license Autopilot/FSD or other Tesla technology” to competitors.  

The speculation peaked in April 2024, when Musk explicitly stated that Tesla was “in talks with one major automaker” and that there was a “good chance” a deal would be signed that year.  

We now know that deal never happened. And thanks to comments from Ford CEO Jim Farley earlier this year, we have a good idea why. Farley, who was likely the other party in those “major automaker” talks, publicly shut down the idea of using FSD, stating clearly that “Waymo is better”.

Now, Musk appears to have given up on the idea of licensing Tesla FSD. In a post on X late last night, Musk acknowledged that discussions with other automakers have stalled, claiming that they asked for “unworkable requirements” for Tesla.

The CEO wrote:

“I’ve tried to warn them and even offered to license Tesla FSD, but they don’t want it! Crazy …

When legacy auto does occasionally reach out, they tepidly discuss implementing FSD for a tiny program in 5 years with unworkable requirements for Tesla, so pointless.”

Translate “unworkable requirements” from Musk-speak into automotive industry standards, and it becomes clear what happened: automakers demanded a system that does what it says it does – drive autonomously – and that means something different to Tesla.

Legacy automakers generally follow a “V-model” of validation. They define requirements, test rigorously, and validate safety before release. When Mercedes-Benz released its Drive Pilot system, a true Level 3 system, they accepted full legal liability for the car when the system is engaged.

In contrast, Tesla’s “aggressive deployment” strategy relies on releasing “beta” (now “Supervised”) software to customers and using them to validate the system. This approach has led to a litany of federal investigations and lawsuits.

Just this month, Tesla settled the James Tran vs. Tesla lawsuit just days before trial. The case involved a Model Y on Autopilot crashing into a stationary police vehicle, a known issue with Tesla’s system for years. By settling, Tesla avoided a jury verdict, but the message to the industry was clear: even Tesla knows it risks losing these cases in court.

Meanwhile, major automakers, such as Toyota, have partnered with Waymo to integrate its autonomous driving technology into their consumer vehicles.

Electrek’s Take

“Unworkable requirements for Tesla” is an instant Musk classic. What were these requirements that Tesla couldn’t meet? That the system wouldn’t crash into stationary objects on the highway, such as emergency vehicles?

How dare they request something that crazy?

No Ford or GM executive is going to license a software stack that brings that kind of liability into their house. If they license FSD, they want Tesla to indemnify them against crashes. Tesla, knowing the current limitations of its vision-only system, likely refused.

To Musk, asking him to pay for FSD’s mistakes is an “unworkable requirement.” In his view, it’s always driver error – and the hyperbole he uses to claim superhuman levels of safety has nothing to do with how users end up abusing his poorly named driver-assistance systems.



CPSC warns Rad Power Bikes owners to stop using select batteries immediately due to fire risk


In an unprecedented move, the US Consumer Product Safety Commission (CPSC) has issued a public safety warning urging owners of certain Rad Power Bikes e-bike batteries to immediately stop using them, citing a risk of fire, explosion, and potentially serious injury or death.

The warning, published today, targets Rad’s lithium-ion battery models RP-1304 and HL-RP-S1304, which were sold with some of the company’s most popular e-bikes, including the RadWagon 4, RadRunner 1 and 2, RadRunner Plus, RadExpand 5, RadRover 5 series, and RadCity 3 and 4 models. Replacement batteries sold separately are also included.

According to the CPSC, the batteries “can unexpectedly ignite and explode,” particularly when exposed to water or debris. The agency says it has documented 31 fires linked to the batteries so far, including 12 incidents of property damage totaling over $734,000. Alarmingly, several fires occurred when the battery wasn’t charging or when the bike wasn’t even in use.

Complicating the situation further, Rad Power Bikes – already facing significant financial turmoil – has “refused to agree to an acceptable recall,” according to the CPSC. The company reportedly told regulators it cannot afford to replace or refund the large number of affected batteries. Barely two weeks before the CPSC issued this safety notice, Rad had informed employees that it could be forced to shut down permanently in January if it cannot secure new funding.


For its part, Rad pushed back strongly on the CPSC’s characterization. A Rad Power Bikes spokesperson explained in a statement to Electrek that the company “stands behind our batteries and our reputation as leaders in the ebike industry, and strongly disagrees with the CPSC’s characterization of certain Rad batteries as defective or unsafe.”

The company explained that its products meet or exceed stringent international safety standards, including UL-2271 and UL-2849, which are standards that the CPSC has proposed as a requirement but not yet implemented. Rad says its batteries have been repeatedly tested by reputable third-party labs, including during the CPSC investigation, and that those tests confirmed full compliance. Rad also claims the CPSC did not independently test the batteries using industry-accepted standards, and stresses that the incident rate cited by the agency represents a tiny fraction of a percent. While acknowledging that any fire report is serious, Rad maintains that lithium-ion batteries across all industries can be hazardous if damaged, improperly used, or exposed to significant water intrusion, and that these universal risks do not indicate a defect specific to Rad’s products.

The company says it entered the process hoping to collaborate with federal regulators to improve safety guidance and rider education, and that it offered multiple compromise solutions – including discounted upgrades to its newer Safe Shield batteries, a legitimate leap forward in battery safety for the industry – but the CPSC rejected them. Rad argues that the agency instead demanded a full replacement program that would immediately bankrupt the company, leaving customers without support. It also warns that equating new technology with older products being “unsafe” undermines innovation, noting that the introduction of safer systems, such as anti-lock brakes, doesn’t retroactively deem previous generations faulty. Ultimately, Rad says clear, consistent national standards are needed so manufacturers can operate with confidence while continuing to advance battery safety.

Lithium-ion battery fires have become a growing concern across the US and internationally, with poorly made packs implicated in a rising number of deadly incidents.

While Rad Power Bikes states that no injuries or fatalities have been tied to these specific models, the federal warning marks one of the most serious e-bike battery advisories issued to date – and arrives at a moment when the once-dominant US e-bike brand is already fighting for survival.


Rivian’s e-bike brand launches $250 smart helmet with breakthrough safety tech and lights


ALSO, the new micromobility brand spun out of Rivian, just announced official pricing for its long-awaited Alpha Wave helmet. The smart helmet, which introduces a brand-new safety tech called the Release Layer System (RLS), is now listed at $250, with “notify for pre-order” now open on ALSO’s site. Deliveries are expected to begin in spring 2026.

The $250 price point might sound steep, but ALSO is positioning the Alpha Wave as a top-tier lid that undercuts other premium smart helmets with similar tech – some of which push into the $400–500 range. That’s because the Alpha Wave is promising more than just upgraded comfort and design. The company claims the helmet will also deliver a significant leap in rotational impact protection.

RLS consists of four internal panels engineered to release on impact, helping dissipate rotational energy – a major factor in many concussions. It’s being marketed as a next-gen alternative to MIPS and similar technologies, and could signal a broader shift in helmet safety standards if adopted widely.

Beyond protection, the Alpha Wave also packs a surprising amount of tech. Four wind-shielded speakers and two noise-canceling microphones are built in for taking calls, playing music, or following navigation prompts. And when paired with ALSO’s own TM-B electric bike, the helmet integrates with the bike’s onboard lighting system for synchronized rear lights and 200-lumen forward visibility.


The helmet is IPX6-rated for water resistance and charges via USB-C, making it easy to keep powered up alongside other modern gear.

Electrek’s Take

This helmet pushes the smart gear envelope. $250 isn’t nothing, but for integrated lighting, audio, and what might be a true leap forward in crash protection, it’s priced to shake things up in the high-end helmet space.

One area I’m not a huge fan of is the paired front and rear lights. Cruiser motorcycles have this same issue, with paired tail lights mounted close together sometimes being mistaken for a conventional four-wheeled vehicle farther away. I worry that the paired “headlights” and “taillights” of this helmet could be mistaken for a car farther down the road instead of the reality of a much closer cyclist. But hey, we’ll have to see.

The tech is pretty cool though, and if the RLS system holds up to its promise, we might be looking at the new bar for premium e-bike head protection.
