I just came back from driving about 200 miles (320 km) using Tesla’s (Supervised) Full Self-Driving, and the system is getting better, but it’s also getting more dangerous precisely because it’s getting better.

The risk of complacency is scary.

Last weekend, I went on a road trip that covered about 200 miles from Shawinigan to Quebec City and back, and I used Tesla’s (Supervised) Full Self-Driving (FSD), v12.5.4.1 to be precise, on almost the entire trip.

Here’s the good and the bad, and how the former is starting to bleed into the latter.

The Good

The system increasingly feels natural. The way it handles merging, lane changes, and intersections feels less robotic and more like a human driver.

The new camera-based driver monitoring system is a massive upgrade over the steering-wheel torque sensor Tesla used for years. I had only one issue with it: it kept alerting me to pay attention to the road even though I was doing exactly that, and it eventually shut FSD down for the rest of the drive because of it.

But that has happened only once in the few weeks I’ve been using the latest update.

For the first time, I can get good chunks of city driving without any intervention or disengagement. It’s still far from perfect, but there’s a notable improvement.

It stopped to let pedestrians cross the street, handled roundabouts fairly well, and drove at more natural speeds on country roads (most of the time).

The system is getting good to the point that it can induce some dangerous complacency. More on that later.

As I have been saying for years, if Tesla were developing this technology in a vacuum and not selling it to the public as “about to become unsupervised self-driving,” most people would be impressed by it.

The Bad

Over those ~200 miles, I had five disengagements, including a few that were getting truly dangerous. It was seemingly about to run a red light once and a stop sign another time.

I say seemingly because it’s sometimes hard to tell, since FSD now often approaches intersections with stop signs and red lights more aggressively.

It used to drive closer to the way I’ve always driven my EVs: decelerating slowly with regenerative braking when approaching a stop. But this latest FSD update often carries a higher speed into those intersections and then brakes more aggressively, often using the mechanical brakes.

It’s strange behavior that I don’t like, but I’m at least starting to get a feel for it, which is why I’m somewhat confident that FSD would have blown through that red light and stop sign on those two occasions.

Another disengagement appeared to be due to sun glare in the front cameras. I’m getting more of that at this time of year, as I more often drive during sunset, which comes earlier in the day.

It appears to be a real problem with Tesla’s current FSD configuration.

On top of the disengagements, I had countless interventions. Interventions are when the driver has to input a command without it being enough to disengage FSD. Most of mine came from having to activate the turn signal to tell the system to move back into the right lane after passing.

FSD only goes back into the right lane after passing if there’s a car coming close behind you in the left lane.

I shared this finding on X and was disappointed by the response I got. I suspect it could be due to American drivers making up a large share of the training data. No offense, as this is an issue everywhere, but on average, American drivers tend not to respect the guideline (and, in some places, the law) that the left lane is for passing only.

I feel like this could be an easy fix or, at the very least, an option to add to the system for those who want to remain good drivers even when FSD is active.

I also had an intervention where I had to press the accelerator pedal to tell FSD to turn left on a flashing green light, which it was hesitant to do even as I held up traffic behind me.

Electrek’s Take

The scariest part for me is that FSD is getting good. If I put someone with no FSD experience in the car for a short 10-15 mile drive, there’s a good chance there will be no intervention, and they’ll come out really impressed.

It is the same with a regular Tesla driver who consistently gets good FSD experiences.

This can build complacency and lead drivers to pay less attention.

Fortunately, the new driver monitoring system can greatly help with that since it tracks driver attention, unlike Tesla’s previous system. However, it only takes a second of not paying attention to get into an accident, and the system allows you that second of inattention.

Furthermore, the system is getting so good at handling intersections that even if you are paying attention, you might end up blowing through a red light or stop sign, as I mentioned above. You might feel confident that FSD is going to stop, so you let it go even though it doesn’t start braking as soon as you would like, and before you know it, it doesn’t brake at all.

There’s a four-way stop near my place on the south shore of Montreal that I’ve driven through many times with FSD without issue, and yet FSD v12.5.4 was seemingly about to blow right past it the other day.

Again, it’s possible that it was just braking late, but it was way too late for me to feel comfortable.

Also, while it is getting better, and at a more noticeable pace lately, the crowdsourced data, which is the only data available since Tesla refuses to release any, points to FSD still being years away from being capable of unsupervised self-driving:

Tesla would need roughly a 1,000x improvement in miles between disengagements.
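To make the scale of that gap concrete, here is a minimal sketch of the arithmetic. The two mileage figures below are illustrative assumptions, not numbers from the crowdsourced tracker or from Tesla; only the ratio is the point.

```python
# Back-of-envelope for the "1,000x" figure: the gap is just the ratio of a
# target miles-between-critical-disengagements to today's observed number.
# Both inputs are illustrative assumptions, not published figures.

observed_miles_per_disengagement = 200       # assumed current crowdsourced figure
target_miles_per_disengagement = 200_000     # assumed bar for unsupervised driving

improvement_needed = target_miles_per_disengagement / observed_miles_per_disengagement
print(f"Required improvement: {improvement_needed:,.0f}x")  # -> 1,000x
```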

I’ve lost a lot of faith in Tesla getting there because of things like the company’s recent claim that it completed its September goals for FSD, which included a “3x improvement in miles between critical disengagement,” without offering any evidence that this happened.

In fact, the crowdsourced data shows a regression on that front between v12.3 and v12.5.

I fear that Elon Musk’s attitude and repeated claims that FSD is incredible, combined with the fact that it actually is getting better and his minions are raving about it, could lead to dangerous complacency.

Let’s be honest. Accidents with FSD are inevitable, but I think Tesla could do more to reduce the risk – mainly by being more realistic about what it is accomplishing here.

It is developing a really impressive vision-based ADAS system, but it is nowhere near on the verge of becoming unsupervised self-driving.

This 75 MPH electric car with bicycle pedals to charge it is apparently the real deal


I know, it sounds too crazy to be true, but the Vigoz by French company Cixi is an honest-to-goodness pedalable vehicle with a top speed of up to 120 km/h (75 mph). And it looks pretty slick, too.

I’m not sure if it’s technically an electric “car” since it only has three wheels, but it’s definitely an electric vehicle. I don’t think we can quite call it an electric bike (or e-trike) either when it’s fully enclosed and looks more like a modern take on a classic Fiat than something you’d see cruising the Paris bike lanes.

But despite the automotive-like exterior, the interior definitely gives you bike mashup vibes. There’s something of a recumbent-style seat that allows riders to lean back while working the bicycle pedals. Yes, that’s right. Bicycle pedals.

The pedals aren’t a direct drive setup, but rather seem to run through a series hybrid generator – something that has become more common on larger cargo e-bikes in the last few years.


Known as the “Pedaling Energy Recovery System”, the human-integrated drivetrain is said to “convert human power into bicycle propulsion through electricity, enabling the rider to intuitively control speed and braking by pedaling,” according to New Atlas. That sounds nice in theory, but I think when I surround myself with glass and air conditioning, any ‘intuition’ about bicycle controls goes out the retractable windows.

Riders can also theoretically charge the battery by pedaling, but that’s probably about as effective as filling a swimming pool with a shot glass.
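For a rough sense of why, here is a back-of-envelope sketch. The battery capacity, sustained pedal power, and generator efficiency are assumptions for illustration; Cixi hasn’t published these figures.

```python
# Back-of-envelope: hours of pedaling needed to refill the pack from empty.
# All figures below are assumptions for illustration, not published Cixi specs.

battery_capacity_kwh = 12.0     # assumed pack size
pedal_power_w = 150.0           # sustained output of a reasonably fit rider
generator_efficiency = 0.8      # assumed pedal-to-battery conversion efficiency

charge_rate_kw = pedal_power_w * generator_efficiency / 1000.0
hours_to_full = battery_capacity_kwh / charge_rate_kw
print(f"~{hours_to_full:.0f} hours of continuous pedaling for a full charge")  # ~100 hours
```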

The Vigoz seems positioned for real utility use, though, and not just as a quirky alternative vehicle. There’s room for a passenger seated in tandem behind the driver, or enough space for a decent amount of cargo. The pass-through design can apparently also be used to carry skis.

And for safety, the company claims that the frame is built with crumple zones that should help reduce the impact forces on occupants in the event of a collision.

So far, the company says its prototype has reached speeds topping 100 km/h (62 mph), but the drivetrain’s final maximum speed is intended to be 120 km/h (75 mph).

It’s not clear exactly when the vehicle will be available to the public, or what unique regulatory hurdles it could face on its path to homologation. The fact that Cixi hasn’t listed a price, or even opened a reservation list for future customers, suggests that we could be looking at these prototypes for a while longer.

Sam Altman on OpenAI’s $850 billion in planned buildouts: ‘People are worried. I totally get that’


Sam Altman, chief executive officer of OpenAI Inc., during a media tour of the Stargate AI data center in Abilene, Texas, US, on Tuesday, Sept. 23, 2025.

Kyle Grillot | Bloomberg | Getty Images

ABILENE, Texas — Sam Altman stood on a patch of hot Texas dirt, the kind that turns to dust storms on dry days and mud slicks after a sudden rain. Behind him stretched the outlines of what will soon be a massive data center complex in the west-central part of the state, where heavy wind often meets extreme heat.

It was a fitting backdrop for the OpenAI CEO to unveil what he calls the largest infrastructure push of the modern internet era: a 17-gigawatt buildout in partnership with Oracle, Nvidia, and SoftBank.

In less than 48 hours, OpenAI announced commitments equal to the output of 17 nuclear plants, or about nine Hoover Dams. The plan will require the amount of electricity needed to power more than 13 million U.S. homes.
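As a rough sanity check on the homes figure, the conversion only needs an assumed average household draw; the ~1.25 kW value below (roughly 11,000 kWh per year) is an assumption, not a number from OpenAI or CNBC.

```python
# Rough sanity check: 17 GW expressed as "average U.S. homes powered".
# The household draw is an assumption (~11,000 kWh/year, about 1.25 kW continuous).

buildout_gw = 17.0
avg_household_draw_kw = 1.25

homes_powered = buildout_gw * 1_000_000 / avg_household_draw_kw
print(f"~{homes_powered / 1_000_000:.1f} million homes")  # ~13.6 million
```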

The scale is staggering, even for a company that’s raised a record amount of private market cash and seen its valuation swell to $500 billion. At roughly $50 billion per site, OpenAI’s projects add up to about $850 billion in spending, nearly half of the $2 trillion global AI infrastructure surge HSBC now forecasts.

Altman understands the concern. But he rejects the idea that the spending spree is overkill.

“People are worried. I totally get that. I think that’s a very natural thing,” Altman told CNBC on Tuesday from the site of the first of its mega data centers in Abilene. “We are growing faster than any business I’ve ever heard of before.”

Altman insisted that the building boom is in response to soaring demand, highlighting the tenfold jump in ChatGPT usage over the past 18 months. He said a network of supercomputing facilities is what’s required to maximize the capabilities of AI.


“This is what it takes to deliver AI,” Altman said. “Unlike previous technological revolutions or previous versions of the internet, there’s so much infrastructure that’s required, and this is a small sample of it.”

The biggest bottleneck for AI isn’t money or chips — it’s electricity. Altman has put money into nuclear companies because he sees their steady, concentrated output as one of the only energy sources strong enough to meet AI’s enormous demand.

Altman led a $500 million funding round into fusion firm Helion Energy to build a demonstration reactor, and backed Oklo, a fission company he took public last year through his own SPAC. 

Critics warn of a bubble, pointing to how companies like Nvidia, Oracle, Broadcom and Microsoft have each added hundreds of billions of dollars in market value on the back of tie-ups with OpenAI, which is burning cash. Nvidia and Microsoft are now worth a combined $8.1 trillion, or equal to about 13.5% of the S&P 500.

Skeptics also say the system looks like a circular financing model. OpenAI is committing hundreds of billions of dollars to projects that rely on partners like Nvidia, Oracle, and SoftBank. Those companies are simultaneously investing in the same projects and then getting paid back through chip sales and data center leases.

Sarah Friar, OpenAI’s chief financial officer, has a different perspective, arguing that the entire ecosystem is banding together to meet a historic surge in compute needs. Big tech booms, Friar noted, have always required this kind of bold, coordinated infrastructure buildout.

Altman added that such cycles of overinvesting and underinvesting have marked every past technological revolution. Some people, he said, will surely feel the pain.

“People will get burned on overinvesting and people also get burned on underinvesting and not having enough capacity,” he said. “Smart people will get overexcited, and people will lose a lot of money. People will make a lot of money. But I am confident that long term, the value of this technology is going to be gigantic to society.”

‘More and more demand’

OpenAI’s partners are betting big on that future. Oracle is even reshaping its leadership around it. On Monday, the company promoted Clay Magouyrk and Mike Sicilia to CEO roles, replacing Safra Catz. Magouyrk ran cloud infrastructure and Sicilia was president of Oracle Industries.

“When you think about why make a transition now, it’s really around Oracle’s being set up for success,” Magouyrk told CNBC. “I only see more and more demand from the end users … what looks like near infinite demand for technology.”

Nvidia is fronting equity alongside its chips, including the new Vera Rubin accelerators meant to power the next wave of AI workloads. The Abilene facility is being leased by Oracle.

“Folks like Oracle are putting their balance sheets to work to create these incredible data centers you see behind us,” Friar said in an interview on site.

She explained that OpenAI will pay operating expenses for the data centers when they’re online, while Nvidia’s investments are getting the project up and running.

“But importantly, they will get paid for all those chips as those chips get deployed,” Friar said, referring to the arrangement with Nvidia.


Friar, who previously helped take Block public as CFO and then guided Nextdoor to the public market as CEO, pointed to the balancing act between equity, debt and operating expenses. She said that the facilities breaking ground now are aimed at bringing new capacity online next year.

“But then it’s about what gets built for 2027, 2028, and 2029,” she said. “What we see today is a massive compute crunch. There’s not enough compute to do all the things that AI can do, and so we need to get it started — and we need to do it as a full ecosystem.”

As for OpenAI’s long-term relationship with Microsoft, “They’re a major partner,” Friar said, adding that the company will continue to be a key supplier of compute capacity.

She hinted that more developments are on the way with Microsoft, and that she’s “pleased that we are where we are, but not fully ready to announce everything yet.”

In Friar’s current role, the numbers are much bigger than they ever were at the two companies she took public. Eventually OpenAI investors will expect returns on their hefty investments, but Altman said that the question of an IPO is “complicated.”

“I assume that someday we will be a public company,” he told CNBC. “I have mixed feelings about it … for now, we’re certainly able to raise a lot of capital in private markets.”

He said that being public could make long-term investments harder, given the need to meet Wall Street’s expectations on a quarterly basis. But it would open up access to a broader base of investors, he said.

“I think that the world should, if people want to, own shares in OpenAI. I think that’s awesome, and I want that to happen,” Altman said.

In the near term, the story is about many billions of dollars plowed into chips and data centers in places like Abilene, and eventually in New Mexico, Ohio and elsewhere.

But OpenAI isn’t just about infrastructure. In May, the company made the stunning announcement that it had acquired Jony Ive’s nascent devices startup for about $6.4 billion. Bringing in the designer of the iPhone and the rest of Apple’s most popular products wasn’t an accident.

While in Texas, Altman hinted at hardware that could reshape how people use computers in their everyday lives.

The OpenAI CEO said computers have never before been able to truly “understand and think,” and that breakthrough creates the chance to invent an entirely new way of using them.

He cautioned that it will take time before OpenAI has anything ready to ship. Even when it gets there, the company plans to release only a “small family of devices,” he said. But the potential, Altman said, is “something big” and worth pursuing.

WATCH: OpenAI CFO: Need partners like Oracle and Microsoft to meet demand


OpenAI’s first data center in $500B Stargate project is open in Texas, with sites coming in New Mexico and Ohio



ABILENE, Texas — OpenAI and Oracle are betting big on America’s AI future, bringing online the flagship site of the $500 billion Stargate program, a sweeping infrastructure push to secure the compute needed to power the future of artificial intelligence.

The debut site in Abilene, Texas, about 180 miles west of Dallas, is up and running, filled with Oracle Cloud infrastructure and racks of Nvidia chips.

The data center, which is being leased by Oracle, is one of the most notable physical landmarks to emerge from an unprecedented boom in demand for infrastructure to power AI. Over $2 trillion in AI infrastructure has been planned around the world, according to an HSBC estimate this week.

OpenAI is leading the way.

In addition to the $500 billion Stargate project, the startup on Monday announced an equity investment deal with Nvidia that will add an estimated $500 billion worth of data centers in the coming years. Since 2019, Microsoft has invested billions of dollars in OpenAI, providing it with access to large amounts of Azure compute credits. OpenAI also contracts with smaller cloud companies for extra compute capacity and help operating its infrastructure.

One building on the Abilene site is operational while another is nearly complete. The campus has the potential to ultimately scale past a gigawatt of capacity, OpenAI finance chief Sarah Friar told CNBC. That would be enough electricity to power about 750,000 U.S. homes.

The data center construction plans are important enough that Nvidia CEO Jensen Huang personally engaged in last-minute negotiations with OpenAI CEO Sam Altman over the weekend to get in on the action, CNBC reported earlier on Tuesday.

“People are starting to recognize just the sheer scale that will be required,” Friar said. “We’re just getting going here in Abilene, Texas, but you’ll see this all around the United States and beyond.”

The scale of the project’s construction was necessary to supply the amount of compute required to operate OpenAI’s models, Friar said.

“What we see today is a massive compute crunch,” she said. “There’s not enough compute to do all the things that AI can do.”


A bold bet on AI infrastructure

OpenAI, Oracle and SoftBank, which is helping fund the project, on Tuesday announced five additional Stargate sites across Texas, New Mexico and Ohio, plus an unnamed site in the Midwest. That brings the size of the initiative to nearly 7 gigawatts and more than $400 billion of investment over the next three years, which includes an existing $300 billion agreement between OpenAI and Oracle.

While companies like Oracle are helping fund the data center construction, OpenAI will ultimately be the one to pay for the computing capacity as an operating expense, Friar said. Although Nvidia is putting in equity to jumpstart the project, Friar said the chipmaker will get paid for all graphics processing units (GPUs) that it provides as those chips get deployed.

Friar said OpenAI will generate $13 billion in revenue this year, and that the company plans to help pay for the construction using its own cash flow and debt financing.

The Stargate name will refer to all OpenAI infrastructure projects going forward, CNBC reported this week. Together with CoreWeave and other partners, the companies say they are ahead of schedule to meet their full 10-gigawatt commitment by the end of 2025.

Friar told CNBC the shovels going into the ground today are laying foundations for compute that won’t come online until 2026, starting with Nvidia’s next-generation Vera Rubin chips.

Data center buildings are under construction during a tour of the OpenAI data center in Abilene, Texas, U.S., Sept. 23, 2025.

Shelby Tauber | Reuters

“No one in the history of man built data centers this fast,” Friar said, adding that the entire ecosystem has to work together to meet demand.

Critics have questioned the circular funding behind Stargate — OpenAI committing hundreds of billions of dollars to projects while suppliers like Nvidia are also investing directly into those same buildouts.

Friar said history shows that technology booms require bold infrastructure bets.

“When the internet was getting started, people kept feeling like, ‘Oh, we’re over-building, there’s too much,'” Friar said. “Look where we are today, right?”

The project also carries political weight. OpenAI and Oracle first unveiled Stargate alongside President Donald Trump at the White House in January. Friar called Trump “the president of this AI era,” pointing to Washington’s role in framing the technology as both an economic engine and a national security priority. Trump was briefed on the Nvidia investment into OpenAI during a state visit to the U.K. earlier this month.

Oracle says the project will employ more than 6,000 construction workers daily and deliver nearly 1,700 long-term jobs.

In a paper published Tuesday about OpenAI’s infrastructure plans, the company wrote that its data center buildout could help reshape the American power grid with new technologies and help the U.S. exert global influence.

— CNBC’s Kif Leswing contributed to this story.

WATCH: OpenAI CFO: Need partners like Oracle and Microsoft to meet demand
