
A Tesla vehicle passes the Wilkie D. Ferguson Jr. U.S. Courthouse as jury selection began in connection with allegations regarding the safety of Tesla’s autopilot system on July 14, 2025 in Miami, Florida.

Joe Raedle | Getty Images

Tesla is facing a crucial verdict in a personal injury trial over a fatal Autopilot crash in 2019, the first time Elon Musk’s automaker has been in front of a jury on such a matter in federal court.

Attorneys for the plaintiffs on Thursday asked the jury to award damages of around $345 million. That includes $109 million in compensatory damages and $236 million in punitive damages. The trial in the Southern District of Florida started on July 14.

The suit centers on who shoulders the blame for a deadly crash that occurred in 2019 in Key Largo, Florida. A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.

While driving, McGee dropped the mobile phone he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. He accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Naibel Benavides, who was 22, died at the scene from injuries sustained in the crash. Her body was discovered about 75 feet from the point of impact. Her boyfriend, Dillon Angulo, survived but suffered multiple broken bones, a traumatic brain injury and psychological effects.

The plaintiffs include Benavides’ surviving family members and Angulo, who testified at the trial. Angulo is seeking compensation for his medical expenses and pain and suffering, while Benavides’ estate is suing for wrongful death and pain and suffering, and is seeking punitive damages.

Lawyers representing the plaintiffs argued that Tesla’s partially automated driving systems, marketed as Autopilot at the time, had dangerous defects, which should have been known and fixed by the company, and that use of Autopilot should have been limited to roads where it could perform safely.

They also argued that Musk and Tesla made false statements to customers, shareholders and the public, overstating the safety benefits and capabilities of Autopilot, which encouraged drivers to overly rely on it.

In opening arguments and throughout the trial, the plaintiffs’ attorneys and expert witnesses cited a litany of Musk’s past promises about Autopilot and Tesla’s autonomous vehicle technology.

Tesla attorneys countered in court that the company had communicated directly with customers about how to use Autopilot and other features, and that McGee’s driving was to blame for the collision. They said in closing arguments that Tesla works to develop technology to save drivers’ lives, and that a ruling against the EV maker would send the wrong message.

The Benavides family had previously sued McGee and settled with him. McGee was charged in October 2019 with careless driving and didn’t contest the charges.

While Tesla has typically been able to settle cases or move Autopilot-related suits into arbitration and out of the public eye, Judge Beth Bloom of the federal court in Miami wrote in an early July order that the case could move ahead to trial.

“A reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximizing profit,” she wrote in that order.

For closing arguments on Thursday, the Benavides family and Angulo were in the courtroom. They looked away from screens anytime a video or picture of the scene of the crash was displayed.

NBC News’ Maria Piñero reported from Miami.

WATCH: Calling Tesla a car company is overrated, says Fitz-Gerald Group’s Keith Fitz-Gerald


Hands-on with the Meta Ray-Ban Display glasses


Mark Zuckerberg, chief executive officer of Meta Platforms Inc., wears a pair of Meta Ray-Ban Display AI glasses during the Meta Connect event in Menlo Park, California, US, on Wednesday, Sept. 17, 2025.

David Paul Morris | Bloomberg | Getty Images

When it comes to the new $799 Meta Ray-Ban Display glasses, it’s the device’s accompanying fuzzy, gray wristband that truly dazzles.

I was able to try out Meta’s next-generation smart glasses that the social media company announced Wednesday at its annual Connect event. These are the first glasses that Meta sells to consumers with a built-in display, marking an important step for the company as it works toward CEO Mark Zuckerberg’s vision of having headsets and glasses overtake smartphones as people’s preferred form of computing.

The display on the new glasses, though, is still quite simplistic. Last year at Connect, Meta unveiled its Orion glasses, which are a prototype capable of overlaying complex 3D visuals onto the physical world. Those glasses were thick, required a computing puck and were built for demo purposes only.

The Meta Ray-Ban Display, however, is going on sale to the public, starting in the U.S. on Sept. 30.

Though the new glasses include just a small digital display in their right lens, that screen enables unique visual functions, like reading messages, seeing photo previews and reading live captions while having a conversation with someone.

Controlling the device requires putting on its EMG sensor wristband, which detects the electrical signals generated by the muscles in the wearer’s wrist so hand gestures can control the glasses. Putting it on was just like strapping on a watch, except for the small electric jolt I felt when it activated. It wasn’t as much of a shock as you feel taking clothes out of the dryer, but it was noticeable.

Donning the new glasses was less shocking, until I had them on and saw the little display emerge, just below my right cheek. The display is like a miniaturized smartphone screen but translucent so as to not obscure real-world objects.

Despite the display’s high resolution, the icons weren’t always clear against my real-world field of view, and the letters sometimes appeared a bit murky. These visuals aren’t meant to wrap around your head in crystal-clear fidelity; they’re there for simple actions, like activating the glasses’ camera or glancing at the songs on Spotify. It’s more utility than entertainment.

The Meta Ray-Ban Display AI glasses with the Meta Neural Band wristband at Meta headquarters in Menlo Park, California, US, on Tuesday, Sept. 16, 2025.

David Paul Morris | Bloomberg | Getty Images

I had the most fun trying to perform hand gestures to navigate the display and open apps. By clenching my fist and swiping my thumb on the surface of my pointer finger, I was able to scroll through the apps like I was using a touchpad.

It took me several attempts at first to open the camera app by pinching my index finger and thumb together, and when the app wouldn’t activate I found myself pinching twice, mimicking the double-click of a computer mouse. But whereas using a mouse is second nature to me, I learned I have subpar pinching skills that lack the cadence and timing required to consistently open the app.

It was a bit strange and amusing to see people in front of me while I continuously pinched my fingers to interact with the screen. I felt like I was reenacting an infamous comedy scene from the TV show “The Kids in The Hall” in which a misanthrope watches people from afar while pinching his fingers and saying, “I’m crushing your head, I’m crushing your head!”

With the camera app finally open, the display showed what I was looking at, giving me a preview of how my photos and videos would turn out. It was like having my own personal picture-in-picture feature, the kind you’d see on a TV.

I found myself experiencing some cognitive dissonance at times as my eyes were constantly figuring out what to focus on due to the display always sitting just outside the center of my field of view. If you’ve ever taken a vision test that involves identifying when you see squiggly lines appearing in your periphery, you have a sense of what I was feeling.

Besides pinching, the Meta Ray-Ban Display glasses can also be controlled using the Meta AI voice assistant, just as users can with the device’s predecessors.

When I took a photo of some of the paintings decorating the demo room’s halls, I was told by support staff to ask Meta AI to explain to me what I was looking at. Presumably, Meta AI would have told me I was looking at various paintings from the Bauhaus art movement, but the digital assistant never activated correctly before I was escorted to another part of the demo.

I could see the Meta Ray-Ban Display’s live-captions feature being helpful in noisy situations, as it successfully picked up the voice of the demo’s tour guide while dance music from the Connect event blared in the background. When he said, “Let’s all head to the next room,” I saw his words appear in the display like closed captions on a TV show.

But ultimately, I was most drawn to the wristband, particularly when I listened to some music with the glasses via Spotify. By rotating my thumb and index finger as if I were turning an invisible stereo knob, I was able to adjust the volume, an unexpectedly delightful experience.

It was this neural wristband that really drilled into my brain how much cutting-edge technology has been crammed into the new Meta Ray-Ban Display glasses. And while the device’s high price may turn off consumers, the glasses are novel enough to potentially attract developers seeking more computing platforms to build apps for.

WATCH: Next important wearable tech will be glasses, says Meta’s chief product officer.



Navan, corporate travel and expense startup, files for initial public offering



Navan, the business travel, payments, and expense management startup, filed on Friday afternoon to go public.

Its S-1 filing with the Securities and Exchange Commission indicates that the company plans to list on the Nasdaq Global Select Market under the symbol “NAVN.”

Navan reported trailing 12-month revenue of $613 million (up 32%) across over 10,000 customers, and gross bookings of $7.6 billion (up 34%), according to the S-1 filing.

Goldman Sachs and Citigroup will act as lead book-running managers for the proposed offering.

Navan ranked No. 39 on this year’s CNBC Disruptor 50 list, and also made the 2024 list.

The IPO market has bounced back this year, with deal activity up 56% at 156 deals (roughly 200 IPO filings in all) and $30 billion in proceeds, up over 23% year over year, according to IPO tracker Renaissance Capital. It has been the best year for IPOs since 2021, though still far below the Covid-era offering boom, when IPOs raised over $142 billion in 2021 and $78 billion in 2020.

This year’s deal flow has been highlighted by hot AI names like CoreWeave, as well as some of the startup world’s most highly valued firms from the past decade, such as fintech Klarna, design firm Figma, and crypto companies Circle, Bullish and Gemini. Some long-awaited IPO candidates are also finally hitting the market, including StubHub this week, though its shares have slumped since their first day of trading. Top Amazon reseller Pattern went public on Friday.

Other startups are expected to pursue deals given the increased investor appetite.

The Renaissance IPO ETF is up 20% this year.

Launched by CEO Ariel Cohen and co-founder Ilan Twig in 2015, Navan set out to disrupt a business travel sector where incumbents relied on clunky legacy tools and fragmented workflows.

The Palo Alto-based company, formerly called TripActions, refers to itself as an “all-in-one super app” for corporate travel and expenses.

Customers include Unilever, Adobe, Christie’s, Blue Origin and Geico.

It has also been pushing further into AI: a virtual assistant named Ava handled approximately 50% of user interactions during the six months ended July 31, according to the filing, and a proprietary AI framework called Navan Cognition, along with proprietary cloud infrastructure, underpins its platform.

“We built Navan for the road warriors, for CEOs and CFOs who understand travel’s critical importance to their strategy, the finance teams who demand precision and control, the executive assistants juggling itineraries, and the program admins ensuring seamless events,” the co-founders wrote in an IPO filing letter.

“We saw firsthand the frustration of clunky, outdated systems. Travelers were forced to cobble together solutions, wait for hours on hold to book or change travel, and negotiate with travel agents. They struggled to adhere to company policies, with little visibility into those policies, and after all that, they spent even more time on tedious expense reports after a trip. We felt the pain of finance teams struggling to gain visibility into fragmented travel spending and to enforce policies, and the frustration of suppliers unable to connect directly with the high-value business travelers they sought to serve,” they wrote in the filing.

Revenue grew 33% year-over-year from $402 million in fiscal 2024 to $537 million in fiscal 2025, according to the S-1 filing. The company reported a net loss that decreased 45% year-over-year from $332 million in fiscal 2024 to $181 million in fiscal 2025. Gross margin improved from 60% in fiscal 2024 to 68% in fiscal 2025.

The business travel and expense space is crowded, with fellow Disruptor 50 companies Ramp and Brex, along with TravelPerk, as well as incumbents like SAP Concur and American Express Global Business Travel.



Microsoft raises Xbox prices in U.S. due to economic environment


A gamer plays soccer title Pro Evolution Soccer 2019 on an Xbox console.

Sezgin Pancar | Anadolu Agency via Getty Images

Microsoft said on Friday that it will increase the recommended retail price of several Xbox consoles in the U.S. starting in October because of “changes in the macroeconomic environment.”

The company said it would not increase prices for accessories such as controllers and headsets, and that prices in other countries would stay the same.

While Microsoft didn’t explicitly attribute the increase to the Trump administration’s tariffs, many consumer companies have been warning for months that higher prices are on the way. President Donald Trump has imposed tariffs this year on goods from multiple countries, with the stated goal of bringing more manufacturing to the U.S.

“We understand that these changes are challenging, and they were made with careful consideration,” Microsoft said on its website.

It’s the second time Microsoft has raised prices on its consoles in the U.S. this year. Rivals Sony and Nintendo have also raised console prices in the U.S. as Trump’s tariffs went into effect.

Here are the changes, according to a PDF posted on Microsoft’s website:

  • Xbox Series S will start at $399, up from $379 previously. A version with 1TB of storage costs $449.
  • Xbox Series X Digital console now costs $599, a $50 increase. The Xbox Series X with a disc drive also got a $50 increase to $649.
  • The most expensive version, with 2TB of storage, costs $799, up from $729.

WATCH: Federal Reserve Governor Stephen Miran: I don’t see any material inflation from tariffs
