At its annual Go-Get showcase in New York City on Wednesday, Uber announced several product updates aimed at helping customers save money on rides and food.

The product updates reflect Uber’s continued push to drive growth and demand across its mobility and delivery business segments. The new features could help the company attract more riders and users to its app.

Here are the key new offerings the company announced:

Uber Shuttle

Riders looking for a more affordable way to get to the airport, work and live events, such as sports games and concerts, can reserve seats on an Uber Shuttle.

Uber has partnered with local shuttle services that will pick up riders and bring them to their destination. Uber said the shuttle services employ commercially licensed drivers, and users can tip and rate them directly within the Uber app.

The shuttles will have between 14 and 55 seats. Users can reserve up to five seats as early as seven days in advance, and they’ll receive a QR code ticket. Riders can track their shuttle’s location within 25 minutes of departure time.

The company said the shuttle will be a “fraction of the price” of a ride with UberX. The trip won’t be impacted by surge pricing.

Uber will start to roll out the feature at Miami’s Hard Rock Stadium and at select concert venues in Chicago, Pittsburgh and Charlotte, N.C., this summer. Uber said it will expand the offering in the future.

Uber Caregiver

Users who rely on caregivers for support in their day-to-day lives can add them directly to their profiles starting this summer. This will allow caregivers to book rides for people they care for and order medical supplies and groceries on their behalf.

The feature will also allow for three-way chats between drivers, riders and caregivers.

Uber said the user’s insurance benefits can be applied when applicable to help minimize out-of-pocket costs. Uber Caregiver will initially support Medicaid recipients, customers who are 65 and older with Medicare Advantage, and customers with commercial insurance from their employers.

Caregivers can sign up to be notified when other insurance providers are added.

Costco on Uber Eats

Uber said Costco will be available as an on-demand option within Uber Eats in select locations across the U.S. starting Wednesday.

Users can order products from Costco even if they are not members, but Uber said members will save between 15% and 20% compared with nonmembers.

Costco members can enter their member numbers in the Uber Eats app and are eligible for 20% off Uber One, the company’s subscription membership.

Schedule UberX Share

Uber said it is launching a new feature on Wednesday that lets users schedule a shared ride in advance. The feature will save users around 25% on average compared with a typical ride on UberX, the company said. Riders can schedule their trips as little as 10 minutes in advance.

Scheduled UberX Share rides are initially launching in cities with some of the highest return-to-office rates. This includes New York, San Francisco, Los Angeles, Chicago, Atlanta and San Diego. Uber said more locations will follow.

Uber One for Students

Uber will offer its Uber One membership program at a discount for college students. The program normally costs $9.99 per month, but it will be available to students at $4.99 a month.

The company said students will also get access to free items and special deals, such as daily discounts on their orders from Taco Bell, Domino’s and Starbucks.

The Uber One Student Plan is launching in the U.S. in May. It will roll out in Canada, New Zealand, Mexico and Australia in July, as well as in Japan and France in September.

Uber Eats Lists

Uber is introducing a new feature called “Lists” to Uber Eats that allows users to curate and share lists of restaurants and go-to spots. The company shared examples like “date night desserts” and “toddler-approved” meals.

Technology

Tesla car sales in China fall 11.5% as competition intensifies

A Tesla showroom with its logo and electric vehicles on display, including the Model 3 and Model Y, is seen on January 12, 2025, in Chongqing, China. 

Cheng Xin | Getty Images

Sales of Tesla’s cars in China fell in January, as competition from domestic rivals continued to heat up.

Tesla sold 63,238 units of its electric cars in January, down 11.5% from the 71,447 cars sold in the same month last year.

Shares of Tesla were down about 1.5% in premarket trading.

Chinese rival BYD meanwhile sold 296,446 pure electric and plug-in hybrid vehicles last month, up 47% year-on-year.

Other Chinese rivals of Tesla, including Changan Automobile and Xpeng, also posted growth in sales.

Tesla has leaned on price cuts to retain Chinese buyers’ interest in its cars. Late last year, it slashed the price of the Model Y and extended a zero-interest, five-year loan plan until the end of January.

Last month, the U.S. automaker also announced a revamped version of the Model Y, one of its best-selling vehicles, in China. The updated model also comes with a 0% interest plan.

Tesla has not introduced a new model since the Cybertruck, which starts at nearly $80,000 and began deliveries in late 2023. Investors have been yearning for a new mass-market model from the company to reinvigorate sales.

Tesla has said a new affordable model could be launched in the first half of 2025.

Meanwhile, the automaker is pushing to launch its driver-assistance system, which it markets as “Full Self-Driving,” in China this year, as rivals roll out similar features.

Technology

DeepSeek has rattled large AI players — but smaller chip firms see it as a force multiplier

Dado Ruvic | Reuters

DeepSeek has rattled the U.S.-led AI ecosystem with its latest model, shaving hundreds of billions of dollars off chip leader Nvidia’s market cap. While the sector leaders grapple with the fallout, smaller AI companies see an opportunity to scale with the Chinese startup.

Several AI-related firms told CNBC that DeepSeek’s emergence is a “massive” opportunity for them, rather than a threat. 

“Developers are very keen to replace OpenAI’s expensive and closed models with open source models like DeepSeek R1…” said Andrew Feldman, CEO of artificial intelligence chip startup Cerebras Systems.

The company competes with Nvidia’s graphics processing units and offers cloud-based services through its own computing clusters. Feldman said the release of the R1 model generated one of Cerebras’ largest-ever spikes in demand for its services.

“R1 shows that [AI market] growth will not be dominated by a single company — hardware and software moats do not exist for open-source models,” Feldman added. 

Open source refers to software in which the source code is made freely available on the web for possible modification and redistribution. DeepSeek’s models are open source, unlike those of competitors such as OpenAI.

DeepSeek also claims its R1 reasoning model rivals the best American tech, despite running at lower costs and being trained without cutting-edge graphics processing units, though industry watchers and competitors have questioned these assertions.

“Like in the PC and internet markets, falling prices help fuel global adoption. The AI market is on a similar secular growth path,” Feldman said. 

Inference chips 

DeepSeek could increase the adoption of new chip technologies by accelerating the AI cycle from the training to “inference” phase, chip start-ups and industry experts said.

Inference refers to the act of using and applying AI to make predictions or decisions based on new information, rather than the building or training of the model.

“To put it simply, AI training is about building a tool, or algorithm, while inference is about actually deploying this tool for use in real applications,” said Phelix Lee, an equity analyst at Morningstar, with a focus on semiconductors.  
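To make that distinction concrete, here is a minimal, hypothetical sketch in Python; the tiny scikit-learn model is purely illustrative and is not tied to any chip, vendor or workload mentioned in this article. Training fits the model once on existing data, while inference repeatedly applies the fitted model to new inputs, which is the workload the startups above are targeting.

```python
# Hypothetical illustration of training vs. inference; not drawn from
# any company's actual workload or hardware discussed in this article.
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Training: the compute-heavy, "build the tool" phase, done once ---
X_train = np.random.rand(1_000, 8)                 # historical inputs
y_train = (X_train.sum(axis=1) > 4).astype(int)    # labels for those inputs
model = LogisticRegression().fit(X_train, y_train)

# --- Inference: lighter per request, but runs constantly in production ---
X_new = np.random.rand(3, 8)        # new, unseen inputs arriving at serve time
predictions = model.predict(X_new)  # "deploying the tool for use"
print(predictions)
```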

While Nvidia holds a dominant position in GPUs used for AI training, many competitors see room for expansion in the “inference” segment, where they promise higher efficiency for lower costs.

AI training is very compute-intensive, but inference can work with less powerful chips that are programmed to perform a narrower range of tasks, Lee added.

A number of AI chip startups told CNBC that they were seeing more demand for inference chips and computing as clients adopt and build on DeepSeek’s open source model. 

“[DeepSeek] has demonstrated that smaller open models can be trained to be as capable or more capable than larger proprietary models and this can be done at a fraction of the cost,” said Sid Sheth, CEO of AI chip start-up d-Matrix. 

“With the broad availability of small capable models, they have catalyzed the age of inference,” he told CNBC, adding that the company has recently seen a surge in interest from global customers looking to speed up their inference plans. 

Robert Wachen, co-founder and COO of AI chipmaker Etched, said dozens of companies have reached out to the startup since DeepSeek released its reasoning models.

“Companies are [now] shifting their spend from training clusters to inference clusters,” he said. 

“DeepSeek-R1 proved that inference-time compute is now the [state-of-the-art] approach for every major model vendor and thinking isn’t cheap – we’ll only need more and more compute capacity to scale these models for millions of users.”

Jevons Paradox

Analysts and industry experts agree that DeepSeek’s accomplishments are a boost for AI inference and the wider AI chip industry. 

“DeepSeek’s performance appears to be based on a series of engineering innovations that significantly reduce inference costs while also improving training cost,” according to a report from Bain & Company.

“In a bullish scenario, ongoing efficiency improvements would lead to cheaper inference, spurring greater AI adoption,” it added. 

This pattern illustrates the Jevons Paradox, a theory in which cost reductions in a new technology drive increased demand for it.

Financial services and investment firm Wedbush said in a research note last week that it continues to expect the use of AI across enterprise and retail consumers globally to drive demand.

Speaking to CNBC’s “Fast Money” last week, Sunny Madra, COO at Groq, which develops chips for AI inference, suggested that as the overall demand for AI grows, smaller players will have more room to grow.

“As the world is going to need more tokens [a unit of data that an AI model processes] Nvidia can’t supply enough chips to everyone, so it gives opportunities for us to sell into the market even more aggressively,” Madra said.

Technology

Amazon plans to spend $100 billion this year to capture ‘once in a lifetime opportunity’ in AI

Amazon CEO Andy Jassy speaks during a keynote address at AWS re:Invent 2024, a conference hosted by Amazon Web Services, at The Venetian Las Vegas on December 3, 2024 in Las Vegas, Nevada.

Noah Berger | Getty Images Entertainment | Getty Images

Amazon said Thursday it plans to boost its capital expenditures to $100 billion in 2025, as it continues its investments in artificial intelligence.

The capex figure exceeds last year’s spending of roughly $83 billion. Amazon CEO Andy Jassy had predicted in October that the company’s 2025 capex would surpass last year’s figure, primarily driven by growth in generative AI.

“We spent $26.3 billion in capex in Q4, and I think that is reasonably representative of what you expect an annualized capex rate in 2025,” Jassy said on a call with investors after the company released its fourth-quarter earnings report. “The vast majority of that capex spend is on AI for AWS.”

Amazon has been rushing to invest in data centers, networking gear and hardware to meet vast demand for generative AI, which has exploded in popularity since OpenAI released its ChatGPT assistant in late 2022. Amazon has introduced a flurry of AI products, including its own set of Nova models, Trainium chips, a shopping chatbot, and a marketplace for third-party models called Bedrock.

Other tech companies are also spending big on AI. Google parent Alphabet said Tuesday it expects to invest about $75 billion in capital expenditures this year. Last month, Microsoft said it planned to spend $80 billion in fiscal 2025 on the buildout of data centers to support AI workloads. Meta said it will spend as much as $65 billion on capital expenditures as it works to construct more data center and computing infrastructure.

Amazon gave an update on its spending plans after reporting mixed results for the fourth quarter. The company projected weaker-than-expected sales for the current period, which overshadowed a beat on the top and bottom lines in the fourth quarter. Shares fell more than 4% in extended trading.

Jassy tried to reassure investors on the call that the jump in spending will be worthwhile, calling it a “once-in-a-lifetime type of business opportunity.”

“I think that both our business, our customers and shareholders will be happy, medium to long-term, that we’re pursuing the capital opportunity and the business opportunity in AI,” Jassy said. “We also have capex that we’re spending this year in our stores business, really with an aim towards trying to continue to improve the delivery speed and our cost to serve.”

Tech companies are facing fresh skepticism of their AI spending plans after the early success of Chinese AI startup DeepSeek. The lab claims it only took two months and less than $6 million to develop its R1 model, which it says rivals OpenAI’s o1. Markets were roiled by the launch last week, with chipmakers Nvidia and Broadcom losing a combined $800 billion in market cap.
