Brian Chesky, co-founder and chief executive officer of Airbnb Inc., during a news conference in Los Angeles, California, US, on Wednesday, May 1, 2024.
The stock debuted on the Nasdaq in December 2020, and its sharpest rally to this point came in February 2023. The shares are up 22% this year.
The online rental platform posted earnings of 73 cents per share on $2.48 billion in revenue, topping analyst estimates of 58 cents per share and $2.42 billion in revenue, according to LSEG. Revenue increased 12% from a year ago.
“Airbnb is a fundamentally stronger company today than it was several years ago,” the company said in a letter to shareholders. “We’re continuing to build on this momentum in 2025, executing a multi-year strategy to perfect the core service, accelerate growth in global markets, and launch and scale new offerings.”
The company also swung to a profit, reporting net income of $461 million, or 73 cents per share. In the year-ago quarter, Airbnb reported a loss of $349 million, or 55 cents a share. Adjusted profit totaled $765 million, reflecting 4% year-over-year growth.
Gross booking value, which measures host earnings, taxes, and service and cleaning fees, rose to $17.6 billion and topped a StreetAccount forecast of $17.2 billion. Airbnb also reported 111 million nights and experiences booked for the period, representing 12% year-over-year growth. That was above the 108.7 million StreetAccount estimate.
During an earnings call with investors, finance chief Ellie Mertz said Airbnb will invest $200 million to $250 million to scale new business opportunities it plans to announce in May.
“We want the Airbnb app — kind of similar to Amazon — to be one place to go for all of your traveling and living needs,” CEO Brian Chesky said on the call. He also said that each business the company plans to roll out could take three to five years to scale but should strengthen its core business.
“A great business could get to $1 billion of revenue,” he said. “And you should be able to expect one or a couple of businesses to launch every single year for the next five years.”
Despite the strong fourth-quarter results, Airbnb offered light guidance for the current quarter, projecting $2.23 billion to $2.27 billion in revenue. That trailed a $2.3 billion estimate from LSEG. The company said the first quarter of 2024 benefited from the timing of Easter and an extra day in February due to leap year.
Airbnb also commented on the wildfires that ravaged the Los Angeles area last month, saying that its nonprofit Airbnb.org housed over 19,000 people and 2,300 pets and has received $27 million in donations. That includes $18 million from the company's founders.
The Alibaba office building in Nanjing, Jiangsu province, China, on Aug 28, 2024.
Alibaba Cloud on Thursday launched the latest AI model in its Qwen series, as large language model competition in China continues to heat up following the “DeepSeek moment.”
The new “Qwen2.5-Omni-7B” is a multimodal model, which means it can process inputs, including text, images, audio and videos, while generating real-time text and natural speech responses, according to an announcement on Alibaba Cloud’s website.
The company says that the model can be deployed on edge devices like mobile phones, offering high efficiency without compromising performance.
“This unique combination makes it the perfect foundation for developing agile, cost-effective AI agents that deliver tangible value, especially intelligent voice applications,” Alibaba said.
For example, it could be used to help a visually impaired person navigate their environment through real-time audio description, the company added.
The new model is open-sourced on the platforms Hugging Face and GitHub, following a growing trend in China after DeepSeek made its breakthrough R1 model open-source.
Open-source generally refers to software whose source code is made freely available on the web for possible modification and redistribution. Alibaba Cloud says it has open-sourced more than 200 generative AI models in recent years.
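For developers, open-sourcing in this sense means the model weights themselves can be pulled straight from the hosting platform. Below is a minimal sketch of what that looks like with the Hugging Face Hub's Python client; the repository ID is an assumption based on the model's published name, and Alibaba Cloud's model card would carry the officially supported loading instructions.

```python
# Minimal sketch: fetching open-sourced model weights from Hugging Face.
# The repository ID is an assumption based on the model's published name;
# consult the model card for the officially supported way to load and run it.
from huggingface_hub import snapshot_download

MODEL_ID = "Qwen/Qwen2.5-Omni-7B"  # assumed repository name

# Downloads the model files (weights, config, tokenizer) to the local cache and
# returns the path. This is what "open-sourced" means in practice: anyone can
# fetch, inspect, fine-tune or redistribute the weights under the stated license.
local_path = snapshot_download(repo_id=MODEL_ID)
print(f"Model files downloaded to: {local_path}")
```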
Amid China’s AI fervor accelerated by DeepSeek, Alibaba and other generative AI competitors have been releasing new, cost-effective models and products at an unprecedented pace.
Last week, Chinese tech giant Baidu released a new multimodal foundation model and its first reasoning-focused model.
Alibaba, meanwhile, debuted its updated Qwen 2.5 artificial intelligence model in late January and released a new version of its AI assistant tool Quark earlier this month.
The company has strongly committed to its AI strategy, announcing last month a plan to invest $53 billion in its cloud computing and AI infrastructure over the next three years, exceeding what it spent in the space over the past decade.
Kai Wang, Asia senior equity analyst at Morningstar, told CNBC that large Chinese tech players such as Alibaba, which build data centers to meet the computing needs of AI in addition to building their own LLMs, are well positioned to benefit from China’s post-DeepSeek AI boom.
Alibaba secured a major win for its AI business last month when it confirmed that the company was partnering with Apple to roll out AI integration for iPhones sold in China.
On Wednesday, the group also reported an expanded strategic partnership with BMW to accelerate the integration of its AI into the carmaker’s next-generation intelligent vehicles.
With 250,000 highly sought-after Nvidia graphics processors, CoreWeave has become one of the most prominent “GPU clouds,” a status it hopes investors will value when it debuts on the public markets.
But the world of artificial intelligence hardware is moving so quickly that it raises questions about how long those chips will remain on the cutting edge and in demand. It’s a concern that could impact investor demand for shares of CoreWeave, one of the most anticipated IPOs in years.
CoreWeave, which rents out remote access to computers based on Nvidia AI chips, said in a financial filing this month that most of its AI chips are from Nvidia’s Hopper generation. Those chips, such as the H100, were state-of-the-art in 2023 and 2024. They were scarce as AI companies bought or rented all the chips they could get in the wake of OpenAI ushering in the generative AI age with the release of ChatGPT in late 2022.
But these days, Nvidia CEO Jensen Huang says that his company’s Hopper chips are getting blown out of the water by their successors – the Blackwell generation of GPUs, which have been shipping since late 2024. Hopper chips are “fine” for some circumstances but “not many,” Huang joked at Nvidia’s GTC conference last week.
“In a reasoning model, Blackwell is 40 times the performance of Hopper. Straight up. Pretty amazing,” Huang said. “I said before that when Blackwell starts shipping in volume, you couldn’t give Hoppers away.”
That’s great for Nvidia, which needs to find ways to keep selling chips to the companies committed to the AI race, but it’s bad news for GPU clouds like CoreWeave. That’s because the New Jersey company models the future trajectory of its business based on how much it anticipates being able to rent Nvidia chips out for over the next five to six years.
Huang may have been kidding, but Nvidia spent much of its event detailing just how much better its Blackwell chips are. In Nvidia’s view, the best way to decrease the high cost of serving AI is by buying faster chips.
Blackwell systems are in full production and shipping to customers, and Nvidia plans to introduce an upgraded version of Blackwell in late 2026. When new chips come out, the older chips — the kind CoreWeave has a quarter of a million of — go down in price, Huang said. So too does the price of renting them.
Older chips don’t just stop working when new ones come out. Most companies, including CoreWeave, plan to use Hopper chips for six years. But Nvidia is telling customers that its newer, faster chips are capable of producing more AI content, which leads to more revenues at a better margin for clouds.
An H100 would have to be priced 65% lower per hour than an Nvidia Blackwell GB200 NVL system for the two systems to be competitive in price per output to a renter. Put another way, the H100 would have to rent at 98 cents per hour to match the price per output of a Blackwell rack system priced at $2.20 per hour per GPU, SemiAnalysis estimated, speaking generally about AI rentals.
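The arithmetic behind that kind of estimate is simple: what a renter ultimately pays for is output, so an older chip stays competitive only if its hourly rate falls in proportion to the performance gap. The sketch below works backward from the two figures cited above, under the simplifying assumption that cost per unit of output is just hourly price divided by throughput, ignoring power, networking and other real-world costs.

```python
# Sketch of the price-per-output comparison behind the figures cited above.
# Simplifying assumption: a renter's cost per unit of output is hourly price
# divided by chip throughput; power, networking and utilization are ignored.

blackwell_price = 2.20    # $/hour per GPU for the newer rack system (cited above)
h100_parity_price = 0.98  # $/hour at which the H100 matches it on cost per output (cited above)

# The output advantage implied by those two numbers:
implied_speedup = blackwell_price / h100_parity_price
print(f"Implied per-GPU output advantage of the newer chip: ~{implied_speedup:.1f}x")

# General rule: the older chip's rent must fall in proportion to the performance gap.
def parity_price(new_price_per_hour: float, relative_throughput: float) -> float:
    """Hourly rate at which the older chip matches the newer chip's cost per output."""
    return new_price_per_hour / relative_throughput

print(f"Parity check: ${parity_price(blackwell_price, implied_speedup):.2f}/hour")
```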
H100s rented for as much as $8 per hour back in 2023 and often required long commitments and lead times, but now, usage of those chips can be summoned in minutes with a credit card. Some services now offer rented H100 access for under $2 per hour.
The industry could be entering a period where the useful life of AI chips is reduced, Barclays analyst Ross Sandler wrote in a note on Friday. He was focused on hyperscalers — Meta, Google and Amazon — but the trend affects smaller cloud providers like CoreWeave, too.
“These assets are becoming obsolete at a much more rapid pace given how much innovation and speed improvements happen with each generation,” Sandler wrote.
That threatens those companies’ earnings if they end up depreciating older equipment faster, he said.
CoreWeave says that if there were to be changes to the “significant” assumptions it makes about the useful lifetime of its AI infrastructure, it could hurt its business or future prospects. CoreWeave has also borrowed nearly $8 billion to buy Nvidia chips and build its data centers, sometimes using the GPUs it amassed as collateral.
Analysts and investors are also increasingly asking questions about the useful lifespan of these new AI systems and whether their financial depreciation schedules should be accelerated because the technology is improving so fast.
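The mechanics of that concern are straightforward: under straight-line depreciation, the shorter the assumed useful life, the more of a chip’s purchase cost hits earnings each year. The sketch below uses a hypothetical per-GPU cost and hypothetical shortened schedules; only the 250,000-GPU fleet size and the six-year assumption come from the reporting above.

```python
# Sketch: how shortening the assumed useful life of a GPU raises annual
# depreciation expense under straight-line depreciation.
# The per-GPU cost and the shorter schedules are hypothetical assumptions;
# the fleet size and six-year life come from the reporting above.

def annual_straight_line_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation spreads the cost evenly over the useful life."""
    return cost / useful_life_years

GPU_COST = 30_000.0   # hypothetical per-GPU purchase price, for illustration only
FLEET_SIZE = 250_000  # number of Nvidia GPUs cited above

for life in (6, 4, 3):  # 6 years is the assumed life; 4 and 3 are hypothetical accelerations
    per_gpu = annual_straight_line_depreciation(GPU_COST, life)
    fleet_total = per_gpu * FLEET_SIZE
    print(f"{life}-year life: ${per_gpu:,.0f} per GPU per year, "
          f"~${fleet_total / 1e9:.2f}B across the fleet annually")
```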
CoreWeave says in its filing that it seeks to offer state-of-the-art infrastructure and will continue spending to expand and improve its data centers.
“Part of this process entails cycling out outdated components of our infrastructure and replacing them with the latest technology available,” the New Jersey company said. “This requires us to make certain estimates with respect to the useful life of the components of our infrastructure and to maximize the value of the components of our infrastructure, including our GPUs, to the fullest extent possible.”
CoreWeave and Nvidia maintain a good relationship. CoreWeave will certainly buy more chips from Nvidia, which owns more than 5% of the New Jersey company.
“We’re super proud of them,” Huang said last week.
But Nvidia’s road map for releasing new chips that it proudly touts will make their predecessors obsolete is a threat to CoreWeave’s ambitions.
U.S. President Donald Trump speaks to the media in the Oval Office at the White House in Washington, D.C., U.S., March 26, 2025.
After President Donald Trump said on Wednesday he would impose 25% tariffs on “all cars that are not made in the United States,” he said his key advisor, Tesla CEO Elon Musk, had not weighed in on the matter, “because he may have a conflict.”
He added that Musk had never “asked me for a favor in business whatsoever.”
Musk serves as a senior advisor to Trump, having earlier contributed $290 million to propel him back to the White House. While Musk remains at the helm of his companies, including SpaceX and Tesla, he is also leading the Department of Government Efficiency, or DOGE, an effort to slash federal spending and personnel and to consolidate or eliminate various federal agencies and services.
Earlier this month, Trump turned the South Lawn of the White House into a temporary Tesla showroom. The company delivered five of its electric vehicles there for the president to inspect after he had declared, in a post on Truth Social, that he would buy a Tesla to show support for Musk and the business. Musk stood by his side while Trump called the vehicles “beautiful” and praised the unorthodox design of the angular, steel Tesla Cybertruck.
When asked by reporters whether the new tariffs would be good for Musk’s auto business, Tesla, Trump said they may be “net neutral or they may be good.” He pointed to Tesla’s vehicle assembly plants in Austin, Texas, and Fremont, California, and opined that “anybody that has plants in the United States — it’s going to be good for them.”
Tesla recently wrote, in a letter to the U.S. Trade Representative, that “even with aggressive localization” of its supply chain domestically, “certain parts and components are difficult or impossible to source within the United States.” The company urged the USTR to “consider the downstream impacts of certain proposed actions taken to address unfair trade practices.”
Tesla and other automakers commonly buy headlamps, automotive glass, brakes, body panels, suspension parts and printed circuit boards for various electrical systems in their vehicles from foreign suppliers, especially in Mexico, Canada and China.
Musk and Tesla did not immediately respond to a request for comment about how the new 25% tariffs may impact their business.
Tesla faces an onslaught of competition with more automakers selling fully electric models than ever before. However, the company’s most formidable rival in battery electric vehicles, BYD in China, has never been authorized to sell its electric cars in the United States.
Shares of Detroit automakers General Motors and Ford fell in after-hours trading, while shares of EV makers Tesla and Rivian were nearly flat or slightly higher in response to the tariff announcement.