Tech titans Mark Zuckerberg and Elon Musk are in a fierce business rivalry that has spilled over into a playground spat, with the two men offering to fight each other in a cage.
Meta has officially debuted its Twitter-like messaging app Threads, which the company is pitching as Instagram’s “text-based conversation app.”
Mark Zuckerberg, Meta’s CEO and co-founder, announced the debut of Threads on Wednesday, marking the official release of the social networking giant’s new text-focused messaging app. Threads represents Meta’s attempt to capture the wave of users who have left Twitter amid the often unpredictable ownership of Tesla and SpaceX CEO Elon Musk.
The Threads app is now available to download for free on the Apple App Store and the Google Play store in over 100 countries, Meta said in a blog post. Threads shares Twitter’s visual aesthetic as a text-based social app in which users can post short messages that others can like, share and comment on, according to screenshots of Threads available on Apple’s App Store.
People will be able to follow the same accounts on Threads that they follow on Instagram and reply to other public posts much as they would on Twitter.
The official release comes after Instagram opened pre-orders for Threads on the Apple App Store on Monday; the listing at the time said the app was expected to debut on July 6. Many Instagram users were also recently able to obtain invitations to access Threads from within their Instagram accounts.
Although Threads is linked to Instagram, with users able to use their existing Instagram usernames, the messaging service is a separate app that people will need to download.
“Threads is where communities come together to discuss everything from the topics you care about today to what’ll be trending tomorrow,” Instagram said in a description of Threads on the Apple App Store. “Whatever it is you’re interested in, you can follow and connect directly with your favorite creators and others who love the same things — or build a loyal following of your own to share your ideas, opinions and creativity with the world.”
Meta said in the blog post that people’s individual feeds on the new messaging app will include “threads” posted by other users they follow, in addition to recommended content from creators they may not yet know.
People will be able to publish Threads posts that are up to 500 characters long, and while the app is geared toward text, people will also be able to share links, photos and videos up to 5 minutes long. Instagram users will also be able to share their Threads posts via the app’s story feature, in addition to “any other platform you choose,” the blog post said.
Meta said that it developed Threads “with tools to enable positive, productive conversations,” and people will be able to manage who can mention or reply to them within the app.
“Like on Instagram, you can add hidden words to filter out replies to your threads that contain specific words,” the blog post said. “You can unfollow, block, restrict or report a profile on Threads by tapping the three-dot menu, and any accounts you’ve blocked on Instagram will automatically be blocked on Threads.”
Racing into the gap as Twitter implodes
The release of Threads comes as Twitter has suffered a wave of mishaps under the ownership of Tesla CEO Elon Musk, leaving the popular social messaging app vulnerable to competing apps.
Most recently, Musk said that Twitter users will only be able to see a certain number of Tweets per day in an attempt to deal with “extreme levels of data scraping” and “system manipulation” on the messaging service.
Numerous Twitter users publicly complained about Musk imposing a temporary so-called “rate limit” on Twitter, saying that the Tweet limits make the app a less engaging experience.
BlueSky, a rival social messaging app that is backed by Twitter co-founder Jack Dorsey, said that it recorded “record-high traffic” after Musk announced the Twitter rate limit, and it temporarily paused sign-ups to deal with the influx of new users, who must currently be invited to use the app.
Like BlueSky, Threads will use decentralized technology that theoretically lets users control and manage their data across other apps that incorporate the same underlying software.
Whereas BlueSky is built on the decentralized networking technology dubbed the AT Protocol, Threads will eventually incorporate another decentralized technology called ActivityPub, Instagram head Adam Mosseri said in a Threads post on Wednesday that was briefly available to the public. The ActivityPub software also powers another Twitter-like messaging app called Mastodon, which has also experienced an influx of new users seeking an alternative to Twitter.
Mosseri said that his team wasn’t able to include support for ActivityPub in time for Threads’ official release because of “a number of complications that come along with a decentralized network.” But he reiterated that support is coming.
“If you’re wondering why this matters, here’s a reason: you may one day end up leaving Threads, or, hopefully not, end up de-platformed,” Mosseri said. “If that ever happens, you should be able to take your audience with you to another server. Being open can enable that.”
Meta added in its blog post that ActivityPub will enable people without Threads accounts to view Threads and interact with Threads users who have public profiles via other social apps that incorporate the same decentralized technology.
“If you have a public profile on Threads, this means your posts would be accessible from other apps, allowing you to reach new people with no added effort,” Meta said in the blog post. “If you have a private profile, you’d be able to approve users on Threads who want to follow you and interact with your content, similar to your experience on Instagram.”
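Meta hasn’t detailed how Threads’ federation will work, but as a rough illustration of what ActivityPub interoperability typically looks like, here is a minimal sketch of how a public post can be represented on the wire under the protocol. The server name, username and post text are hypothetical, not drawn from Meta’s announcement.

```python
# Minimal sketch of an ActivityPub "Create" activity wrapping a "Note":
# roughly how a public post travels between federated servers such as
# Mastodon instances. The domain, user and content below are hypothetical.
import json

create_note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://threads.example/users/alice",  # hypothetical author
    "object": {
        "type": "Note",
        "content": "Hello from a federated post",
        # Addressing the special "Public" collection makes the post visible
        # to anyone, including users on other ActivityPub servers.
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

# In practice, the sending server delivers this JSON-LD payload via a
# signed HTTP POST to each follower's inbox URL on their home server.
print(json.dumps(create_note, indent=2))
```

Because the format is a shared standard rather than something proprietary to one app, any server that speaks ActivityPub can, in principle, display the post and let its own users reply to it, which is the portability Mosseri described.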
Meta said that Threads is the company’s first app “envisioned to be compatible with an open social networking protocol,” which it believes could usher “in a new era of diverse and interconnected networks.”
In 2019, Meta, then known as Facebook, debuted a messaging app for Instagram users that was also called Threads. Unlike the current text-focused iteration, the previous Threads app centered on people sending short video and photo messages to their friends, much like Snapchat.
Meta eventually shuttered that app in 2021 and redirected people to Instagram to see their previous Threads messages.
Okta on Tuesday topped Wall Street’s third-quarter estimates and issued an upbeat outlook, but shares fell as the company did not provide guidance for fiscal 2027.
Shares of the identity management provider fell more than 3% in after-hours trading on Tuesday.
Here’s how the company did versus LSEG estimates:
Earnings per share: 82 cents adjusted vs. 76 cents expected
Revenue: $742 million vs. $730 million expected
Unlike in previous third-quarter reports, Okta refrained from offering preliminary guidance for the upcoming fiscal year. Finance chief Brett Tighe cited seasonality in the fourth quarter and said providing guidance would require “some conservatism.”
During the third quarter, Okta released a capability that allows businesses to build AI agents and automate tasks.
CEO Todd McKinnon told CNBC that upside from AI agents hasn’t been fully baked into results and could exceed Okta’s core total addressable market over the next five years.
“It’s not in the results yet, but we’re investing, and we’re capitalizing on the opportunity like it will be a big part of the future,” he said in a Tuesday interview.
Revenues increased almost 12% from $665 million in the year-ago period. Net income increased 169% to $43 million, or 24 cents per share, from $16 million, or breakeven, a year ago. Subscription revenues grew 11% to $724 million, ahead of a $715 million estimate.
For the current quarter, the cybersecurity company expects revenues between $748 million and $750 million and adjusted earnings of 84 cents to 85 cents per share. Analysts forecast $738 million in revenues and EPS of 84 cents for the fourth quarter.
Remaining performance obligations, or the company’s subscription backlog, rose 17% from a year ago to $4.29 billion, surpassing a $4.17 billion estimate from StreetAccount.
This year has been a blockbuster period for cybersecurity companies, with major acquisition deals from the likes of Palo Alto Networks and Google and a raft of new initial public offerings from the sector.
Marvell Technology Group Ltd. headquarters in Santa Clara, California, on Sept. 6, 2024.
Semiconductor company Marvell on Tuesday announced that it will acquire Celestial AI for at least $3.25 billion in cash and stock.
The purchase price could increase to $5.5 billion if Celestial hits revenue milestones, Marvell said.
Marvell shares rose 13% in extended trading Tuesday as the company reported third-quarter earnings that beat expectations and said on the earnings call that it expected data center revenue to rise 25% next year.
The deal is an aggressive move for Marvell to acquire technology complementary to its semiconductor networking business. The addition of Celestial could enable Marvell to sell more chips and parts to companies that are currently committing to spend hundreds of billions of dollars on AI infrastructure.
Marvell stock is down 18% so far in 2025 even as semiconductor rivals like Broadcom have seen big valuation increases driven by excitement around artificial intelligence.
Celestial is a startup focused on developing optical interconnect hardware, which it calls a “photonic fabric,” to connect high-performance computers. Celestial was reportedly valued at $2.5 billion in March in a funding round, and Intel CEO Lip-Bu Tan joined the startup’s board in January.
Optical connections are becoming increasingly important because the most advanced AI systems need those parts to tie together dozens or hundreds of chips so they can work as one to train and run the biggest large language models.
Currently, many AI chip connections are made with copper wires, but newer systems are increasingly using optical connections because they can move more data at higher speeds and over physically longer cables. Optical connections also cost more.
“This builds on our technology leadership, broadens our addressable market in scale-up connectivity, and accelerates our roadmap to deliver the industry’s most complete connectivity platform for AI and cloud customers,” Marvell CEO Matt Murphy said in a statement.
Marvell said that the first application of Celestial technology would be to connect a system based on “large XPUs,” which are custom AI chips usually made by the companies investing billions in AI infrastructure.
On Tuesday, the company said that it could also integrate Celestial’s optical technology into custom chips and that, based on customer traction, it expects the startup’s technology to soon be integrated into custom AI chips and related parts called switches.
Amazon Web Services Vice President Dave Brown said in a statement that Marvell’s acquisition of Celestial will “help further accelerate optical scale-up innovation for next-generation AI deployments.”
The maximum payout for the deal will be triggered if Celestial can record $2 billion in cumulative revenue by the end of fiscal 2029. The deal is expected to close early next year.
In its third-quarter earnings report on Tuesday, Marvell posted earnings of 76 cents per share on $2.08 billion in sales, versus LSEG expectations of 73 cents on $2.07 billion in sales. Marvell said it expects fourth-quarter revenue of $2.2 billion, slightly higher than LSEG’s forecast of $2.18 billion.
Amazon Web Services’ two-track approach to artificial intelligence came into better focus Tuesday as the world’s biggest cloud provider pushed forward with its own custom chips and got closer to Nvidia.

During Amazon’s annual AWS re:Invent 2025 conference in Las Vegas, Amazon Web Services CEO Matt Garman unveiled Trainium3 — the latest version of the company’s in-house custom chip. It has four times more compute performance, energy efficiency and memory bandwidth than previous generations, and AWS said early results from customers testing Trainium3 show AI training and inference costs falling by up to 50%.

Custom chips like Trainium are becoming more and more popular among the big tech companies that can afford to make them, and their use cases are broadening. For example, Google’s tensor processing units (TPUs), co-designed by Broadcom, have been getting a lot of attention since last month’s launch of the well-received Gemini 3 artificial intelligence model, which is powered by TPUs. There was even a report that Meta Platforms was considering TPUs in addition to Nvidia’s graphics processing units (GPUs), the gold standard for all-purpose AI workloads.

At the same time, Amazon announced that it’s deepening its work with Nvidia. In Tuesday’s keynote, Garman introduced AWS Factories, which provides on-premises AI infrastructure for customers to use in their own data centers. The service combines Trainium accelerators and Nvidia graphics processing units, giving customers access to Nvidia’s accelerated computing platform, full-stack AI software and GPU-accelerated applications.

By offering both options, Amazon aims to keep accelerating AWS cloud capacity and, in turn, revenue growth to stay on top amid intense competition from Microsoft’s Azure and Alphabet’s Google Cloud, the second- and third-place horses in the AI race by revenue.

Earlier this year, investors were concerned when second-quarter AWS revenue growth did not live up to that of its closest competitors. With late October’s release of third-quarter results, Amazon went a long way toward putting those worries to rest. Amazon CEO Andy Jassy said at the time, “AWS is growing at a pace we haven’t seen since 2022, re-accelerating to 20.2% YoY.” He added, “We’ve been focused on accelerating capacity — adding more than 3.8 gigawatts (GW) in the past 12 months.”

Tuesday’s announcements come at a pivotal time for AWS as it tries to rapidly expand its computing capacity after a year of supply constraints that put a lid on cloud growth. As welcome as more efficient chips are, they don’t make up for the capacity demand the company is facing as AI adoption ramps up, which is why adding more gigawatts of capacity is what Wall Street is laser-focused on.

Fortunately, Wall Street argues that the capacity headwind should flip to a tailwind. Wells Fargo said Trainium3 is “critical to supplementing Nvidia GPUs and CPUs in this capacity build” to close the gap with rivals. In a note to investors on Monday, the firm’s analysts estimated Amazon will add more than 12 gigawatts of compute by year-end 2027, boosting total AWS capacity to support as much as $150 billion in incremental annual AWS revenue if demand remains strong. In a separate note Monday, Oppenheimer said AWS has already proven its ability to expand capacity, which has doubled since 2022, and Amazon plans to double it again by 2027. The firm’s analysts said such an expansion could translate to 14% upside to 2026 AWS revenue and 22% upside in 2027.
Analysts said each incremental gigawatt of compute added in recent quarters translated to roughly $3 billion of annual cloud revenue.

Bottom line

While new chips are welcome news that helps AWS step deeper into the AI chip race, investors are more locked in on Amazon’s investment in capacity and when that capacity will be unlocked, because that is how the company will fulfill demand. The issue is not demand; it’s supply. We are confident in AWS’ ability to add the capacity. In fact, no company in the world could handle a logistics problem of this scale better than Amazon.

Amazon shares surged nearly 14% to $254 each in the two sessions following the cloud and e-commerce giant’s Oct. 30 earnings print. The stock has since given back those gains and then some. As of Tuesday’s close, shares were up 6.5% year to date, a laggard among its “Magnificent Seven” peers and underperforming the S&P 500’s roughly 16% advance in 2025.

(Jim Cramer’s Charitable Trust is long AMZN and NVDA.)