A banner for Snowflake Inc. is displayed at the New York Stock Exchange to celebrate the company’s initial public offering, Sept. 16, 2020.
Brendan McDermid | Reuters
Buried on page 280 of Instacart’s IPO filing last week was a paragraph that caused a brouhaha between two companies that have nothing to do with grocery delivery.
One of Instacart’s board members is Frank Slootman, the CEO of Snowflake, a publicly traded company that helps businesses store and manage hefty workloads in the cloud. Slootman joined Instacart’s board in 2021 and, because of that relationship, the company has to disclose its business ties to Snowflake.
At first blush, the Instacart spending figure looks troubling for Snowflake.
Instacart said it “made payments to Snowflake” of $13 million in 2020, a number that increased to $28 million in 2021 and $51 million in 2022 for the company’s “cloud-based data warehousing services.” The 2023 numbers appear to show a reversal, with Instacart saying “we anticipate we will pay Snowflake approximately $15 million” for the full year.
That would be a frightening 71% drop in payments.
But Snowflake would later say that those figures don’t tell the real story, a fact that’s mostly backed up by a footnote even deeper in the prospectus.
In the meantime, chaos ensued.
Employees of Snowflake rival Databricks pounced. They took to social media to highlight the apparent decline in spending on Snowflake and to suggest that it was the result of Instacart moving workloads to Databricks infrastructure.
Snowflake staffers fired back, claiming the numbers were being taken out of context, and accused Databricks of consistently spinning the narrative that it was taking business from Snowflake.
Many of the posts on Reddit, LinkedIn and X, the site formerly known as Twitter, have since been deleted.
Instacart did some deleting of its own.
In May, the company published a blog post titled “How Instacart Ads Modularized Data Pipelines With Lakehouse Architecture and Spark.” The post, which described software underpinning Instacart’s ads infrastructure, included discussion of a migration to Databricks’ Lakehouse technology and the cost savings that followed.
However, that blog was taken down as questions began to swirl following the IPO filing. A reader looking for the post now ends up on a page that says, “You’ve landed in the 404 errorverse.” Databricks also took down a case study detailing Instacart’s use of its technology, though its website still has presentations from earlier this year on the topic.
Representatives from Instacart, Snowflake and Databricks declined to comment.
The controversy, which only came to light because Slootman is on Instacart’s board, has fanned the flames of a fierce rivalry between two companies battling it out in one of the hottest corners of technology, where cloud, data and artificial intelligence collide. It’s a conflict that’s made its way to social media plenty of times in the past, so much so that one Reddit user wrote a post a few months ago, titled “Databricks and Snowflake: Stop fighting on social.” A commenter responded, “Is this the pro-wrestling of data engineering?”
FALMOUTH, MA – APRIL 8: Instacart shopper Loralyn Geggatt makes a delivery to a customer’s home during the COVID-19 pandemic in Falmouth, MA on April 7, 2020. Some Amazon, Instacart and other workers protested for better wages, hazard pay and sick time. (Photo by David L. Ryan/The Boston Globe via Getty Images)
Boston Globe
Snowflake went public in 2020, raising over $3 billion in the biggest U.S. IPO ever for a business software company. Even after last year’s market plunge, Snowflake has a market cap of over $50 billion.
Databricks is still private, but it’s one of the most richly valued venture-backed companies. Private investors valued the company at $38 billion in 2021, and Bloomberg reported last week that the company was in talks to raise funding at a $43 billion valuation.
To expand in AI, Snowflake recently acquired AI search engine Neeva for $185 million, while Databricks spent $1.3 billion on generative AI startup MosaicML.
What’s the real story with Instacart?
That brings us back to Instacart.
While Databricks is picking up business from the grocery-delivery company, the footnote in Instacart’s S-1 spelling out the relationship with Snowflake shows that the spending decline in 2023 is not the most relevant figure.
Rather, when it comes to how Instacart accounts for operating expenses — its actual usage of Snowflake — that amount was $28 million in 2021, $28 million in 2022, and then $11 million in the first half of 2023. That’s still a drop this year, but on an annualized basis it would be around 21% instead of 71%.
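The gap between the two figures comes down to simple annualization. A quick sketch of the arithmetic, using the numbers reported in the S-1 as described above:

```python
# Figures from Instacart's S-1, as cited in this article.
payments_2022 = 51_000_000      # payments to Snowflake, full-year 2022
payments_2023_est = 15_000_000  # anticipated full-year 2023 payments

usage_2022 = 28_000_000         # operating-expense (actual usage) figure, 2022
usage_h1_2023 = 11_000_000      # usage, first half of 2023

# Headline drop based on the payment schedule.
payments_drop = 1 - payments_2023_est / payments_2022

# Drop based on actual usage, annualizing the first half by doubling it.
usage_drop = 1 - (usage_h1_2023 * 2) / usage_2022

print(f"{payments_drop:.0%}")  # 71%
print(f"{usage_drop:.0%}")     # 21%
```

The doubling of the half-year figure is the simplest possible annualization; it assumes usage in the second half of 2023 matches the first, which the filing does not guarantee.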
To add to the confusion, the footnote under “Related Party Transactions” didn’t name Slootman or Snowflake, referring only to “an executive officer of a software vendor.”
With the online chatter picking up, Snowflake wanted to clear up the picture, at least from its point of view. On Wednesday, the company published a four-paragraph blog post titled, “Snowflake and Instacart: The Facts.”
“In the past few days, the scope and trajectory of Instacart’s use of Snowflake has been misrepresented by some on social media,” the post begins. Nowhere is Databricks mentioned in the post, a consistent theme for Snowflake, which doesn’t name Databricks as a competitor in its financial filings.
Snowflake went on to say that it was working with Instacart to “optimize for efficiency,” a phrase that implies doing more with less, and that its technology is “used extensively by nearly every team within Instacart, including the catalog team, machine learning, ads, shoppers, retailers, customers, and logistics organizations.”
The post then highlights the usage figures from the filing footnote and claims that, “In some social media posts, payment schedules have been incorrectly conflated with actual usage to suggest a large decline in spending — this is not the case.”
In other words, if there’s a decline in spending, it’s not because we’re losing business to an unnamed company.
The good news for Snowflake is that the IPO process calls for multiple prospectus updates. Instacart, which is trying to unlock a tech IPO market that’s been largely frozen for 20 months, will get a chance to clear up the matter with investors very soon.
— CNBC’s Jonathan Vanian and Jordan Novet contributed to this report.
Okta on Tuesday topped Wall Street’s third-quarter estimates and issued an upbeat outlook, but shares fell as the company did not provide guidance for fiscal 2027.
Shares of the identity management provider fell more than 3% in after-hours trading on Tuesday.
Here’s how the company did versus LSEG estimates:
Earnings per share: 82 cents adjusted vs. 76 cents expected
Revenue: $742 million vs. $730 million expected
Unlike in previous third-quarter reports, Okta refrained from offering preliminary guidance for the upcoming fiscal year. Finance chief Brett Tighe cited seasonality in the fourth quarter, and said providing guidance would require “some conservatism.”
During the third quarter, Okta released a capability that allows businesses to build AI agents and automate tasks.
CEO Todd McKinnon told CNBC that upside from AI agents hasn’t been fully baked into results and could exceed Okta’s core total addressable market over the next five years.
“It’s not in the results yet, but we’re investing, and we’re capitalizing on the opportunity like it will be a big part of the future,” he said in a Tuesday interview.
Revenues increased almost 12% from $665 million in the year-ago period. Net income increased 169% to $43 million, or 24 cents per share, from $16 million, or breakeven, a year ago. Subscription revenues grew 11% to $724 million, ahead of a $715 million estimate.
For the current quarter, the cybersecurity company expects revenues between $748 million and $750 million and adjusted earnings of 84 cents to 85 cents per share. Analysts forecast $738 million in revenues and EPS of 84 cents for the fourth quarter.
Remaining performance obligations, or the company’s subscription backlog, rose 17% from a year ago to $4.29 billion and surpassed a $4.17 billion estimate from StreetAccount.
This year has been a blockbuster period for cybersecurity companies, with major acquisition deals from the likes of Palo Alto Networks and Google and a raft of new initial public offerings from the sector.
Marvell Technology Group Ltd. headquarters in Santa Clara, California, on Sept. 6, 2024.
David Paul Morris | Bloomberg | Getty Images
Semiconductor company Marvell on Tuesday announced that it will acquire Celestial AI for at least $3.25 billion in cash and stock.
The purchase price could increase to $5.5 billion if Celestial hits revenue milestones, Marvell said.
Marvell shares rose 13% in extended trading Tuesday as the company reported third-quarter earnings that beat expectations and said on the earnings call that it expected data center revenue to rise 25% next year.
The deal is an aggressive move by Marvell to acquire technology complementary to its semiconductor networking business. The addition of Celestial could enable Marvell to sell more chips and parts to companies that are currently committing to spend hundreds of billions of dollars on infrastructure for AI.
Marvell stock is down 18% so far in 2025 even as semiconductor rivals like Broadcom have seen big valuation increases driven by excitement around artificial intelligence.
Celestial is a startup focused on developing optical interconnect hardware, which it calls a “photonic fabric,” to connect high-performance computers. Celestial was reportedly valued at $2.5 billion in March in a funding round, and Intel CEO Lip-Bu Tan joined the startup’s board in January.
Optical connections are becoming increasingly important because the most advanced AI systems need those parts to tie together dozens or hundreds of chips so they can work as one to train and run the biggest large-language models.
Currently, many AI chip connections are done using copper wires, but newer systems are increasingly using optical connections because they can transfer more data faster and enable physically longer cables. Optical connections also cost more.
“This builds on our technology leadership, broadens our addressable market in scale-up connectivity, and accelerates our roadmap to deliver the industry’s most complete connectivity platform for AI and cloud customers,” Marvell CEO Matt Murphy said in a statement.
Marvell said that the first application of Celestial technology would be to connect a system based on “large XPUs,” which are custom AI chips usually made by the companies investing billions in AI infrastructure.
On Tuesday, the company said it could even integrate Celestial’s optical technology into custom chips, and that, based on customer traction, the startup’s technology would soon be built into custom AI chips and related parts called switches.
Amazon Web Services Vice President Dave Brown said in a statement that Marvell’s acquisition of Celestial will “help further accelerate optical scale-up innovation for next-generation AI deployments.”
The maximum payout for the deal will be triggered if Celestial can record $2 billion in cumulative revenue by the end of fiscal 2029. The deal is expected to close early next year.
In its third-quarter earnings report on Tuesday, Marvell posted earnings of 76 cents per share on $2.08 billion in sales, versus LSEG expectations of 73 cents on $2.07 billion in sales. Marvell said that it expects fourth-quarter revenue of $2.2 billion, slightly higher than LSEG’s forecast of $2.18 billion.
Amazon Web Services’ two-track approach to artificial intelligence came into better focus Tuesday as the world’s biggest cloud provider pushed forward with its own custom chips and got closer to Nvidia.

During Amazon’s annual AWS re:Invent 2025 conference in Las Vegas, AWS CEO Matt Garman unveiled Trainium3, the latest version of the company’s in-house custom chip, which offers four times the compute performance, energy efficiency and memory bandwidth of previous generations. AWS said early customer tests of Trainium3 have cut AI training and inference costs by up to 50%.

Custom chips like Trainium are becoming increasingly popular among the big tech companies that can afford to make them, and their use cases are broadening. For example, Google’s tensor processing units (TPUs), co-designed by Broadcom, have been getting a lot of attention since last month’s launch of the well-received Gemini 3 artificial intelligence model, which is powered by TPUs. There was even a report that Meta Platforms was considering TPUs in addition to Nvidia’s graphics processing units (GPUs), the gold standard for all-purpose AI workloads.

At the same time, Amazon announced that it’s deepening its work with Nvidia. In Tuesday’s keynote, Garman introduced AWS Factories, which provides on-premises AI infrastructure for customers to use in their own data centers. The service combines Trainium accelerators and Nvidia GPUs, giving customers access to Nvidia’s accelerated computing platform, full-stack AI software and GPU-accelerated applications.

By offering both options, Amazon aims to keep accelerating AWS cloud capacity, and in turn revenue growth, to stay ahead amid intense competition from Microsoft’s Azure and Alphabet’s Google Cloud, the second- and third-place horses in the AI race by revenue.
Earlier this year, investors were concerned when second-quarter AWS revenue growth did not live up to that of its closest competitors. With late October’s release of third-quarter results, Amazon went a long way toward putting those worries to rest. Amazon CEO Andy Jassy said at the time, “AWS is growing at a pace we haven’t seen since 2022, re-accelerating to 20.2% YoY.” He added, “We’ve been focused on accelerating capacity — adding more than 3.8 gigawatts (GW) in the past 12 months.”

Tuesday’s announcements come at a pivotal time for AWS as it tries to rapidly expand its computing capacity after a year of supply constraints that put a lid on cloud growth. As welcome as more efficient chips are, they don’t make up for the capacity demand the company is facing as AI adoption ramps up, which is why adding more gigawatts of capacity is what Wall Street is laser-focused on.

Fortunately, Wall Street argues that the capacity headwind should flip to a tailwind. Wells Fargo said Trainium3 is “critical to supplementing Nvidia GPUs and CPUs in this capacity build” to close the gap with rivals. In a note to investors on Monday, the firm’s analysts estimated Amazon will add more than 12 gigawatts of compute by year-end 2027, boosting total AWS capacity enough to support as much as $150 billion in incremental annual AWS revenue if demand remains strong. In a separate note Monday, Oppenheimer said AWS has already proven its ability to expand capacity, which has doubled since 2022, and Amazon plans to double it again by 2027. The analysts said such an expansion could translate to 14% upside to 2026 AWS revenue and 22% upside in 2027, noting that each incremental gigawatt of compute added in recent quarters has translated to roughly $3 billion of annual cloud revenue.
Bottom line

While new chips are welcome news that helps AWS step deeper into the AI chip race, Amazon’s investment in capacity, and when that capacity comes online, is what investors are really focused on, because that’s how the company will fulfill demand. The issue is not demand; it’s supply. We are confident in AWS’ ability to add the capacity. In fact, no company in the world could handle a logistics problem of this kind, at this scale, better than Amazon.

Amazon shares surged nearly 14% to $254 each in the two sessions following the cloud and e-commerce giant’s late Oct. 30 earnings print. The stock has since given back those gains and then some. As of Tuesday’s close, shares were up 6.5% year to date, a laggard among its “Magnificent Seven” peers and an underperformer versus the S&P 500’s roughly 16% advance in 2025.

(Jim Cramer’s Charitable Trust is long AMZN, NVDA. See here for a full list of the stocks.)