Headquarters of Samsung in Mountain View, California, on October 28, 2018.
Samsung Electronics reported a rebound in earnings on Thursday, with operating profit more than doubling from the previous quarter on strength from its chip business.
Here are Samsung’s third-quarter results compared with LSEG SmartEstimate, which is weighted toward forecasts from analysts who are more consistently accurate:
Revenue: 86.1 trillion Korean won ($60.5 billion) vs. 85.93 trillion won
Operating profit: 12.2 trillion won vs. 11.25 trillion won
The South Korean technology giant’s third-quarter revenue was up 8.85% from a year earlier, while operating profit climbed 32.9% year-over-year.
Samsung shares popped nearly 4% in early trading in Asia.
The earnings represent a bounce back from the June quarter, which had been weighed down by a massive slump in Samsung’s chip business. Operating profit increased by 160% compared to June, while revenue increased by 15.5% over the same period.
Samsung Electronics, South Korea’s largest company by market capitalization, is a leading provider of memory chips, semiconductor foundry services and smartphones.
Samsung’s chip business reported a 19% increase in sales from the June quarter, with its memory business setting an all-time high for quarterly sales, driven by strong artificial intelligence-related demand.
The third-quarter operating profit also beat Samsung’s own guidance of around 12.1 trillion Korean won.
Chip Business
Samsung Electronics’ chip business posted an operating profit of 7.0 trillion Korean won in the third quarter, up 81% from the same period last year and more than tenfold from the previous quarter.
Chip revenue increased to 33.1 trillion won, up 13% from last year.
Also known as its Device Solutions division, Samsung’s chip business encompasses memory chips, semiconductor design and its foundry units.
The unit benefited from a favorable price environment, while quarterly revenues reached a record high on expanded sales of its high-bandwidth memory (HBM) chips — a type of memory used in artificial intelligence computing.
Samsung has found itself lagging behind memory rival SK Hynix in the HBM market, after it was slow to secure major contracts with leading AI chipmaker Nvidia. However, in a positive sign for the company, it reportedly passed Nvidia’s qualification tests for an advanced HBM chip last month.
A report from Counterpoint Research earlier this month found that Samsung had reclaimed the top spot in the memory market ahead of SK Hynix in the third quarter, after falling behind its rival for the first time in the previous quarter.
MS Hwang, research director at Counterpoint Research, told CNBC that Samsung’s third-quarter performance was a clear result of a broader “memory market boom,” as well as rising prices for general-purpose memory.
Heading into 2026, Samsung said its memory business will focus on the mass production of its next-generation HBM technology, HBM4.
Smartphones
Samsung’s mobile experience and networks businesses, which develop and sell smartphones, tablets, wearables and other devices, reported a rise in both sales and profit.
The unit posted an operating profit of 3.6 trillion won in the third quarter, up about 28% from the same period last year.
The company said earnings were driven by robust flagship smartphone sales, including the launch of its Galaxy Z Fold7 device.
Samsung forecasted that the rapid growth of the AI industry would open up new market opportunities for both its devices and chip businesses in the current quarter.
Neptune and OpenAI have collaborated on a metrics dashboard to help teams that are building foundation models. The companies will work “even more closely together” because of the acquisition, Neptune CEO Piotr Niedźwiedź said in a blog.
The startup will wind down its external services in the coming months, Niedźwiedź said. The terms of the acquisition were not disclosed.
“Neptune has built a fast, precise system that allows researchers to analyze complex training workflows,” OpenAI’s Chief Scientist Jakub Pachocki said in a statement. “We plan to iterate with them to integrate their tools deep into our training stack to expand our visibility into how models learn.”
OpenAI has acquired several companies this year.
It purchased a small interface startup called Software Applications Incorporated for an undisclosed sum in October, product development startup Statsig for $1.1 billion in September and Jony Ive’s AI devices startup io for more than $6 billion in May.
Neptune had raised more than $18 million in funding from investors including Almaz Capital and TDJ Pitango Ventures, according to its website. Neptune’s deal with OpenAI is still subject to customary closing conditions.
“I am truly grateful to our customers, investors, co-founders, and colleagues who have made this journey possible,” Niedźwiedź said. “It was the ride of a lifetime already, yet still I believe this is only the beginning.”
A person walks by a sign for Micron Technology headquarters in San Jose, California, on June 25, 2025.
Micron said on Wednesday that it plans to stop selling memory to consumers to focus on meeting demand for high-powered artificial intelligence chips.
“The AI-driven growth in the data center has led to a surge in demand for memory and storage,” Sumit Sadana, Micron’s chief business officer, said in a statement. “Micron has made the difficult decision to exit the Crucial consumer business in order to improve supply and support for our larger, strategic customers in faster-growing segments.”
Micron’s announcement is the latest sign that the AI infrastructure boom is creating shortages for inputs like memory, as a handful of companies commit to spending hundreds of billions of dollars in the next few years to build massive data centers. Memory, which is used by computers to store data for short periods of time, is facing a global shortage.
Micron shares are up about 175% this year, though they slipped 3% on Wednesday to $232.25.
AI chips, like the GPUs made by Nvidia and Advanced Micro Devices, use large amounts of the most advanced memory. For example, the current-generation Nvidia GB200 chip has 192GB of memory per graphics processor. Google’s latest AI chip, the Ironwood TPU, needs 192GB of high-bandwidth memory.
Memory is also used in phones and computers, but with lower specs and in much lower quantities — many laptops come with only 16GB of memory. Micron’s Crucial brand sold memory on sticks that tinkerers could use to build their own PCs or upgrade their laptops. Crucial also sold solid-state hard drives.
Micron competes against SK Hynix and Samsung in the market for high-bandwidth memory, but it’s the only U.S.-based memory supplier. Analysts have said that SK Hynix is Nvidia’s primary memory supplier.
Micron supplies AMD, which says its AI chips use more memory than rival chips, giving them a performance advantage for running AI workloads. AMD’s current AI chip, the MI350, comes with 288GB of high-bandwidth memory.
Micron’s Crucial business was not broken out in company earnings. However, its cloud memory business unit showed 213% year-over-year growth in the most recent quarter.
Analysts at Goldman on Tuesday raised their price target on Micron’s stock to $205 from $180, though they maintained their hold recommendation. The analysts wrote in a note to clients that due to “continued pricing momentum” in memory, they “expect healthy upside to Street estimates” when Micron reports quarterly results in two weeks.
A Micron spokesperson declined to comment on whether the move would result in layoffs.
“Micron intends to reduce impact on team members due to this business decision through redeployment opportunities into existing open positions within the company,” the company said in its release.
Microsoft pushed back on a report Wednesday that the company lowered growth targets for artificial intelligence software sales after many of its salespeople missed those goals in the last fiscal year.
The company’s stock sank more than 2% following The Information’s report.
A Microsoft spokesperson said the company has not lowered sales quotas or targets for its salespeople.
The sales lag occurred for Microsoft’s Foundry product, an Azure enterprise platform where companies can build and manage AI agents, according to The Information, which cited two salespeople in Azure’s cloud unit.
AI agents can carry out a series of actions for a user or organization autonomously.
Less than a fifth of salespeople in one U.S. Azure unit met the Foundry sales growth target of 50%, according to The Information.
In another unit, the quota was set to double Foundry sales, The Information reported. The quota was dropped to 50% after most salespeople didn’t meet it.
In a statement, the company said the news outlet inaccurately combined the concepts of growth and quotas.
“Aggregate sales quotas for AI products have not been lowered, as we informed them prior to publication,” a Microsoft spokesperson said.
The AI boom has presented opportunities for businesses to add efficiencies and streamline tasks, with the companies building these agents touting the tools’ ability to take on work and let employees do more.
OpenAI, Google, Anthropic, Salesforce, Amazon and others all have their own tools to create and manage these AI assistants.
But the adoption of these tools by traditional businesses hasn’t seen the same surge as other parts of the AI ecosystem.
The Information noted AI adoption struggles at private equity firm Carlyle last year, where the tools wouldn’t reliably connect data from other sources. The firm later reduced its spending on the tools.