Cohere president Martin Kon says a lot of the hot artificial intelligence startups on the market today are building the equivalent of fancy sports cars. His product, he says, is more like a heavy-duty truck.
“If you’re looking for vehicles for your field technical service department, and I take you for a test drive in a Bugatti, you’re going to be impressed by how fast and how well it performs,” Kon told CNBC in an interview. However, he said, the price coupled with the space limitations and lack of a trunk will be a problem.
“What you actually need is a fleet of F-150 pickup trucks,” Kon said. “We make F-150s.”
Founded by ex-Google AI researchers and backed by Nvidia, Cohere is betting on generative AI for the enterprise rather than on consumer chatbots, which have been the talk of the tech industry since OpenAI released ChatGPT in late 2022.
In June, Cohere raised $270 million at a $2.2 billion valuation, with Salesforce and Oracle participating in the funding round. Company executives have attended AI forums at the White House. And Cohere is reportedly in talks to raise up to $1 billion in additional capital.
“We don’t comment on rumors,” Kon told CNBC. “But someone once told me startups are always raising.”
The generative AI field has exploded over the past year, with a record $29.1 billion invested across nearly 700 deals in 2023, a more than 260% increase in deal value from a year earlier, according to PitchBook. It’s become the buzziest phrase on corporate earnings calls quarter after quarter, and some form of the technology is automating tasks in just about every industry, from financial services and biomedical research to logistics, online travel and utilities.
Although Cohere is often mentioned alongside AI heavyweights like OpenAI, Anthropic, Google and Microsoft, the startup’s focus on enterprise-only chatbots has set it apart.
Competitors offer AI products for both consumers and businesses. OpenAI, for instance, launched ChatGPT Enterprise in August, and Anthropic opened up consumer access to its formerly business-only Claude chatbot in July.
Kon, who’s also the company’s operating chief, said that by staying focused just on the enterprise, Cohere is able to run efficiently and keep costs under control even amid a chip shortage, rising costs for graphics processing units (GPUs) and ever-changing licensing fees for AI models.
“I’ve rarely seen, in my career, many companies that can successfully be consumer and enterprise at the same time, let alone a startup,” Kon said. He added, “We don’t have to raise billions of dollars to run a free consumer service.”
Current clients include Notion, Oracle and Bamboo HR, according to Cohere’s website. Many customers fall into the categories of banking, financial services and insurance, Kon said. In November, Cohere told CNBC it saw an uptick in customer interest after OpenAI’s sudden and temporary ouster of CEO Sam Altman.
Kon acknowledges that changing dynamics in the hardware industry have presented persistent challenges. The company has had a reserve of Google chips for well over two years, Kon said, secured in Cohere’s early days to help it pretrain its models.
Now, Cohere is moving toward using more of Nvidia’s H100 GPUs, which are powering most of today’s large language models.
Cohere’s relationships with strategic investors are another area where it differs from generative AI competitors, Kon said. Many companies have raised money from the likes of Nvidia and Microsoft under conditions tied to the use of their software or chips.
Kon is adamant that Cohere has never accepted a conditional investment, and that every check it’s cashed – including from Nvidia – had no strings attached.
“In our last round, we had multiple checks the same size; we had no conditions associated with any one of them,” Kon said. “We explicitly made that decision so we could say we’re not beholden to anyone.”
Cohere’s decision to focus on enterprise-only chatbots may help the company stay out of the murky territory of misinformation concerns, particularly as election season nears.
In January, the Federal Trade Commission announced an AI inquiry into Amazon, Alphabet, Microsoft, OpenAI and Anthropic. FTC Chair Lina Khan described it as a “market inquiry into the investments and partnerships being formed between AI developers and major cloud service providers.” Cohere was not named.
Kon says the company’s growth so far has largely been around areas like search and retrieval, which require their own separate AI models. He calls it “tool use,” and it involves training models on where, when and how to look for information that an enterprise client needs, even if the model wasn’t trained on that data originally.
Search, Kon said, is a key piece of generative AI that’s getting less attention than other areas.
“That’s certainly, for enterprise, going to be the real unlock,” he said.
In discussing the timeline for expansion, Kon called 2023 “the year of the proof of concept.”
“We think 2024 is turning into the year of deployment at scale,” he said.
OpenAI is acquiring Neptune, a startup that had already collaborated with it on a metrics dashboard to help teams that are building foundation models. The companies will work “even more closely together” as a result of the acquisition, Neptune CEO Piotr Niedźwiedź said in a blog post.
The startup will wind down its external services in the coming months, Niedźwiedź said. The terms of the acquisition were not disclosed.
“Neptune has built a fast, precise system that allows researchers to analyze complex training workflows,” OpenAI’s Chief Scientist Jakub Pachocki said in a statement. “We plan to iterate with them to integrate their tools deep into our training stack to expand our visibility into how models learn.”
OpenAI has acquired several companies this year.
It purchased a small interface startup called Software Applications Incorporated for an undisclosed sum in October, product development startup Statsig for $1.1 billion in September and Jony Ive’s AI devices startup io for more than $6 billion in May.
Neptune had raised more than $18 million in funding from investors including Almaz Capital and TDJ Pitango Ventures, according to its website. Neptune’s deal with OpenAI is still subject to customary closing conditions.
“I am truly grateful to our customers, investors, co-founders, and colleagues who have made this journey possible,” Niedźwiedź said. “It was the ride of a lifetime already, yet still I believe this is only the beginning.”
Micron said on Wednesday that it plans to stop selling memory to consumers to focus on meeting demand for high-powered artificial intelligence chips.
“The AI-driven growth in the data center has led to a surge in demand for memory and storage,” Sumit Sadana, Micron business chief, said in a statement. “Micron has made the difficult decision to exit the Crucial consumer business in order to improve supply and support for our larger, strategic customers in faster-growing segments.”
Micron’s announcement is the latest sign that the AI infrastructure boom is creating shortages for inputs like memory as a handful of companies commit to spend hundreds of billions in the next few years to build massive data centers. Memory, which is used by computers to store data for short periods of time, is facing a global shortage.
Micron shares are up about 175% this year, though they slipped 3% on Wednesday to $232.25.
AI chips, like the GPUs made by Nvidia and Advanced Micro Devices, use large amounts of the most advanced memory. For example, the current-generation Nvidia GB200 chip has 192GB of memory per graphics processor. Google’s latest AI chip, the Ironwood TPU, needs 192GB of high-bandwidth memory.
Memory is also used in phones and computers, but with lower specs, and much lower quantities — many laptops only come with 16GB of memory. Micron’s Crucial brand sold memory on sticks that tinkerers could use to build their own PCs or upgrade their laptops. Crucial also sold solid-state hard drives.
Micron competes against SK Hynix and Samsung in the market for high-bandwidth memory, but it’s the only U.S.-based memory supplier. Analysts have said that SK Hynix is Nvidia’s primary memory supplier.
Micron supplies AMD, which says its AI chips use more memory than others, providing them a performance advantage for running AI. AMD’s current AI chip, the MI350, comes with 288GB of high-bandwidth memory.
Micron does not break out Crucial’s results in its earnings reports. Its cloud memory business unit, however, grew 213% year over year in the most recent quarter.
Analysts at Goldman on Tuesday raised their price target on Micron’s stock to $205 from $180, though they maintained their hold recommendation. The analysts wrote in a note to clients that due to “continued pricing momentum” in memory, they “expect healthy upside to Street estimates” when Micron reports quarterly results in two weeks.
A Micron spokesperson declined to comment on whether the move would result in layoffs.
“Micron intends to reduce impact on team members due to this business decision through redeployment opportunities into existing open positions within the company,” the company said in its release.
Microsoft pushed back on a report Wednesday that the company lowered growth targets for artificial intelligence software sales after many of its salespeople missed those goals in the last fiscal year.
The company’s stock sank more than 2% following The Information’s report.
A Microsoft spokesperson said the company has not lowered sales quotas or targets for its salespeople.
The sales lag occurred for Microsoft’s Foundry product, an Azure enterprise platform where companies can build and manage AI agents, according to The Information, which cited two salespeople in Azure’s cloud unit.
AI agents can carry out a series of actions for a user or organization autonomously.
Less than a fifth of salespeople in one U.S. Azure unit met the Foundry sales growth target of 50%, according to The Information.
In another unit, the quota was set to double Foundry sales, The Information reported. The quota was dropped to 50% after most salespeople didn’t meet it.
In a statement, the company said the news outlet had inaccurately conflated growth targets with quotas.
“Aggregate sales quotas for AI products have not been lowered, as we informed them prior to publication,” a Microsoft spokesperson said.
The AI boom has presented opportunities for businesses to add efficiencies and streamline tasks, with the companies that build these agents touting the power of the tools to take on work and allow workers to do more.
OpenAI, Google, Anthropic, Salesforce, Amazon and others all have their own tools to create and manage these AI assistants.
But the adoption of these tools by traditional businesses hasn’t seen the same surge as other parts of the AI ecosystem.
The Information noted AI adoption struggles at private equity firm Carlyle last year, where the tools wouldn’t reliably pull in data from other sources. The firm later reduced its spending on the tools.