New York Times sues OpenAI, Microsoft for copyright infringement

The New York Times on Wednesday filed a lawsuit against Microsoft and OpenAI, creator of the popular AI chatbot ChatGPT, accusing the companies of copyright infringement and abusing the newspaper’s intellectual property to train large language models.

Microsoft both invests in and supplies OpenAI, providing it with access to Microsoft’s Azure cloud computing technology.

The publisher said in a filing in the U.S. District Court for the Southern District of New York that it seeks to hold Microsoft and OpenAI to account for the “billions of dollars in statutory and actual damages” it believes it is owed for the “unlawful copying and use of The Times’s uniquely valuable works.”

The Times said in an emailed statement that it “recognizes the power and potential of GenAI for the public and for journalism,” but added that journalistic material should be used for commercial gain only with permission from the original source.

“These tools were built with and continue to use independent journalism and content that is only available because we and our peers reported, edited, and fact-checked it at high cost and with considerable expertise,” the Times said.

The New York Times Building in New York City on February 1, 2022.

Angela Weiss | AFP | Getty Images

“Settled copyright law protects our journalism and content,” the Times added. “If Microsoft and OpenAI want to use our work for commercial purposes, the law requires that they first obtain our permission. They have not done so.”

“We respect the rights of content creators and owners and are committed to working with them to ensure they benefit from AI technology and new revenue models,” an OpenAI representative said in a statement. “Our ongoing conversations with the New York Times have been productive and moving forward constructively, so we are surprised and disappointed with this development. We’re hopeful that we will find a mutually beneficial way to work together, as we are doing with many other publishers.”

A representative for Microsoft didn’t respond to requests for comment.

The Times is represented in the proceedings by Susman Godfrey, the litigation firm that represented Dominion Voting Systems in its defamation suit against Fox News that culminated in a $787.5 million settlement.

Susman Godfrey is also representing author Julian Sancton and other writers in a separate lawsuit against OpenAI and Microsoft that accuses the companies of using copyrighted materials without permission to train several versions of ChatGPT.

‘Mass copyright infringement’

The Times is one of numerous media organizations pursuing compensation from companies behind some of the most advanced artificial intelligence models for the alleged use of their content to train AI programs.

OpenAI is the creator of GPT, a large language model that can produce humanlike content in response to user prompts. It is trained on billions of parameters’ worth of information obtained from public web data through 2021.

Media publishers and content creators are finding their materials being used and reimagined by generative AI tools like ChatGPT, Dall-E, Midjourney and Stable Diffusion. In numerous cases, the content the programs produce can look similar to the source material.

OpenAI has tried to allay news publishers’ concerns. In December, the company announced a partnership with Axel Springer — the parent company of Business Insider, Politico, and European outlets Bild and Welt — which would license its content to OpenAI in return for a fee.

The financial terms of the deal weren’t disclosed.

In its lawsuit Wednesday, the Times accused Microsoft and OpenAI of creating a business model based on “mass copyright infringement,” stating that the companies’ AI systems were “used to create multiple reproductions of The Times’s intellectual property for the purpose of creating the GPT models that exploit and, in many cases, retain large portions of the copyrightable expression contained in those works.”

Publishers are concerned that, with the advent of generative AI chatbots, fewer people will click through to news sites, resulting in shrinking traffic and revenues.

The Times included in the suit numerous examples of GPT-4 producing altered versions of material published by the newspaper.

In one example, the filing shows OpenAI’s software producing almost identical text to a Times article about predatory lending practices in New York City’s taxi industry.

But in OpenAI’s version, GPT-4 excludes a critical piece of context about the sum of money the city made selling taxi medallions and collecting taxes on private sales.

In its suit, the Times said Microsoft and OpenAI’s GPT models “directly compete with Times content.”

The AI models also limited the Times’ commercial opportunities by altering its content. For example, the publisher alleges GPT outputs remove links to products featured in its Wirecutter app, a product reviews platform, “thereby depriving The Times of the opportunity to receive referral revenue and appropriating that opportunity for Defendants.”

The Times also alleged Microsoft and OpenAI models produce content similar to that generated by the newspaper, and that their use of its content to train LLMs without consent “constitutes free-riding on The Times’s significant efforts and investment of human capital to gather this information.”

The Times said Microsoft and OpenAI’s LLMs “can generate output that recites Times content verbatim, closely summarizes it, and mimics its expressive style,” “wrongly attribute false information to The Times,” and “deprive The Times of subscription, licensing, advertising, and affiliate revenue.”

CNBC’s Rohan Goswami contributed to this report.


Nvidia announces new AI chips months after latest launch as market competition heats up


Jensen Huang, co-founder and chief executive officer of Nvidia Corp., during the Nvidia GPU Technology Conference (GTC) in San Jose, California, US, on Tuesday, March 19, 2024. 

David Paul Morris | Bloomberg | Getty Images

Nvidia on Sunday unveiled its next generation of artificial intelligence chips to succeed the previous model, which was announced just months earlier in March.

Nvidia CEO Jensen Huang announced the new AI chip architecture, dubbed “Rubin,” ahead of the COMPUTEX tech conference in Taipei.

Rubin comes months after the March announcement of the upcoming “Blackwell” model, which is still in production and expected to ship to customers later in 2024.

Huang’s announcement of Rubin appears to quicken the company’s already-accelerated pace of AI chip advancement.

Nvidia has pledged to release new AI chip models on a “one-year rhythm,” as Huang put it on Sunday. The company had previously been operating on a slower two-year update timeline for chips.

The turnaround from Blackwell to Rubin was a matter of less than three months, underscoring the competitive frenzy in the AI chip market and Nvidia’s sprint to preserve its dominant spot.

AMD and Intel are two major competitors working to catch up, though their gross margins trailed Nvidia’s in the most recent fiscal quarter. Companies like Microsoft, Google and Amazon are also vying for Nvidia’s top spot, even as they are simultaneously some of Nvidia’s biggest patrons. A flurry of startups are also working to enter the space.

“Today, we’re at the cusp of a major shift in computing,” Huang said Sunday. “With our innovations in AI and accelerated computing, we’re pushing the boundaries of what’s possible and driving the next wave of technological advancement.”

The Rubin chip platform will have new GPUs, the crucial graphics processing technology that helps train and deploy AI systems. It will come with other new features like a central processor called “Vera,” though the Sunday announcement did not provide many details.

Shares of Nvidia were relatively flat at Friday’s market close, trading at $1,096.


Nvidia dominates the AI chip market, but there’s more competition than ever



Nvidia’s 27% rally in May pushed its market cap to $2.7 trillion, behind only Microsoft and Apple among the most-valuable public companies in the world. The chipmaker reported a tripling in year-over-year sales for the third straight quarter, driven by soaring demand for its artificial intelligence processors.

Mizuho Securities estimates that Nvidia controls between 70% and 95% of the market for AI chips used for training and deploying models like OpenAI’s GPT. Underscoring Nvidia’s pricing power is a 78% gross margin, a stunningly high number for a hardware company that has to manufacture and ship physical products.

Rival chipmakers Intel and Advanced Micro Devices reported gross margins in the latest quarter of 41% and 47%, respectively.

Nvidia’s position in the AI chip market has been described as a moat by some experts. Its flagship AI graphics processing units (GPUs), such as the H100, coupled with the company’s CUDA software led to such a head start on the competition that switching to an alternative can seem almost unthinkable.

Still, Nvidia CEO Jensen Huang, whose net worth has swelled from $3 billion to about $90 billion in the past five years, has said he’s “worried and concerned” about his 31-year-old company losing its edge. He acknowledged at a conference late last year that there are many powerful competitors on the rise.

“I don’t think people are trying to put me out of business,” Huang said in November. “I probably know they’re trying to, so that’s different.”

Nvidia has committed to releasing a new AI chip architecture every year, rather than every other year as was the case historically, and to putting out new software that could more deeply entrench its chips in AI software.

But Nvidia’s GPU isn’t alone in being able to run the complex math that underpins generative AI. If less powerful chips can do the same work, Huang might be justifiably paranoid.

The transition from training AI models to what’s called inference — or deploying the models — could also give companies an opportunity to replace Nvidia’s GPUs, especially if they’re less expensive to buy and run. Nvidia’s flagship chip costs roughly $30,000 or more, giving customers plenty of incentive to seek alternatives.

“Nvidia would love to have 100% of it, but customers would not love for Nvidia to have 100% of it,” said Sid Sheth, co-founder of aspiring rival D-Matrix. “It’s just too big of an opportunity. It would be too unhealthy if any one company took all of it.”

Founded in 2019, D-Matrix plans to release a semiconductor card for servers later this year that aims to reduce the cost and latency of running AI models. The company raised $110 million in September.

In addition to D-Matrix, companies ranging from multinational corporations to nascent startups are fighting for a slice of the AI chip market that could reach $400 billion in annual sales in the next five years, according to market analysts and AMD. Nvidia has generated about $80 billion in revenue over the past four quarters, and Bank of America estimates the company sold $34.5 billion in AI chips last year.

Many companies taking on Nvidia’s GPUs are betting that a different architecture or certain trade-offs could produce a better chip for particular tasks. Device makers are also developing technology that could end up doing a lot of the computing for AI that’s currently taking place in large GPU-based clusters in the cloud.

“Nobody can deny that today Nvidia is the hardware you want to train and run AI models,” Fernando Vidal, co-founder of 3Fourteen Research, told CNBC. “But there’s been incremental progress in leveling the playing field, from hyperscalers working on their own chips, to even little startups, designing their own silicon.”

AMD CEO Lisa Su wants investors to believe there’s plenty of room for many successful companies in the space.

“The key is that there are a lot of options there,” Su told reporters in December, when her company launched its most recent AI chip. “I think we’re going to see a situation where there’s not only one solution, there will be multiple solutions.”

Other big chipmakers

Lisa Su displays an AMD Instinct MI300 chip as she delivers a keynote address at CES 2023 in Las Vegas, Nevada, on Jan. 4, 2023.

David Becker | Getty Images

Nvidia’s top customers


One potential challenge for Nvidia is that it’s competing against some of its biggest customers. Cloud providers including Google, Microsoft and Amazon are all building processors for internal use. The Big Tech three, plus Oracle, make up over 40% of Nvidia’s revenue.

Amazon introduced its own AI-oriented chips in 2018, under the Inferentia brand name. Inferentia is now on its second version. In 2021, Amazon Web Services debuted Trainium, targeted at training. Customers can’t buy the chips, but they can rent systems through AWS, which markets them as more cost-efficient than Nvidia’s.

Google is perhaps the cloud provider most committed to its own silicon. The company has been using what it calls Tensor Processing Units (TPUs) since 2015 to train and deploy AI models. In May, Google announced the sixth version of its chip, Trillium, which the company said was used to develop its models, including Gemini and Imagen.

Google also uses Nvidia chips and offers them through its cloud.

Microsoft isn’t as far along. The company said last year that it was building its own AI accelerator and processor, called Maia and Cobalt, respectively.

Meta isn’t a cloud provider, but the company needs massive amounts of computing power to run its software and website and to serve ads. While the Facebook parent company is buying billions of dollars worth of Nvidia processors, it said in April that some of its homegrown chips were already in data centers and enabled “greater efficiency” compared to GPUs.

JPMorgan analysts estimated in May that the market for building custom chips for big cloud providers could be worth as much as $30 billion, with potential growth of 20% per year.

Startups

Cerebras’ WSE-3 chip is one example of new silicon from upstarts designed to run and train artificial intelligence.

Cerebras Systems

Venture capitalists see opportunities for emerging companies to jump into the game. They invested $6 billion in AI semiconductor companies in 2023, up slightly from $5.7 billion a year earlier, according to data from PitchBook.

It’s a tough area for startups as semiconductors are expensive to design, develop and manufacture. But there are opportunities for differentiation.

For Cerebras Systems, an AI chipmaker in Silicon Valley, the focus is on basic operations and bottlenecks for AI, versus the more general purpose nature of a GPU. The company was founded in 2015 and was valued at $4 billion during its most recent fundraising, according to Bloomberg.

The Cerebras chip, WSE-3, puts GPU capabilities as well as central processing and additional memory into a single device, which is better for training large models, said CEO Andrew Feldman.

“We use a giant chip, they use a lot of little chips,” Feldman said. “They’ve got challenges of moving data around, we don’t.”

Feldman said his company, which counts Mayo Clinic, GlaxoSmithKline and the U.S. military as clients, is winning business for its supercomputing systems even when going up against Nvidia.

“There’s ample competition and I think that’s healthy for the ecosystem,” Feldman said.

Sheth from D-Matrix said his company plans to release a card with its chiplet later this year that will allow for more computation in memory, as opposed to on a chip like a GPU. D-Matrix’s product can be slotted into an AI server alongside existing GPUs, but it takes work off of Nvidia chips and helps to lower the cost of generative AI.

Customers “are very receptive and very incentivized to enable a new solution to come to market,” Sheth said.

Apple and Qualcomm

Apple iPhone 15 series devices are displayed for sale at The Grove Apple retail store on release day in Los Angeles, California, on September 22, 2023. 

Patrick T. Fallon | Afp | Getty Images



Software stocks got pummeled this week after a cluster of troubling earnings reports


Salesforce executives told investors that deals are shrinking or getting delayed. Dell said its margin is getting smaller. Okta highlighted macroeconomic challenges. And Veeva’s CEO said on his company’s earnings call that generative artificial intelligence has been “a competing priority” for customers.

Add it all up and it was a brutal week for software and enterprise tech.

Salesforce shares plunged almost 20% on Thursday, the biggest drop since 2004, after the cloud software vendor posted weaker-than-expected revenue and issued disappointing guidance. CEO Marc Benioff said Salesforce grew quickly in the Covid era as companies rushed to buy products for remote work. Then customers had to integrate all the new technology and, eventually, rationalize their spending.

“Every enterprise software company kind of has adjusted” since the pandemic, Benioff said on his company’s earnings call. Businesses that have reported lately are “all basically saying that same thing in different ways.”

Software makers MongoDB, SentinelOne, UiPath and Veeva all pulled down their full-year revenue forecasts this week.

The WisdomTree Cloud Computing Fund, an exchange-traded fund that tracks cloud stocks, slid 5% this week, the sharpest decline since January. Paycom, GitLab, Confluent, Snowflake and ServiceNow all lost at least 10% of their value in the downdraft.

Dell, which sells PCs and data center hardware to businesses, bumped up its full-year forecast on Thursday and said its backlog for AI servers had grown to $3.8 billion from $2.9 billion three months ago. But the growing portion of these servers in the product mix, along with higher input costs, will cause the company’s gross margin to narrow by 150 basis points for the year.

Dell shares slid 13% for the week after hitting fresh highs. The company has been viewed as a beneficiary of the generative AI wave as businesses step up their hardware purchases. Expectations were “elevated,” Barclays analysts wrote in a note on the results.

Okta’s stock price fell almost 9% for the week, with analysts citing a weaker-than-expected subscription backlog. The identity software maker said economic conditions are hurting its ability to sign up new customers and get existing ones to expand their purchases.

“Macroeconomic headwinds are still out there,” Okta finance chief Brett Tighe said on the company’s earnings call.

One reading of inflation this week came in slightly higher than expected. U.S. central bankers are holding steady on the benchmark interest rate, which has been at a 23-year high.

At UiPath, a developer of automation software, the pace of business slumped in late March and in April, in part because of the economy, co-founder Daniel Dines told analysts on Wednesday. Customers were also becoming more hesitant to commit to multi-year deals, said Dines, who is replacing former Google executive Rob Enslin as CEO on June 1, just months after stepping down as co-CEO.

Cybersecurity software vendor SentinelOne is seeing a similar trend.

“There’s no question that buying habits are changing,” SentinelOne CEO Tomer Weingarten told CNBC on Friday, adding that “how customers are evaluating software” is also changing. His company’s stock price plunged 22% for the week after guidance missed estimates.

Then there’s the impact of AI, which is causing businesses to reprioritize.

Veeva CEO Peter Gassner cited “disruption in large enterprises as they work through their plans for AI.” Veeva, which sells life sciences software, lost almost 15% of its value this week on concerns about spending in the back half of the year.

Gassner said on the earnings call that generative AI represents “a competing priority” for Veeva clients.

The news wasn’t bad across the board. Zscaler’s stock jumped 8.5% on Friday after the security software provider beat expectations for the quarter and raised its full-year forecast.

“We expect demand to remain strong as an increasing number of enterprises are planning to adopt our platform for better cyber and data protection,” CEO Jay Chaudhry said on the company’s earnings call.

—CNBC’s Ari Levy contributed to this report.

