On Wednesday, Google previewed what could be one of the largest changes to its search engine in the company's history.
Google will use AI models to combine and summarize information from around the web in response to search queries, a product it calls Search Generative Experience, or SGE.
Instead of “ten blue links,” the phrase that describes Google’s usual search results, Google will show some users paragraphs of AI-generated text and a handful of links at the top of the results page.
The new AI-based search is being tested now for a select group of users and isn’t widely available yet. But website publishers are already worried that if it becomes Google’s default way of presenting search results, it could hurt them by sending fewer visitors to their sites and keeping them on Google.com.
The controversy highlights a long-running tension between Google and the websites it indexes, with a new artificial intelligence twist. Publishers have long worried that Google repurposes their verbatim content in snippets on its own website, but now Google is using advanced machine learning models that scrape large parts of the web to “train” the software to spit out human-like text and responses.
Rutledge Daugette, CEO of TechRaptor, a site focusing on gaming news and reviews, said that Google’s move was made without considering the interests of publishers and Google’s AI amounts to lifting content.
“Their focus is on zero-click searches that use information from publishers and writers who spend time and effort creating quality content, without offering any benefit other than the potential of a click,” Daugette told CNBC. “Thus far, AI has been quick to reuse others’ information with zero benefit to them, and in cases like Google Bard doesn’t even offer attribution as to where the information it’s using came from.”
Luther Lowe, a longtime Google critic and chief of public policy at Yelp, said that Google’s update is part of a decades-long strategy to keep users on the site for longer, instead of sending them to the sites that originally hosted the information.
“The exclusionary self-preferencing of Google’s ChatGPT clone into search is the final chapter of bloodletting the web,” Lowe told CNBC.
According to Search Engine Land, a news website that closely tracks changes to Google’s search engine, the AI-generated results are displayed above the organic search results in testing so far.
SGE comes in a differently colored box — green in the example — and includes boxed links to three websites on the right side. In Google’s primary example, all three of the website headlines were cut off.
Google says that the information isn’t taken directly from the websites, but is instead corroborated by the links. Search Engine Land said the SGE approach was an improvement and a “healthier” way to link than Google’s Bard chatbot, which rarely linked to publisher websites.
Some publishers are wondering whether they can prevent AI firms such as Google from scraping their content to train their models. Companies such as Stability AI, the firm behind Stable Diffusion, are already facing lawsuits from data owners, but the right to scrape web data for AI remains an undecided legal frontier. Other companies, such as Reddit, have announced plans to charge for access to their data.
Leading the charge in the publishing world is Barry Diller, chairman of IAC, which owns websites including Allrecipes, People magazine and The Daily Beast.
“If all the world’s information is able to be sucked up into this maw, and then essentially repackaged in declarative sentences, in what’s called chat, but it isn’t chat — as many grafs as you want, 25 on any subject — there will be no publishing, because it will be impossible,” Diller said last month at a conference.
“What you have to do is get the industry to say that you cannot scrape our content, until you work out systems where the publisher gets some avenue towards payment,” Diller continued, saying that Google will face this problem.
Diller says he believes publishers can sue AI firms under copyright law and that current “fair use” restrictions need to be redefined. The Financial Times reported on Wednesday that Diller is leading a group of publishers “that is going to say we are going to change copyright law if necessary.” An IAC spokesperson declined to make Diller available for an interview.
One challenge facing publishers is confirming that their content is being used by an AI. Google has not revealed the training sources for PaLM 2, the large language model that underpins SGE. Daugette says that while he’s seen examples of quotes and review scores from competitors repurposed on Bard without attribution, it’s hard to tell when the information comes from his site without directly linked sources.
Google didn’t respond to a request for comment. But Google VP of Research Zoubin Ghahramani addressed the question in a media briefing earlier this week: “PaLM 2 is trained on a wide range of openly available data on the internet and we obviously value the health of the web ecosystem. And that’s really part of the way we think about how we build our products, to ensure that we have a healthy ecosystem where creators are a part of that thriving ecosystem.”
Daugette says that Google’s moves make being an independent publisher tough.
“I think it’s really frustrating for our industry to have to worry about our hard work being taken, when so many colleagues are being laid off,” Daugette said. “It’s just not okay.”
Meta Platforms tried to poach OpenAI employees by offering signing bonuses as high as $100 million, with even larger annual compensation packages, OpenAI chief executive Sam Altman said.
While Meta had sought to hire “a lot of people” from OpenAI, “so far none of our best people have decided to take them up on that,” Altman said, speaking on the “Uncapped” podcast, which is hosted by his brother.
“I’ve heard that Meta thinks of us as their biggest competitor,” he said. “Their current AI efforts have not worked as well as they have hoped and I respect being aggressive and continuing to try new things.”
Meta did not immediately respond to a request for comment from CNBC.
Meta CEO Mark Zuckerberg is personally trying to assemble a top artificial intelligence team for the company’s “superintelligence” AI lab, and has invested heavily in AI through its Meta AI research division, which also oversees its Llama series of open-source large language models.
The moves come after Meta had once again delayed the release of its latest flagship AI model due to concerns about its capabilities, according to a report from the Wall Street Journal.
Meanwhile, sources have previously told CNBC that Zuckerberg has become so frustrated with Meta’s standing in AI that he’s willing to invest billions in top talent.
Last week Alexandr Wang, founder of Scale AI, announced he was leaving for Meta as part of a deal that saw the Facebook parent dish out $14.3 billion for a 49% stake in the AI startup. Wang added that a small number of Scale AI employees would also join Meta as part of the agreement.
The Times had previously reported that Wang would head a research lab pursuing “superintelligence,” an AI system that surpasses human intelligence.
The company has also recently poached other top talent, including Jack Rae, a principal researcher at Google’s AI research laboratory DeepMind, according to a report from Bloomberg. The report added that Zuckerberg had been directly involved with the recruitment efforts.
Speaking on the podcast, which was released on Tuesday, Altman said that Meta’s strategy of offering a large, upfront, guaranteed compensation would detract from the actual work and not set up a winning culture.
“I think that there’s a lot of people, and Meta will be a new one, that are saying ‘we’re just going to try to copy OpenAI,'” he added. “That basically never works. You’re always going to where your competitor was, and you don’t build up a culture of learning what it’s like to innovate.”
However, spending big on startups and their talent is nothing new to the AI space. Former Apple chief design officer Jony Ive joined OpenAI after the company acquired Ive’s AI devices startup io through a $6.4 billion all-equity deal last month.
Some tech analysts have also pushed back against the notion that Meta has been missing the mark on AI.
“They basically built the rails for open source AI development, and so much of what is happening in AI is being built on Meta,” Daniel Newman, CEO at Futurum Group, told CNBC’s “Power Lunch” last week.
Open-source generally refers to software in which the source code is made freely available on the web for possible modification and redistribution. Llama’s open-source characteristics have allowed many third-party applications to be built on top of it.
Newman added that Meta’s massive investments, such as in Scale AI, will continue to push it forward in training its behemoth models.
For a third time since taking office in January, President Donald Trump plans to extend a deadline that would require China’s ByteDance to divest TikTok’s U.S. business.
“President Trump will sign an additional Executive Order this week to keep TikTok up and running,” White House Press Secretary Karoline Leavitt said in a statement. “As he has said many times, President Trump does not want TikTok to go dark. This extension will last 90 days, which the Administration will spend working to ensure this deal is closed so that the American people can continue to use TikTok with the assurance that their data is safe and secure.”
ByteDance was nearing a June 19 deadline to sell TikTok’s U.S. operations in order to satisfy a national security law that the Supreme Court upheld just days before Trump’s second presidential inauguration. Under the law, app store operators like Apple and Google and internet service providers would be penalized for supporting TikTok.
ByteDance originally faced a Jan. 19 deadline to comply with the national security law, but Trump signed an executive order when he first took office that pushed the deadline to April 5. Trump extended the deadline for the second time a day before that April mark.
Trump told NBC News in May that he would extend the TikTok deadline again if no deal was reached, and he reiterated his plans on Thursday.
Prior to Trump signing the first executive order, TikTok briefly went offline in the U.S. for a day, only to return after the president’s announcement. Apple and Google also removed TikTok from the App Store and Google Play during TikTok’s initial U.S. shutdown, but reinstated the app to their respective stores in February.
Multiple parties, including Oracle, AppLovin and billionaire Frank McCourt’s Project Liberty consortium, have expressed interest in buying TikTok’s U.S. operations. It’s unclear whether the Chinese government would approve a deal.
— CNBC’s Kevin Breuninger contributed to this report
Amazon Web Services is set to announce an update to its Graviton4 chip that includes 600 gigabytes per second of network bandwidth, which the company calls the highest offering in the public cloud.
Ali Saidi, a distinguished engineer at AWS, likened the speed to a machine reading 100 music CDs a second.
Graviton4, a central processing unit, or CPU, is one of many chip products to come out of Amazon’s Annapurna Labs in Austin, Texas. The chip is a win for the company’s custom silicon strategy, putting it up against traditional semiconductor players like Intel and AMD.
At AWS’s re:Invent 2024 conference last December, the company announced Project Rainier, an AI supercomputer built for startup Anthropic. AWS has put $8 billion into backing Anthropic.
AWS Senior Director for Customer and Project Engineering Gadi Hutt said Amazon is looking to reduce AI training costs and provide an alternative to Nvidia’s expensive graphics processing units, or GPUs.
Anthropic’s Claude Opus 4 AI model is trained on Trainium2 chips, according to AWS, and Project Rainier is powered by over half a million of them, an order that would traditionally have gone to Nvidia.
Hutt said that while Nvidia’s Blackwell is a higher-performing chip than Trainium2, the AWS chip offers better cost performance.
“Trainium3 is coming up this year, and it’s doubling the performance of Trainium2, and it’s going to save energy by an additional 50%,” he said.
The demand for these chips is already outpacing supply, according to Rami Sinno, director of engineering at AWS’ Annapurna Labs.
“Our supply is very, very large, but every single service that we build has a customer attached to it,” Sinno said.
With Graviton4’s upgrade on the horizon and Project Rainier’s Trainium chips, Amazon is demonstrating its broader ambition to control the entire AI infrastructure stack, from networking to training to inference.
And as more major AI models like Claude 4 prove they can train successfully on non-Nvidia hardware, the question isn’t whether AWS can compete with the chip giant — it’s how much market share it can take.
The release schedule for the Graviton4 update will be provided by the end of June, according to an AWS spokesperson.