There's a water crisis looming. Big Tech and AI could make it worse

DUBAI, United Arab Emirates — A global rush for the next wave of generative artificial intelligence is increasing public scrutiny on an often-overlooked but critically important environmental issue: Big Tech’s expanding water footprint.

Tech giants, including the likes of Microsoft and Alphabet-owned Google, have recently reported a substantial upswing in their water consumption, and researchers say one of the main culprits is the race to capitalize on the next wave of AI.

Shaolei Ren, a researcher at the University of California, Riverside, published a study in April investigating the resources needed to run buzzy generative AI models, such as OpenAI’s ChatGPT.

Ren and his colleagues found that ChatGPT gulps 500 milliliters of water (roughly the amount of water in a standard 16-ounce bottle) for every 10 to 50 prompts, depending on when and where the AI model is deployed.

With hundreds of millions of monthly users submitting questions to the popular chatbot, it quickly becomes clear just how “thirsty” AI models can be.
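
As a rough back-of-the-envelope illustration of that scale (a sketch only: the per-prompt water figures come from the study’s reported range, while the user count and per-user prompt volume below are purely hypothetical assumptions):

```python
# Illustrative estimate of ChatGPT's monthly water footprint, based on the
# UC Riverside study's figure of ~500 ml of water per 10-50 prompts.
# The usage numbers are assumptions for illustration, not reported data.

LITERS_PER_BATCH = 0.5           # ~500 ml per batch of prompts (per the study)
PROMPTS_PER_BATCH_LOW = 10       # most water-intensive end of the range
PROMPTS_PER_BATCH_HIGH = 50      # least water-intensive end of the range

monthly_users = 100_000_000      # assumed: "hundreds of millions" of users
prompts_per_user = 20            # assumed average prompts per user per month

total_prompts = monthly_users * prompts_per_user

low_liters = total_prompts / PROMPTS_PER_BATCH_HIGH * LITERS_PER_BATCH
high_liters = total_prompts / PROMPTS_PER_BATCH_LOW * LITERS_PER_BATCH

print(f"Estimated monthly water use: {low_liters / 1e6:.0f}-{high_liters / 1e6:.0f} million liters")
```

Under those assumed figures, a single month of chatbot traffic would consume on the order of tens of millions of liters of water.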

The study’s authors warned that if the growing water footprint of AI models is not sufficiently addressed, the issue could become a major roadblock to the socially responsible and sustainable use of AI in the future.

People take part in a protest called by Uruguay’s central union PIT-CNT in “defense of water,” against the national authorities’ handling of the shortage of drinking water reserves, in Montevideo on May 31, 2023.

Eitan Abramovich | Afp | Getty Images

ChatGPT creator OpenAI, part-owned by Microsoft, did not respond to a request to comment on the study’s findings.

“In general, the public is getting more knowledgeable and aware of the water issue and if they learn that the Big Tech’s are taking away their water resources and they are not getting enough water, nobody will like it,” Ren told CNBC via videoconference.

“I think we are going to see more clashes over the water usage in the coming years as well, so this type of risk will have to be taken care of by the companies,” he added.

‘A hidden cost’

Data centers are part of the lifeblood of Big Tech — and a lot of water is required to keep the power-hungry servers cool and running smoothly.

For Meta, it’s these warehouse-scale data centers that generate not only the highest percentage of its water use but also the lion’s share of its energy use and greenhouse gas emissions.

In July, protesters took to the streets of Uruguay’s capital to push back against Google’s plan to build a data center. The proposal sought to use vast quantities of water at a time when the South American country was suffering its worst drought in 74 years.

Google reportedly said at the time the project was still at an exploratory phase and stressed that sustainability remained at the heart of its mission.


In Microsoft’s latest environmental sustainability report, the U.S. tech company disclosed that its global water consumption rose by more than a third from 2021 to 2022, climbing to nearly 1.7 billion gallons.

That means Microsoft’s annual water use would be enough to fill more than 2,500 Olympic-sized swimming pools.
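
The pool comparison is simple arithmetic (a quick sketch, assuming the standard Olympic pool volume of roughly 2,500 cubic meters, or about 660,000 U.S. gallons):

```python
# Convert Microsoft's reported 2022 water consumption into Olympic-sized pools.
MICROSOFT_WATER_GALLONS = 1_700_000_000  # "nearly 1.7 billion gallons" in 2022
OLYMPIC_POOL_GALLONS = 660_000           # ~2,500 cubic meters per pool (assumed)

pools = MICROSOFT_WATER_GALLONS / OLYMPIC_POOL_GALLONS
print(f"Roughly {pools:,.0f} Olympic-sized swimming pools")  # ~2,576
```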

For Google, meanwhile, total water consumption at its data centers and offices came in at 5.6 billion gallons in 2022, a 21% increase on the year before.

Both companies are working to reduce their water footprint and become “water positive” by the end of the decade, meaning that they aim to replenish more water than they use.


It’s notable, however, that their latest water consumption figures were disclosed before the launch of their own respective ChatGPT competitors. The computing power needed to run Microsoft’s Bing Chat and Google Bard could mean significantly higher levels of water use over the coming months.

“With AI, we’re seeing the classic problem with technology in that you have efficiency gains but then you have rebound effects with more energy and more resources being used,” said Somya Joshi, head of division: global agendas, climate and systems at the Stockholm Environment Institute.

“And when it comes to water, we’re seeing an exponential rise in water use just for supplying cooling to some of the machines that are needed, like heavy computation servers, and large-language models using larger and larger amounts of data,” Joshi told CNBC during the COP28 climate summit in the United Arab Emirates.

“So, on one hand, companies are promising to their customers more efficient models … but this comes with a hidden cost when it comes to energy, carbon and water,” she added.

How are tech firms reducing their water footprint?

A spokesperson for Microsoft told CNBC that the company is investing in research to measure the energy and water use and carbon impact of AI, while working on ways to make large systems more efficient.

“AI will be a powerful tool for advancing sustainability solutions, but we need a plentiful clean energy supply globally to power this new technology, which has increased consumption demands,” a spokesperson for Microsoft told CNBC via email.

“We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power datacenters, purchasing renewable energy, and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030,” they added.

Aerial view of the proposed site of the Meta Platforms Inc. data center outside Talavera de la Reina, Spain, on Monday, July 17, 2023. Meta is planning to build a €1 billion ($1.1 billion) data center, which it expects will use about 665 million liters (176 million gallons) of water a year, and up to 195 liters per second during “peak water flow,” according to a technical report.

Paul Hanna | Bloomberg | Getty Images

Separately, a Google spokesperson told CNBC that research shows that while AI computing demand has dramatically increased, the energy needed to power this technology is rising “at a much slower rate than many forecasts have predicted.”

“We are using tested practices to reduce the carbon footprint of workloads by large margins; together these principles can reduce the energy of training a model by up to 100x and emissions by up to 1000x,” the spokesperson said.

“Google data centers are designed, built and operated to maximize efficiency – compared with five years ago, Google now delivers around 5X as much computing power with the same amount of electrical power,” they continued.

“To support the next generation of fundamental advances in AI, our latest TPU v4 [supercomputer] is proven to be one of the fastest, most efficient, and most sustainable ML [machine learning] infrastructure hubs in the world.”


OpenAI signs $38 billion compute deal with Amazon, partnering with cloud leader for first time


OpenAI has signed a deal to buy $38 billion worth of capacity from Amazon Web Services, its first contract with the leader in cloud infrastructure and the latest sign that the $500 billion artificial intelligence startup is no longer reliant on Microsoft.

Under the agreement announced on Monday, OpenAI will immediately begin running workloads on AWS infrastructure, tapping hundreds of thousands of Nvidia’s graphics processing units (GPUs) in the U.S., with plans to expand capacity in the coming years.

Amazon stock climbed about 5% following the news.

The first phase of the deal will use existing AWS data centers, and Amazon will eventually build out additional infrastructure for OpenAI.

“It’s completely separate capacity that we’re putting down,” said Dave Brown, vice president of compute and machine learning services at AWS, in an interview. “Some of that capacity is already available, and OpenAI is making use of that.”


OpenAI has been on a dealmaking spree of late, announcing roughly $1.4 trillion worth of buildout agreements with companies including Nvidia, Broadcom, Oracle and Google — prompting skeptics to warn of an AI bubble and question whether the country has the power and resources needed to turn the ambitious promises into reality.

Until this year, OpenAI had an exclusive cloud agreement with Microsoft, which first backed the company in 2019 and has invested a total of $13 billion. In January, Microsoft said it would no longer be the exclusive cloud provider for OpenAI, and was moving to an arrangement where it would have right of first refusal for new requests.

Last week, Microsoft’s preferential status expired under its newly negotiated commercial terms with OpenAI, freeing the ChatGPT creator to partner more widely with the other hyperscalers. Even before that, OpenAI forged cloud deals with Oracle and Google, but AWS is by far the market leader.

“Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said in Monday’s release. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

OpenAI will still be spending heavily with Microsoft, reaffirming that commitment by saying last week that it will purchase an incremental $250 billion of Azure services.


For Amazon, the pact is significant both because of the size and scale of the deal itself and because the cloud giant has close ties to OpenAI rival Anthropic. Amazon has invested billions of dollars in Anthropic, and is currently constructing an $11 billion data center campus in New Carlisle, Indiana, that’s designed exclusively for Anthropic workloads.

“The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads,” AWS CEO Matt Garman said in the release.

In its earnings report last week, Amazon reported more than 20% year-over-year revenue growth at AWS, beating analyst estimates. But growth was faster at Microsoft and Google, which reported cloud expansion of 40% and 34%, respectively.

Starting with Nvidia

The current agreement with OpenAI is explicitly for use of Nvidia chips, including two popular Blackwell models, but there’s potential to incorporate additional silicon down the road. Amazon’s custom-built Trainium chip is being used by Anthropic in the new facility.

“We like Trainium because we’re able to give customers something that gives them better price performance and honestly gives them choice,” Brown said, adding that he can’t provide any details on “anything we’ve done with OpenAI on Trainium at this point.”

The infrastructure will support both inference — such as powering ChatGPT’s real-time responses — and training of next-generation frontier models. OpenAI can expand with AWS as needed over the next seven years, but no plans beyond 2026 have been finalized.

OpenAI CEO Sam Altman (L) shakes hands with Microsoft Chief Technology Officer and Executive VP of Artificial Intelligence Kevin Scott during the Microsoft Build conference at the Seattle Convention Center Summit Building in Seattle, Washington, U.S., on May 21, 2024.

Jason Redmond | Afp | Getty Images

OpenAI’s foundation models, including so-called open-weight options, are already available on Bedrock, AWS’s managed service for accessing leading AI systems.

Companies including Peloton, Thomson Reuters, Comscore, and Triomics use OpenAI models on AWS for tasks ranging from coding and mathematical problem solving to scientific analysis and agentic workflows.

Monday’s announcement establishes a more direct relationship.

“As part of this deal, OpenAI is a customer of AWS,” Brown said. “They’ve committed to buying compute capacity from us, and we’re charging OpenAI for that capacity. It’s very, very straightforward.”

For OpenAI, the most highly valued private AI company, the AWS agreement is another step in getting ready to eventually go public. By diversifying its cloud partners and locking in long-term capacity across providers, OpenAI is signaling both independence and operational maturity.

Altman acknowledged in a recent livestream that an IPO is “the most likely path” given OpenAI’s capital needs. CFO Sarah Friar has echoed that sentiment, framing the recent corporate restructuring as a necessary step toward going public.

WATCH: AWS CEO Matt Garman on Amazon’s massive new AI data center for Anthropic



MongoDB CEO Dev Ittycheria steps down, replaced by Cloudflare executive CJ Desai


MongoDB CEO Dev Ittycheria arrives at the Allen & Co. Media and Technology Conference in Sun Valley, Idaho, on July 9, 2025.

David Paul Morris | Bloomberg | Getty Images

Database software maker MongoDB said on Monday that CEO Dev Ittycheria is stepping down from the top job after an 11-year run.

Chirantan “CJ” Desai, who has spent the past year as president of product and engineering at Cloudflare, is replacing Ittycheria, effective Nov. 10, MongoDB said. Ittycheria will remain on the company’s board.

“Earlier this year, I would say as part of our normal succession planning process, the board asked me about my long-term plans and whether I could commit for another five years as CEO,” Ittycheria told CNBC in an interview. “I thought long and hard about it, and I talked to my family, I talked to the board and ultimately realized I couldn’t make that kind of decision.”

Before joining MongoDB, Ittycheria was president of BMC, which bought his company BladeLogic for $854 million in 2008. As BladeLogic’s co-founder and CEO, Ittycheria took the company public in 2007. He’s also been an investor at venture firms OpenView and Greylock.

Ittycheria led MongoDB’s IPO in 2017, three years after taking the helm. The company won over individual software developers thanks to a database architecture that could store a variety of data in documents, challenging market incumbents like Oracle.

Under Ittycheria, the company prioritized cloud subscriptions, landed multi-year deals, partnered with rival cloud providers Amazon and Microsoft and expanded the software’s capabilities into generative artificial intelligence.

MongoDB’s stock closed on Friday at $359.82, representing a fifteenfold gain since the IPO and lifting the company’s market cap to almost $30 billion. MongoDB’s net loss in the July quarter narrowed to $47 million from $54.5 million a year earlier, while revenue rose 24% to $591 million.

Cloudflare said in a filing on Thursday that Desai would step down on Nov. 7 to become CEO “at another notable, publicly-traded company.” Desai previously served as operating chief at ServiceNow, resigning in July 2024 after the software company found a policy violation related to the hiring of the U.S. Army’s chief information officer. Earlier in his career, Desai held leadership positions at EMC and Symantec.

“We talked to people close to ServiceNow, as well as other people who know CJ really well, and we felt very, very comfortable that CJ is the right person to lead MongoDB in this next era,” Ittycheria said.

Desai, whose first job out of college was at Oracle, said he will split his time between New York and the San Francisco area.

MongoDB also said it expects to exceed the high end of its guidance ranges for revenue and adjusted earnings per share in the fiscal third quarter. The top ends of those ranges were 79 cents in adjusted earnings per share and $592 million in revenue.

Desai said he’s “looking forward to grow MongoDB to $5 billion-plus in a durable, profitable way, in revenues, and most importantly, to be the gold standard for modern database technology, no matter what kind of workloads exist.” He did not offer a timeline for the revenue goal.

Executives will discuss the leadership change on a conference call with analysts at 10 a.m. ET.

WATCH: MongoDB CEO Dev Ittycheria on Q2 results: The opportunity in front of us is massive



Nvidia stock climbs 3% as U.S. approves chip sale to the UAE under Microsoft deal


Jensen Huang, CEO of Nvidia, speaks during the 2025 Asia-Pacific Economic Cooperation (APEC) CEO Summit in Gyeongju, South Korea, October 31, 2025.

Kim Soo-hyeon | Reuters

Microsoft said Monday it has secured export licenses to ship Nvidia chips to the United Arab Emirates in a move that could accelerate the Gulf’s lofty AI ambitions.

The tech giant said it is the first company under U.S. President Donald Trump‘s administration to secure such licenses from the Commerce Department and that the approval, granted in September, was based on “updated and stringent technology safeguards.”

The licenses enable the firm to ship the equivalent of 60,400 additional A100 chips, supplied in the form of Nvidia’s more advanced GB300 GPUs.

“While the chips are powerful and the numbers are large, more important is their positive impact across the UAE,” Microsoft said in a blog post. “We’re using these GPUs to provide access to advanced AI models from OpenAI, Anthropic, open-source providers, and Microsoft itself.”

Nvidia shares climbed 3% Monday. Microsoft stock rose slightly.

Azad Zangana, head of GCC macroeconomic analysis at Oxford Economics, said in a note that Nvidia’s chips are “crucial” for the UAE’s push to be a major global player in AI.

“Access to the world’s leading AI chips provides the hardware that will give developers the leading edge that is needed in an incredibly competitive global landscape,” Zangana wrote.


There is a “very important” relationship between the UAE and U.S. governments that has spanned multiple administrations, Microsoft President Brad Smith told CNBC’s Dan Murphy at the ADIPEC conference in Abu Dhabi.

“We’re very grateful to the Secretary of Commerce Howard Lutnick, and the work that he has championed to enable export licenses to be made available to us,” Smith said. “That builds as well on the relationships we had with Secretary [Marco] Rubio when he was in the Senate and Democrats as well. [It] takes two parties to govern, and we keep that in mind.”

Microsoft also announced it will be increasing its investment in the UAE, bringing its total contribution to $15.2 billion by the end of this decade.

That includes a $1.5 billion equity investment in AI firm G42 and more than $5.5 billion in capital expenses for the expansion of Microsoft’s AI and cloud infrastructure projects in the region.

“We’re really investing in trust, and I think it’s that combination of technology, talent and trust that you’re seeing come together here in the UAE, around AI, around technology, but really the future of the whole economy,” Smith said.


— CNBC’s Dan Murphy contributed to this report.
