
Your iPhone battery life is probably getting worse. Here’s what to do.
Published 2 years ago
A view of the new iPhone 14 at an Apple event at their headquarters in Cupertino, California, September 7, 2022.
Carlos Barria | Reuters
If your iPhone battery life seems like it’s getting worse, that’s because it is.
There’s a simple explanation. If you’ve owned your iPhone — or any phone — for more than a year or so and plugged it in daily, it typically begins to lose its ability to receive a full charge.
The capacity of a battery deteriorates over time, which can be due to the heat from your charger, frequent fast charging or other activities that cause your phone’s temperature to increase, such as gaming, as The Wall Street Journal recently reported.
If your iPhone’s battery can no longer hold more than 80% of its original capacity, it may be time to get it replaced. That’s something you can do at the Apple Store, and it can help you get more life out of an iPhone that’s otherwise in working order.
A battery replacement is free if you have AppleCare+, which costs between $3.99 and $13.49 per month depending on the iPhone model you have. If you don’t have the program, a new battery costs anywhere from about $69 to $99, depending on your iPhone model.
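For a rough sense of the tradeoff, here is the arithmetic on the prices above (a back-of-the-envelope comparison only; it ignores AppleCare+’s other coverage and which tier applies to your model):

```latex
% A year of AppleCare+ vs. a one-off battery replacement, using the prices above
\$3.99  \times 12 = \$47.88  \quad \text{(cheapest tier, below the \$69--\$99 swap)}
\$13.49 \times 12 = \$161.88 \quad \text{(priciest tier, above the \$99 swap)}
```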
There’s an easy way to check to see how much capacity your battery has lost. Here’s what to do:
- Open Settings.
- Choose Battery.
- Select Battery Health & Charging.
You’ll see “Maximum Capacity.” This is a measure of your battery’s current capacity relative to when the phone was new. My iPhone 14 Pro Max from last September, for example, has a capacity of 87%. That means I can’t get Apple’s free battery replacement through the AppleCare+ plan that I pay for, but it explains why my battery life seems like it’s gotten worse.
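For developers curious about what apps can actually see: iOS’s public API reports only the current charge level and charging state, not the “Maximum Capacity” health figure shown in Settings. Here is a minimal Swift sketch using UIKit’s battery-monitoring API (the function name is ours, for illustration):

```swift
import UIKit

// Minimal sketch of the battery data Apple exposes to third-party apps.
// The "Maximum Capacity" health figure in Settings is not available
// through any public API; apps get only the charge level and state.
func logBatteryStatus() {
    let device = UIDevice.current
    device.isBatteryMonitoringEnabled = true  // must be enabled before reading

    let level = device.batteryLevel  // 0.0...1.0, or -1.0 if unknown
    if level >= 0 {
        print("Current charge: \(Int(level * 100))%")
    } else {
        print("Battery level unavailable (e.g., in the Simulator)")
    }

    switch device.batteryState {
    case .charging:
        print("State: charging")
    case .full:
        print("State: full")
    case .unplugged:
        print("State: unplugged")
    default:
        print("State: unknown")
    }
}
```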
Take your phone to Apple for a replacement if your phone displays anything less than 80% capacity. It’s a quick and relatively affordable way to improve your iPhone battery life, particularly if you don’t otherwise need a new phone.
Technology
A look at OpenAI’s tangled web of dealmaking
Published September 28, 2025
OpenAI CEO Sam Altman speaks to media following a Q&A at the OpenAI data center in Abilene, Texas, U.S., Sept. 23, 2025.
Shelby Tauber | Reuters
OpenAI CEO Sam Altman is everywhere.
His artificial intelligence startup, now valued at $500 billion, has been inking deals valued in the tens to hundreds of billions of dollars with infrastructure partners, even as it continues to burn mounds of cash.
Those expenditures are driving the market.
The Nasdaq and S&P 500 rose to record highs this week after Nvidia agreed to invest up to $100 billion in OpenAI. That followed a $300 billion deal between OpenAI and Oracle in July as part of the Stargate program, a $500 billion infrastructure project that’s also being funded by SoftBank.
OpenAI’s commitments don’t stop there. CoreWeave on Thursday said it’s agreed to provide OpenAI up to $22.4 billion in AI infrastructure, an increase from the $11.9 billion it initially announced in March. Earlier this month, chipmaker Broadcom said it had secured a new $10 billion customer, and analysts were quick to point to OpenAI.
While OpenAI says that scaling is key to driving innovation and future AI breakthroughs, investors and analysts are beginning to raise their eyebrows over the mind-boggling sums, as well as OpenAI’s reliance on an increasingly interconnected web of infrastructure partners.
OpenAI took a $350 million stake in CoreWeave ahead of its IPO in March, for instance. Nvidia formalized its financial stake in OpenAI by participating in a $6.6 billion funding round in October. Oracle is spending about $40 billion on Nvidia chips to power one of OpenAI’s Stargate data centers, according to a May report from the Financial Times. Earlier this month, CoreWeave disclosed an order worth at least $6.3 billion from Nvidia.
And through its $100 billion investment in OpenAI, Nvidia will get equity in the startup and earn revenue at the same time.
OpenAI is expected to generate only $13 billion in revenue this year, according to the company’s CFO, Sarah Friar. She told CNBC that technology booms require bold bets on infrastructure.
“When the internet was getting started, people kept feeling like, ‘Oh, we’re over-building, there’s too much,’” Friar said. “Look where we are today, right?”
Altman told CNBC in August that he’s willing to run the company at a loss in order to prioritize growth and its investments.
‘Troubling signal’
But some analysts are raising red flags, arguing that OpenAI’s deal with Nvidia is reminiscent of the vendor financing that helped inflate the dot-com bubble before it burst in the early 2000s.
Nvidia has been the biggest winner of the AI boom so far because it produces the graphics processing units (GPUs) that are necessary to train models and run large AI workloads. Nvidia’s investment in OpenAI, which will be paid out in installments over several years, will help the startup build out data centers that are based around its GPUs.
“You don’t have to be a skeptic about AI technology’s promise in general to see this announcement as a troubling signal about how self-referential the entire space has become,” Bespoke Investment Group wrote in a note to clients on Tuesday. “If NVDA has to provide the capital that becomes its revenues in order to maintain growth, the whole ecosystem may be unsustainable.”
Sam Altman, CEO of OpenAI (L), and Jensen Huang, CEO of Nvidia.
Reuters
Peter Boockvar, chief investment officer at One Point BFG Wealth Partners, said names of companies from the late 1990s were ringing in his ears after the OpenAI-Nvidia deal was announced.
A key difference, however, is that this transaction is “so much bigger in terms of dollars,” he wrote in a note.
“For this whole massive experiment to work without causing large losses, OpenAI and its peers now have got to generate huge revenues and profits to pay for all the obligations they are signing up for and at the same time provide a return to its investors,” Boockvar said.
An OpenAI spokesperson referred CNBC to comments from Altman and Friar this week, adding that the company is pursuing “a once-in-a-century opportunity that demands ambition equal to the moment.”
Total demand for compute could reach a staggering 200 gigawatts by 2030, according to Bain & Company’s 2025 Technology Report. Building enough data centers to meet that demand would cost about $500 billion a year, meaning AI companies would have to generate a combined $2 trillion in annual revenue to cover those costs.
Even if companies throw their whole weight behind investing in the cloud and data centers, “the amount would still fall $800 billion short of the revenue needed to fund the full investment,” Bain said.
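Spelled out, the arithmetic implied by Bain’s figures looks like this (an inference from the numbers above, not a line item in the report):

```latex
% Bain's 2030 math: revenue needed vs. projected shortfall
\underbrace{\$2.0\,\text{T}}_{\text{annual revenue needed}}
  - \underbrace{\$0.8\,\text{T}}_{\text{projected shortfall}}
  = \underbrace{\$1.2\,\text{T}}_{\text{revenue implied on the current trajectory}}
```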
There’s a clear uphill battle ahead, but OpenAI’s Altman brushed off concerns on Tuesday, rejecting the idea that the infrastructure spending spree is overkill.
“This is what it takes to deliver AI,” Altman told CNBC. “Unlike previous technological revolutions or previous versions of the internet, there’s so much infrastructure that’s required, and this is a small sample of it.”
–CNBC’s Yun Li and MacKenzie Sigalos contributed to this report
WATCH: OpenAI’s Sam Altman defends Stargate expansion as demand for AI soars

Technology
5 takeaways from CNBC’s investigation into ‘nudify’ apps and sites
Published September 28, 2025
Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.
Jordan Wyatt | CNBC
In the summer of 2024, a group of women in the Minneapolis area learned that a male friend used their Facebook photos mixed with artificial intelligence to create sexualized images and videos.
Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and over 80 women in the Twin Cities region. The discovery created emotional trauma and led the group to seek the help of a sympathetic state senator.
As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet, with many being promoted via Facebook ads, available for download on the Apple and Google app stores and easily accessed using simple web searches.
“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.
Here are five takeaways from the investigation.
The women lack legal recourse
Because the women weren’t underage and the man who created the deepfakes never distributed the content, there was no apparent crime.
“He did not break any laws that we’re aware of,” said Molly Kelley, one of the Minnesota victims and a law student. “And that is problematic.”
Now, Kelley and the women are advocating for a local bill in their state, proposed by Democratic state Senator Erin Maye Quade, intended to block nudify services in Minnesota. Should the bill become law, it would levy fines on the entities enabling the creation of the deepfakes.
Maye Quade said the bill is reminiscent of laws that prohibit peeping into windows to snap explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said in an interview with CNBC, referring to the speed of AI development.
The harm is real
Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from the incident last year.
Sometimes, she said, a simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes swelling with tears. That’s what happened at a conference she attended a month after first learning about the images.
“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”
Mary Anne Franks, professor at the George Washington University Law School, compared the experience to the feelings victims describe when talking about so-called revenge porn, or the posting of a person’s sexual photos and videos online, often by a former romantic partner.
“It makes you feel like you don’t own your own body, that you’ll never be able to take back your own identity,” said Franks, who is also president of the Cyber Civil Rights Initiative, a nonprofit organization dedicated to combating online abuse and discrimination.
Deepfakes are easier to create than ever
Less than a decade ago, a person would need to be an AI expert to make explicit deepfakes. Thanks to nudifier services, all that’s required is an internet connection and a Facebook photo.
Researchers said new AI models have helped usher in a wave of nudify services. The models are often bundled within easy-to-use apps, so that people lacking technical skills can create the content.
And while nudify services can contain disclaimers about obtaining consent, it’s unclear whether there is any enforcement mechanism. Additionally, many nudify sites market themselves simply as so-called face-swapping tools.
“There are apps that present as playful and they are actually primarily meant as pornographic in purpose,” said Alexios Mantzarlis, an AI security expert at Cornell Tech. “That’s another wrinkle in this space.”
Nudify service DeepSwap is hard to find
The site that was used to create the content is called DeepSwap, and there’s not much information about it online.
In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote from Penyne Wu, who was identified in the release as CEO and co-founder. The media contact on the release was Shawn Banks, who was listed as marketing manager.
CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”
However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong.
AI’s collateral damage
Maye Quade’s bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.
Some experts are concerned, however, that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts.
In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.”
Kelley hopes that any federal AI push doesn’t jeopardize the efforts of the Minnesota women.
“I’m concerned that we will continue to be left behind and sacrificed at the altar of trying to have some geopolitical race for powerful AI,” Kelley said.
WATCH: The alarming rise of AI ‘nudify’ apps that create explicit images of real people.

Technology
How a ‘nudify’ site turned a group of friends into key figures in a fight against AI-generated porn
Published September 27, 2025

In June of last year, Jessica Guistolise received a text message that would change her life.
While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.
After a nearly two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos were used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.
Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago.
CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.
Guistolise said that after talking to Jenny, she was desperate to cut her trip short and rush home.
In Minneapolis the women’s experiences would soon spark a growing opposition to AI deepfake tools and those who use them.
One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.
“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.
CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.
“It’s not something that I would wish for on anybody,” Guistolise said.
Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.
Jordan Wyatt | CNBC
Nudify apps represent a small but rapidly growing corner of the new AI universe, which exploded following the arrival of OpenAI’s ChatGPT in late 2022. Since then, Meta, Alphabet, Microsoft, Amazon and others have collectively spent hundreds of billions of dollars investing in AI and pursuing artificial general intelligence, or AGI — technology that could rival and even surpass the capabilities of humans.
For consumers, most of the excitement to date has been around chatbots and image generators that allow users to perform complex tasks with simple text prompts. There’s also the burgeoning market of AI companions, and a host of agents designed to enhance productivity.
But victims of nudify apps are experiencing the flip side of the AI boom. Thanks to generative AI, products such as DeepSwap are so easy to use — requiring no coding ability or technical expertise — that they can be accessed by just about anyone. Guistolise and others said they worry that it’s only a matter of time before the technology spreads widely, leaving many more people to suffer the consequences.
Guistolise filed a police report about the case and obtained a restraining order against Ben. But she and her friends quickly realized there was a problem with that strategy.
Ben’s actions may have been legal.
The women involved weren’t underage. And as far as they were aware, the deepfakes hadn’t been distributed, existing only on Ben’s computer. While they feared that the videos and images were on a server somewhere and could end up in the hands of bad actors, there was nothing of that sort that they could pin on Ben.
One of the other women involved was Molly Kelley, a law student who would spend the ensuing year helping the group navigate AI’s uncharted legal maze.
“He did not break any laws that we’re aware of,” Kelley said, referring to Ben’s behavior. “And that is problematic.”
Ben admitted to creating the deepfakes, and told CNBC by email that he feels guilty and ashamed of his behavior.
Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable,” in an emailed statement.
“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not an issue that will resolve itself. We need stronger laws to ensure accountability — not only for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”
Readily available
Experts say that, like other new and simple-to-use AI tools, many nudify apps and services advertise on Facebook and are available to download from the Apple App Store and Google Play Store.
Haley McNamara, senior vice president at the National Center on Sexual Exploitation, said nudify apps and sites have made it “very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”
Two photos of Molly Kelley’s face and one of Megan Hurley’s appear on a screenshot taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to make fake pornographic images and videos using the AI site DeepSwap, July 11, 2025.
A spokesperson from Meta, Facebook’s owner, said in a statement that the company has strict rules barring ads that contain nudity and sexual activity and that it shares information it learns about nudify services with other companies through an industrywide child-safety initiative. Meta characterized the nudify ecosystem as an adversarial space and said it’s improving its technology to try to prevent bad actors from running ads.
Apple told CNBC that it regularly removes and rejects apps that violate its app store guidelines related to content deemed offensive, misleading and overtly sexual and pornographic.
Google declined to comment.
The issue extends well beyond the U.S.
In June 2024, around the same time the women in Minnesota discovered what was happening, an Australian man was sentenced to nine years in prison for creating deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly created and distributed deepfake content of nearly 50 female classmates.
“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” said Mary Anne Franks, professor at the George Washington University Law School.
Security researchers from the University of Florida and Georgetown University wrote in a research paper presented in August that nudify tools are taking design cues from popular consumer apps and using familiar subscription models. DeepSwap charges users $19.99 a month to access “premium” benefits, which include credits that can be used for AI video generation, faster processing and higher-quality images.
The researchers said the “nudification platforms have gone fully mainstream” and are “advertised on Instagram and hosted in app stores.”
Guistolise said she knew that people could use AI to create nonconsensual porn, but she didn’t realize how easy and accessible the apps were until she saw a synthetic version of herself participating in raunchy, explicit activity.
According to the screenshots of Ben’s DeepSwap page, the faces of Guistolise and the other Minnesota women sit neatly in rows of eight, like in a school yearbook. Clicking on the photos, Jenny’s pictures show, leads to a collection of computer-generated clones engaged in a variety of sexual acts. The women’s faces had been merged with the nude bodies of other women.
DeepSwap’s privacy policy states that users have seven days to look at the content from the time they upload it to the site, and that the data is stored for that period on servers in Ireland. DeepSwap’s site says it deletes the data at that point, but users can download it in the interim onto their own computer.
The site also has a terms of service page, which says users shouldn’t upload any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who provided no consent, it’s unclear whether DeepSwap has any enforcement mechanism.
DeepSwap provides little publicly by way of contact information and didn’t reply to multiple CNBC requests for comment.
CNBC reporting found AI site DeepSwap, shown here, was used by a Minneapolis man to create fake pornographic images and videos depicting the faces of more than 80 of his friends and acquaintances.
In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote attributed to a person the release identified as CEO and co-founder Penyne Wu. The media contact on the release was listed as marketing manager Shawn Banks.
CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”
However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong.
Psychological trauma
Kelley, 42, found out about her inclusion in Ben’s AI portfolio after receiving a text message from Jenny. She invited Jenny over that afternoon.
After learning what happened, Kelley, who was six months pregnant at the time, said it took her hours to muster the strength to view the photos captured from Jenny’s phone. Kelley said what she saw was her face “very realistically on someone else’s body, in images and videos.”
Kelley said her stress level spiked to a degree that it soon started to affect her health. Her doctor warned her that too much cortisol, brought on by stress, would cause her body not “to make any insulin,” Kelley recalled.
“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report on the matter.
Kelley said that in Jenny’s photos she recognized some of her good friends, including many she knew from the service industry in Minneapolis. She said she notified those women and bought facial-recognition software to help identify the other victims so they could be informed. About half a dozen victims have yet to be identified, she said.
“It was incredibly time consuming and really stressful because I was trying to work,” she said.
Victims of nudify tools can experience significant trauma, leading to suicidal thoughts, self-harm and difficulty trusting others, said Ari Ezra Waldman, a law professor at the University of California, Irvine who testified at a 2024 House committee hearing on the harms of deepfakes.
Waldman said even when nudified images haven’t been posted publicly, subjects can fear that the images may eventually be shared, and “now someone has this dangling over their head like a sword of Damocles.”
“Everyone is subject to being objectified or pornographied by everyone else,” he said.
Three victims showed CNBC explicit, AI-created deepfake images depicting their faces as well as those of other women, during an interview in Minneapolis, Minnesota, on July 11, 2025.
Megan Hurley, 42, said she was trying to enjoy a cruise last summer off the western coast of Canada when she received an urgent text message from Kelley. Her vacation was ruined.
Hurley described instant feelings of deep paranoia after returning home to Minneapolis. She said she had awkward conversations with an ex-boyfriend and other male friends, asking them to take screenshots if they ever saw AI-generated porn online that looked like her.
“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the kinds of messages she sent at the time. “Because we’d be able to prove dissemination at that point.”
Hurley said she contacted the FBI but never heard back. She also filled out an online FBI crime report, which she shared with CNBC. The FBI confirmed that it received CNBC’s request for comment, but didn’t provide a response.
The group of women began searching for help from lawmakers. They were led to Minnesota state Sen. Erin Maye Quade, a Democrat who had previously sponsored a bill that became a state statute criminalizing the “nonconsensual dissemination of a deep fake depicting intimate parts or sexual acts.”
Kelley landed a video call with the senator in early August 2024.
In the virtual meeting, several women from the group told their stories, and explained their frustrations about the limited legal recourse available. Maye Quade went to work on a new bill, which she announced in February, that would compel AI companies to shut down apps using their technology to create nudify services.
The bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.
Maye Quade told CNBC in an interview that the bill is the modern equivalent of longstanding laws that make it illegal for a person to peep into someone else’s window and snap explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.
Minnesota state Sen. Erin Maye Quade, at left, talks to CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, about her efforts to pass state legislation that would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake image they generate in her state.
Jordan Wyatt | CNBC
But Maye Quade acknowledged that enforcing the law against companies based overseas presents a significant challenge.
“This is why I think a federal response is more appropriate,” she said. “Because actually having a federal government, a country could take far more actions with companies that are based in other countries.”
Kelley, who gave birth to her son in September 2024, characterized one of her late October meetings with Maye Quade and the group as a “blur,” because she said she was “mentally and physically unwell due to sleep deprivation and stress.”
She said she now avoids social media.
“I never announced the birth of my second child,” Kelley said. “There’s plenty of people out there who have no idea that I had a baby. I just didn’t want to put it online.”
The early days of deepfake pornography
The rise of deepfakes can be traced back to 2018. That’s when videos showing former President Barack Obama giving speeches he never gave, and actor Jim Carrey’s face swapped onto Jack Nicholson’s in “The Shining,” started going viral.
Lawmakers sounded the alarm. Sites such as Pornhub and Reddit responded by pledging to take down nonconsensual content from their platforms. Reddit said at the time that it removed a large deepfake-related subreddit as part of enforcing a policy banning “involuntary pornography.”
The community congregated elsewhere. One popular place was MrDeepFakes, which hosted explicit AI-generated videos and served as an online discussion forum.
By 2023, MrDeepFakes had become the top deepfake site on the web, hosting 43,000 sexualized videos depicting nearly 4,000 individuals, according to a 2025 study of the site by researchers from Stanford University and the University of California San Diego.
MrDeepFakes claimed to host only “celebrity” deepfakes, but the researchers found “that hundreds of targeted individuals have little to no online or public presence.” The researchers also discovered a burgeoning economy, with some users agreeing to create custom deepfakes for others at an average cost of $87.50 per video, the paper said.
Some ads for nudify services have gone to more mainstream locations. Alexios Mantzarlis, an AI security expert at Cornell Tech, earlier this year discovered more than 8,000 ads on the Meta ad library across Facebook and Instagram for a nudify service called CrushAI.
AI apps and sites like Undress, DeepNude and CrushAI are some of the “nudify” tools that can be used to create fake pornographic images and videos depicting real people’s faces pulled from innocuous online photos.
Emily Park | CNBC
At least one DeepSwap ad ran on Instagram in October, according to the social media company’s ad library. The account associated with running the ad does not appear to be officially tied to DeepSwap, but Mantzarlis said he suspects the account could have been an affiliate partner of the nudify service.
Meta said it reviewed ads associated with the Instagram account in question and didn’t find any violations.
Top nudify services are often found on third-party affiliate sites such as ThePornDude that earn money by mentioning them, Mantzarlis said.
In July, Mantzarlis co-authored a report analyzing 85 nudify services. The report found that the services receive 18.6 million monthly unique visitors in aggregate, though Mantzarlis said that figure doesn’t take into account people who share the content in places such as Discord and Telegram.
As a business, nudify services are a small part of the generative AI market. Mantzarlis estimates annual revenue of about $36 million, but he said that’s a conservative estimate that counts only AI-generated content from sites that specifically promote nudify services.
MrDeepFakes abruptly shut down in May, shortly after its key operator was publicly identified in a joint investigative report from Canada’s CBC News, Danish news sites Politiken and Tjekdet, and online investigative outlet Bellingcat.
CNBC reached out by email to the address that was associated with the person named as the operator in some materials from the CBC report, but received no reply.
With MrDeepFakes going dark, Discord has emerged as an increasingly popular meeting spot, experts said. Known mostly for its use in the online gaming community, Discord has roughly 200 million global monthly active users who access its servers to discuss shared interests.
CNBC identified several public Discord servers, including one associated with DeepSwap, where users appeared to be asking others in the forum to create sexualized deepfakes based on photos they shared.
Leigh Cassidy Gibson, a researcher at the University of Florida, co-authored the 2025 paper that looked at “20 popular and easy-to-find nudification websites.” She confirmed to CNBC that while DeepSwap wasn’t named, it was one of the sites she and her colleagues studied to understand the market. More recently, she said, they’ve turned their attention to various Discord servers where users seek tutorials and how-to guides on creating AI-generated, sexual content.
Discord declined to comment.
‘It’s insane to me that this is legal right now’
At the federal level, the government has at least taken note.
In May, President Donald Trump signed the “Take It Down Act” into law; its notice-and-takedown requirements for online platforms take effect in May 2026. The law bans online publication of nonconsensual sexual images and videos, including those that are inauthentic and generated by AI.
“A person who violates one of the publication offenses pertaining to depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the law’s text.
Experts told CNBC that the law still doesn’t address the central issue facing the Minnesota women, because there’s no evidence that the material was distributed online.
Maye Quade’s bill in Minnesota emphasizes that the creation of the material is the core problem and requires a legal response.
Some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.”
As part of Trump’s proposed spending bill earlier this year, states would have been deterred from regulating AI for a 10-year period or risk losing certain government subsidies related to AI infrastructure. The Senate struck down that provision in July, keeping it out of the bill Trump signed that month.
“I would not put it past them trying to resurrect the moratorium,” said Waldman, of UC Irvine, regarding the tech industry’s continued influence on AI policy.
A White House official told CNBC that the Take It Down Act, which was supported by the Trump administration and signed months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The official said the AI Action Plan encourages states to allow federal laws to override individual state laws.
In San Francisco, home to OpenAI and other highly valued AI startups, the city can pursue civil cases against nudify services under California consumer protection laws. Last year, San Francisco sued 16 companies associated with nudify apps.
The San Francisco City Attorney’s office said in June that an investigation related to the lawsuits had led to 10 of the most-visited nudify websites being taken offline or no longer being accessible in California. One of the companies that was sued, Briver LLC, settled with the city and has agreed to pay $100,000 in civil penalties. Additionally, Briver no longer operates websites that can create nonconsensual deepfake pornography, the city attorney’s office said.
Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the company behind CrushAI. Meta said that Joy Timeline attempted to “circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”
Still, Mantzarlis, who has been publishing his research on Indicator, said he continues to find nudify-related ads on Meta’s platforms.
Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that had run on Facebook and Instagram since June 11, they wrote in a joint report on Sept. 10. Mantzarlis said Meta eventually removed the ads, some of which were more subtle than others in implying nudifying capabilities.
Meta told CNBC earlier this month that it removed thousands of ads linked to companies offering nudify services and sent the entities cease-and-desist letters for violating the company’s ad guidelines.
In Minnesota, the group of friends are trying to get on with their lives while continuing to advocate for change.
Guistolise said she wants people to realize that AI is potentially being used to harm them in ways they never imagined.
“It’s so important that people know that this really is out there and it’s really accessible and it’s really easy to do, and it really needs to stop,” Guistolise said. “So here we are.”
Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.