
Parade goers hold Pride flags during the annual Pride Parade in San Francisco on Sunday, June 29, 2024.

Minh Connors | San Francisco Chronicle | Hearst Newspapers | Getty Images

Meta CEO Mark Zuckerberg is staying mum these days when it comes to the LGBTQ+ community. It wasn’t always that way. 

San Francisco Pride Executive Director Suzanne Ford told CNBC she remembers when Zuckerberg personally called the nonprofit to ensure that the company then known as Facebook had a spot at the annual event. As the world’s largest LGBTQ+ parade, the SF Pride event has become a symbol representing advocacy and social justice for members of the community.

In 2015, SF Pride barred Facebook from marching in the event because of the company's policy requiring people to use their legal names on the social network, Ford said. Members of the LGBTQ+ community worried that bad actors were exploiting the policy by reporting transgender Facebook users and others who no longer identify by their legal names.

After Facebook updated the policy, Zuckerberg called SF Pride's then-executive director George Ridgely to ask that Facebook be included in the parade, Ford said.

The relationship between SF Pride and Meta has since splintered.

SF Pride formally cut ties with Meta in March after the company enacted a number of new policies, including a scaling back of internal programs designed to increase hiring of diverse candidates, which CNBC reported in January.

Meta also eased content-moderation guidelines as part of its policy changes, which multiple current and former employees told CNBC could instigate more online abuse toward marginalized communities, including members of the LGBTQ+ community. Zuckerberg has also made an effort to curry favor with President Donald Trump, who signed an executive order in January calling for investigations into companies that support diversity, equity and inclusion, or DEI, initiatives.

Since the organization’s decision to end its relationship with Meta, Ford said that she hasn’t heard from Zuckerberg or anybody that SF Pride used to have a relationship with at the company. 

Meta will not be taking part in this year’s SF Pride festival, set to take place this weekend at San Francisco’s Civic Center. The annual parade will be held on Sunday, according to the event’s website. The theme for 2025 is “Queer Joy is Resistance.”

“Why was it so important for Mark back then, and why is it so important for Mark now not to be associated with San Francisco Pride?” Ford said.

Meta declined to comment.

FILE PHOTO: Facebook CEO Mark Zuckerberg marched with 700 Facebook employees In San Francisco’s Gay Pride Parade on June 30, 2013.

Kobby Dagan | VWPics | AP

Meta isn’t the only company distancing itself from SF Pride. Other major companies like Anheuser-Busch, Comcast, Diageo and Nissan are also no longer sponsoring SF Pride after years of support, CNBC previously reported.

Given that SF Pride shares a geographic center with Meta and so much of the tech industry, the lack of support for the LGBTQ+ community after years of public trumpeting cuts especially deep, Ford said. Google-parent Alphabet has also stopped sponsoring SF Pride this year, she said.

San Francisco represents both the "home of innovation" for the tech industry and the "home and the birthplace of the LGBTQ community in the United States," said Ford, adding that it's no accident that so much innovation comes from the region.

“Creative and wonderful people want to come to San Francisco — it’s not the drinking water — but they come here because you can be yourself here,” she said. “You can love who you love, you can be who you are and you don’t have to march to any certain drumbeat.”

Tech companies represent a little over 15% of SF Pride's overall sponsorship funding for the event. The organization's budget is $180,000 short of its target because of a drop in corporate sponsorships overall, a spokesperson told CNBC on Wednesday.

There are still large tech sponsorships from the likes of Apple, Amazon and Salesforce, but otherwise, there’s a palpable silence from the tech industry this year about supporting LGBTQ+ causes, Ford said. 

For instance, Ford said that in previous years, her time was often spent speaking to tech companies’ employee resource groups in the lead-up to SF Pride, but she has yet to receive any invitation of that kind this year.

Ford said she also hopes that OpenAI CEO Sam Altman, who married his partner Oliver Mulherin in 2024, will be more vocal about supporting the LGBTQ+ community and SF Pride. Ford said she briefly met Altman a few months ago to discuss SF Pride, but she has not heard from him since.

“One would think that OpenAI here in San Francisco, that they would think that they should be supporting the fabric of the community,” said Ford, adding that the lack of support from OpenAI and Altman is “painful because Sam is a member of our community, and he certainly has resources.”

OpenAI declined to comment.

A parade float during the annual Pride Parade in San Francisco on Sunday, June 29, 2024.

Minh Connors | San Francisco Chronicle | Hearst Newspapers | Getty Images

Prominent tech companies like Meta, Amazon and Uber have posted rainbow-coated messages on their websites and social media accounts in years past to show support for Pride Month, which is observed in June, but this year, tech companies' online presence is noticeably less colorful.

The threat of a lawsuit coupled with the possibility of a public tongue-lashing by Trump, other politicians and social media has caused many tech leaders and corporate executives to stay quiet on LGBTQ+ issues, said Amy Dufrane, CEO of human resource certification organization HRCI.

“Anything that touches the space of DEI, we’re seeing companies pull back from that out of fear,” she said.

Executives who support LGBTQ+ and related DEI issues are doing so under the radar to avoid drawing attention, Dufrane said. For example, a spokesperson for SF Pride said that two tech companies have recently donated to the organization but want to remain anonymous. Ford declined to name the tech companies.

“Sometimes people in our community assume there’s no good, there’s no one at these corporations that cares about us,” Ford said. “Sometimes they do, and they don’t want the consequences of caring about us.”

Ford said that the door is still open for Zuckerberg to contact SF Pride, but ultimately, it would be up to the nonprofit’s board to decide the next steps. Ford said that Zuckerberg would likely have to make a “commitment to some things that I don’t think that he would be willing to do.”

“We have got to leave space for people to change, we got to leave space like if at Meta there’s a leadership change or they come to the realization that this is just bad, the track they’re going down is wrong,” Ford said. “I want to leave space for them to come and have a discussion with us and to show us that they are in line with our values.”

Disclosure: Comcast owns NBCUniversal, the parent company of CNBC.


How a ‘nudify’ site turned a group of friends into key figures in a fight against AI-generated porn


In June of last year, Jessica Guistolise received a text message that would change her life.

While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.

After a nearly two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos were used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.

Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago. 

CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.

Guistolise said that after talking to Jenny, she was desperate to cut her trip short and rush home.

In Minneapolis, the women's experiences would soon spark growing opposition to AI deepfake tools and those who use them.

One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.  

“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.

CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.

"It's not something that I would wish on anybody," Guistolise said.

Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.

Jordan Wyatt | CNBC

Nudify apps represent a small but rapidly growing corner of the new AI universe, which exploded following the arrival of OpenAI’s ChatGPT in late 2022. Since then, Meta, Alphabet, Microsoft, Amazon and others have collectively spent hundreds of billions of dollars investing in AI and pursuing artificial general intelligence, or AGI — technology that could rival and even surpass the capabilities of humans. 

For consumers, most of the excitement to date has been around chatbots and image generators that allow users to perform complex tasks with simple text prompts. There’s also the burgeoning market of AI companions, and a host of agents designed to enhance productivity. 

But victims of nudify apps are experiencing the flip side of the AI boom. Thanks to generative AI, products such as DeepSwap are so easy to use — requiring no coding ability or technical expertise — that they can be accessed by just about anyone. Guistolise and others said they worry that it’s only a matter of time before the technology spreads widely, leaving many more people to suffer the consequences.

Guistolise filed a police report about the case and obtained a restraining order against Ben. But she and her friends quickly realized there was a problem with that strategy.

Ben’s actions may have been legal. 

The women involved weren’t underage. And as far as they were aware, the deepfakes hadn’t been distributed, existing only on Ben’s computer. While they feared that the videos and images were on a server somewhere and could end up in the hands of bad actors, there was nothing of that sort that they could pin on Ben. 

One of the other women involved was Molly Kelley, a law student who would spend the ensuing year helping the group navigate AI’s uncharted legal maze. 

“He did not break any laws that we’re aware of,” Kelley said, referring to Ben’s behavior. “And that is problematic.”

Ben admitted to creating the deepfakes, and told CNBC by email that he feels guilty and ashamed of his behavior.

Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable,” in an emailed statement.

“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not an issue that will resolve itself. We need stronger laws to ensure accountability — not only for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”

Readily available

Experts say that many apps offering nudify services, like other new and simple-to-use AI tools, advertise on Facebook and are available to download from the Apple App Store and Google Play Store.

Haley McNamara, senior vice president at the National Center on Sexual Exploitation, said nudify apps and sites have made it “very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”

Two photos of Molly Kelley’s face and one of Megan Hurley’s appear on a screenshot taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to make fake pornographic images and videos using the AI site DeepSwap, July 11, 2025.

A spokesperson from Meta, Facebook’s owner, said in a statement that the company has strict rules barring ads that contain nudity and sexual activity and that it shares information it learns about nudify services with other companies through an industrywide child-safety initiative. Meta characterized the nudify ecosystem as an adversarial space and said it’s improving its technology to try to prevent bad actors from running ads. 

Apple told CNBC that it regularly removes and rejects apps that violate its app store guidelines related to content deemed offensive, misleading and overtly sexual and pornographic. 

Google declined to comment.

The issue extends well beyond the U.S.

In June 2024, around the same time the women in Minnesota discovered what was happening, an Australian man was sentenced to nine years in prison for creating deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly created and distributed deepfake content of nearly 50 female classmates.

“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” said Mary Anne Franks, professor at the George Washington University Law School.

Security researchers from the University of Florida and Georgetown University wrote in a research paper presented in August that nudify tools are taking design cues from popular consumer apps and using familiar subscription models. DeepSwap charges users $19.99 a month to access "premium" benefits, which include credits that can be used for AI video generation, faster processing and higher-quality images.

The researchers said the “nudification platforms have gone fully mainstream” and are “advertised on Instagram and hosted in app stores.”

Guistolise said she knew that people could use AI to create nonconsensual porn, but she didn’t realize how easy and accessible the apps were until she saw a synthetic version of herself participating in raunchy, explicit activity. 

According to the screenshots of Ben’s DeepSwap page, the faces of Guistolise and the other Minnesota women sit neatly in rows of eight, like in a school yearbook. Clicking on the photos, Jenny’s pictures show, leads to a collection of computer-generated clones engaged in a variety of sexual acts. The women’s faces had been merged with the nude bodies of other women.

DeepSwap’s privacy policy states that users have seven days to look at the content from the time they upload it to the site, and that the data is stored for that period on servers in Ireland. DeepSwap’s site says it deletes the data at that point, but users can download it in the interim onto their own computer. 

The site also has a terms of service page, which says users shouldn’t upload any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who provided no consent, it’s unclear whether DeepSwap has any enforcement mechanism. 

DeepSwap provides little publicly by way of contact information and didn’t reply to multiple CNBC requests for comment.

CNBC reporting found AI site DeepSwap, shown here, was used by a Minneapolis man to create fake pornographic images and videos depicting the faces of more than 80 of his friends and acquaintances.

In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote attributed to a person the release identified as CEO and co-founder Penyne Wu. The media contact on the release was listed as marketing manager Shawn Banks. 

CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response. 

DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong. 

Psychological trauma

Kelley, 42, found out about her inclusion in Ben’s AI portfolio after receiving a text message from Jenny. She invited Jenny over that afternoon.

After learning what happened, Kelley, who was six months pregnant at the time, said it took her hours to muster the strength to view the photos captured from Jenny’s phone. Kelley said what she saw was her face “very realistically on someone else’s body, in images and videos.” 

Kelley said her stress level spiked to a degree that it soon started to affect her health. Her doctor warned her that too much cortisol, brought on by stress, would cause her body not “to make any insulin,” Kelley recalled. 

“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report on the matter.

Kelley said that in Jenny's photos she recognized some of her good friends, including many she knew from the service industry in Minneapolis. She said she then notified those women and purchased facial-recognition software to help identify the other victims so they could be informed. About half a dozen victims have yet to be identified, she said.

“It was incredibly time consuming and really stressful because I was trying to work,” she said. 

Victims of nudify tools can experience significant trauma, leading to suicidal thoughts, self-harm and difficulty trusting others, said Ari Ezra Waldman, a law professor at the University of California, Irvine, who testified at a 2024 House committee hearing on the harms of deepfakes.

Waldman said even when nudified images haven’t been posted publicly, subjects can fear that the images may eventually be shared, and “now someone has this dangling over their head like a sword of Damocles.” 

“Everyone is subject to being objectified or pornographied by everyone else,” he said. 

Three victims showed CNBC explicit, AI-created deepfake images depicting their faces as well as those of other women, during an interview in Minneapolis, Minnesota, on July 11, 2025.

Megan Hurley, 42, said she was trying to enjoy a cruise last summer off the western coast of Canada when she received an urgent text message from Kelley. Her vacation was ruined. 

Hurley described instant feelings of deep paranoia after returning home to Minneapolis. She said she had awkward conversations with an ex-boyfriend and other male friends, asking them to take screenshots if they ever saw AI-generated porn online that looked like her. 

“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the kinds of messages she sent at the time. “Because we’d be able to prove dissemination at that point.”

Hurley said she contacted the FBI but never heard back. She also filled out an online FBI crime report, which she shared with CNBC. The FBI confirmed that it received CNBC’s request for comment, but didn’t provide a response.

The group of women began searching for help from lawmakers. They were led to Minnesota state Sen. Erin Maye Quade, a Democrat who had previously sponsored a bill that became a state statute criminalizing the “nonconsensual dissemination of a deep fake depicting intimate parts or sexual acts.”  

Kelley landed a video call with the senator in early August 2024. 

In the virtual meeting, several women from the group told their stories, and explained their frustrations about the limited legal recourse available. Maye Quade went to work on a new bill, which she announced in February, that would compel AI companies to shut down apps using their technology to create nudify services. 

The bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.

Maye Quade told CNBC in an interview that the bill is the modern equivalent of longstanding laws that make it illegal for a person to peep into someone else’s window and snap explicit photos without consent. 

“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.

Minnesota state Sen. Erin Maye Quade, at left, talks to CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, about her efforts to pass state legislation that would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake image they generate in her state.

Jordan Wyatt | CNBC

But Maye Quade acknowledged that enforcing the law against companies based overseas presents a significant challenge. 

“This is why I think a federal response is more appropriate,” she said. “Because actually having a federal government, a country could take far more actions with companies that are based in other countries.”

Kelley, who gave birth to her son in September 2024, characterized one of her late October meetings with Maye Quade and the group as a “blur,” because she said she was “mentally and physically unwell due to sleep deprivation and stress.”

She said she now avoids social media. 

“I never announced the birth of my second child,” Kelley said. “There’s plenty of people out there who have no idea that I had a baby. I just didn’t want to put it online.”

The early days of deepfake pornography

The rise of deepfakes can be traced back to 2018. That’s when videos showing former President Barack Obama giving speeches that never existed and actor Jim Carrey, instead of Jack Nicholson, appearing in “The Shining” started going viral. 

Lawmakers sounded the alarm. Sites such as Pornhub and Reddit responded by pledging to take down nonconsensual content from their platforms. Reddit said at the time that it removed a large deepfake-related subreddit as part of an enforcement of a policy banning “involuntary pornography.”

The community congregated elsewhere. One popular place was MrDeepFakes, which hosted explicit AI-generated videos and served as an online discussion forum. 

By 2023, MrDeepFakes became the top deepfake site on the web, hosting 43,000 sexualized videos containing nearly 4,000 individuals, according to a 2025 study of the site by researchers from Stanford University and the University of California San Diego.

MrDeepFakes claimed to host only “celebrity” deepfakes, but the researchers found “that hundreds of targeted individuals have little to no online or public presence.” The researchers also discovered a burgeoning economy, with some users agreeing to create custom deepfakes for others at an average cost of $87.50 per video, the paper said.

Some ads for nudify services have gone to more mainstream locations. Alexios Mantzarlis, an AI security expert at Cornell Tech, earlier this year discovered more than 8,000 ads on the Meta ad library across Facebook and Instagram for a nudify service called CrushAI. 

AI apps and sites like Undress, DeepNude and CrushAI are some of the “nudify” tools that can be used to create fake pornographic images and videos depicting real people’s faces pulled from innocuous online photos.

Emily Park | CNBC

At least one DeepSwap ad ran on Instagram in October, according to the social media company’s ad library. The account associated with running the ad does not appear to be officially tied to DeepSwap, but Mantzarlis said he suspects the account could have been an affiliate partner of the nudify service.

Meta said it reviewed ads associated with the Instagram account in question and didn’t find any violations.

Top nudify services are often found on third-party affiliate sites such as ThePornDude that earn money by mentioning them, Mantzarlis said. 

In July, Mantzarlis co-authored a report analyzing 85 nudify services. The report found that the services receive 18.6 million monthly unique visitors in aggregate, though Mantzarlis said that figure doesn’t take into account people who share the content in places such as Discord and Telegram.

As a business, nudify services are a small part of the generative AI market. Mantzarlis estimates annual revenue of about $36 million, but he said that's a conservative estimate and includes only AI-generated content from sites that specifically promote nudify services.

MrDeepFakes abruptly shut down in May, shortly after its key operator was publicly identified in a joint investigative report from Canada’s CBC News, Danish news sites Politiken and Tjekdet, and online investigative outlet Bellingcat.

CNBC reached out by email to the address that was associated with the person named as the operator in some materials from the CBC report, but received no reply. 

With MrDeepFakes going dark, Discord has emerged as an increasingly popular meeting spot, experts said. Known mostly for its use in the online gaming community, Discord has roughly 200 million global monthly active users who access its servers to discuss shared interests. 

CNBC identified several public Discord servers, including one associated with DeepSwap, where users appeared to be asking others in the forum to create sexualized deepfakes based on photos they shared. 

Leigh Cassidy Gibson, a researcher at the University of Florida, co-authored the 2025 paper that looked at “20 popular and easy-to-find nudification websites.” She confirmed to CNBC that while DeepSwap wasn’t named, it was one of the sites she and her colleagues studied to understand the market. More recently, she said, they’ve turned their attention to various Discord servers where users seek tutorials and how-to guides on creating AI-generated, sexual content.

Discord declined to comment.

‘It’s insane to me that this is legal right now’

At the federal level, the government has at least taken note. 

In May, President Donald Trump signed the "Take It Down Act" into law. The law bans online publication of nonconsensual sexual images and videos, including those that are inauthentic and generated by AI.

“A person who violates one of the publication offenses pertaining to depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the law’s text.

Experts told CNBC that the law still doesn’t address the central issue facing the Minnesota women, because there’s no evidence that the material was distributed online. 

Maye Quade’s bill in Minnesota emphasizes that the creation of the material is the core problem and requires a legal response. 

Some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.” 

As part of Trump's proposed spending bill earlier this year, states would have been deterred from regulating AI for a 10-year period or risk losing certain government subsidies related to AI infrastructure. The Senate struck down that provision in July, keeping it out of the bill Trump signed that month.

“I would not put it past them trying to resurrect the moratorium,” said Waldman, of UC Irvine, regarding the tech industry’s continued influence on AI policy.

A White House official told CNBC that the Take It Down Act, which was supported by the Trump administration and signed months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The official said the AI Action Plan encourages states to allow federal laws to override individual state laws.

In San Francisco, home to OpenAI and other highly valued AI startups, the city can pursue civil cases against nudify services under California consumer protection laws. Last year San Francisco sued 16 companies associated with nudify apps.

The San Francisco City Attorney’s office said in June that an investigation related to the lawsuits had led to 10 of the most-visited nudify websites being taken offline or no longer being accessible in California. One of the companies that was sued, Briver LLC, settled with the city and has agreed to pay $100,000 in civil penalties. Additionally, Briver no longer operates websites that can create nonconsensual deepfake pornography, the city attorney’s office said.


Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the company behind CrushAI. Meta said that Joy Timeline attempted to “circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”

Still, Mantzarlis, who has been publishing his research on Indicator, said he continues to find nudify-related ads on Meta’s platforms. 

Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that ran on Facebook and Instagram since June 11, they wrote in a joint report on Sept. 10. Mantzarlis said Meta eventually removed the ads, some of which were more subtle than others in implying nudifying capabilities.  

Meta told CNBC earlier this month that it removed thousands of ads linked to companies offering nudify services and sent the entities cease-and-desist letters for violating the company's ad guidelines.

In Minnesota, the friends are trying to get on with their lives while continuing to advocate for change.

Guistolise said she wants people to realize that AI is potentially being used to harm them in ways they never imagined.

“It’s so important that people know that this really is out there and it’s really accessible and it’s really easy to do, and it really needs to stop,” Guistolise said. “So here we are.”

Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.


Musk, Thiel, Bannon named in partially redacted Epstein documents released by Democrats


Charges against Jeffrey Epstein were announced on July 8, 2019, in New York City. Epstein was charged with one count of sex trafficking of minors and one count of conspiracy to engage in sex trafficking of minors.

Stephanie Keith | Getty Images News | Getty Images

Elon Musk, Peter Thiel and former Trump White House advisor Steve Bannon are among those who appeared in partially redacted files related to the late convicted sex offender Jeffrey Epstein that were released on Friday by Democrats in the House Oversight Committee.

The committee earlier embarked on a probe to evaluate whether the federal government mishandled its case against Epstein and co-conspirator Ghislaine Maxwell, who is serving a 20-year prison sentence following a 2022 conviction for recruiting teenage girls to be sexually abused by Epstein.

President Donald Trump had promised voters on the campaign trail that he would release government documents related to Epstein, who was arrested in the summer of 2019 on sex trafficking charges and died in a New York federal prison, reportedly by suicide, before trial.

However, Trump has refused to endorse the release of any Epstein files since returning to the White House in January, and Republicans in Congress have followed his lead, keeping the documents out of the public’s view.

Democrats on the committee on Friday released redacted pages from a new batch of files they obtained through their probe without giving their Republican peers advance notice. They were rebuked for the move.

In a statement on Friday, the committee said the batch included 8,544 documents produced in response to an August subpoena, adding, “Further review of the documents, which were redacted to protect the identity of victims, is ongoing.”

The latest batch of documents received by the committee from the Justice Department contained itineraries and notes by Epstein memorializing invitations he’d sent, trips he’d planned and meetings he’d booked with tech and business leaders.

Demonstrators gather for a press conference calling for the release of the Jeffrey Epstein files outside the United States Capitol on Wednesday, Sept. 3, 2025, in Washington, D.C.

The Washington Post | Getty Images

One of the itineraries indicated that Epstein expected Musk to make a trip to his private island in the U.S. Virgin Islands on Dec. 6, 2014, with a note asking, “is this still happening?”

Musk told Vanity Fair in 2019 that he had visited Epstein’s New York City mansion and that Epstein “tried repeatedly to get me to visit his island,” but the Tesla CEO had declined.

In June, Musk wrote in a post on X that he thought Trump and his administration were withholding Epstein-related files from public view in order to protect the president’s reputation.

“Time to drop the really big bomb: @realDonaldTrump is in the Epstein files,” Musk, who was in the midst of a public spat with the president, wrote at the time. “That is the real reason they have not been made public. Have a nice day, DJT!”

Trump was mentioned in previously released court documents from the Epstein case, but has not been formally accused of wrongdoing.

Musk started the year leading the Trump administration’s Department of Government Efficiency (DOGE), an effort to slash the size of the federal government and reduce the power of various regulatory agencies. He left DOGE in May, and he and the president proceeded to hurl insults at each other in public over a number of disagreements.

However, Trump and Musk remain close enough that they sat together at a memorial service for Charlie Kirk earlier this month after the right-wing activist was assassinated while speaking at a university in Utah.

The partially redacted files also indicated Epstein had breakfast with Bannon on Feb. 16, 2019, and lunch with investor Peter Thiel on Nov. 27, 2017. Bannon is a longtime Trump ally, and Thiel was a major backer of Trump ahead of the 2016 election who spoke at the Republican National Convention.

The files also mentioned that Epstein booked a “tentative breakfast party” with Microsoft co-founder Bill Gates, historically a supporter of Democrats, in December 2014.

Musk, Thiel, Bannon and Gates weren’t immediately available for comment.

WATCH: House Speaker Mike Johnson on Epstein files: We want the American people to see it


Trump calls for the firing of Lisa Monaco, Microsoft president of global affairs


U.S. Deputy Attorney General Lisa O. Monaco speaks as Attorney General Merrick Garland looks on after announcing an antitrust lawsuit against Live Nation Entertainment during a press conference at the Department of Justice in Washington, U.S., May 23, 2024. 

Ken Cedeno | Reuters

President Donald Trump on Friday demanded that Microsoft fire Lisa Monaco, an executive who served as deputy attorney general during the Biden administration.

The request appeared on Trump’s Truth Social account, which has 10 million followers. It came one day after former FBI Director James Comey was indicted, days after Trump had pushed for his prosecution.

“She is a menace to U.S. National Security, especially given the major contracts that Microsoft has with the United States Government,” Trump wrote in the post. “Because of Monaco’s many wrongful acts, the U.S. Government recently stripped her of all Security Clearances, took away all of her access to National Security Intelligence, and banned her from all Federal Properties.”

Microsoft declined to comment.

Parts of the U.S. government use Microsoft’s cloud infrastructure and productivity software. Earlier this month, Microsoft agreed to provide federal agencies with $3.1 billion in savings on cloud services over one year.

Earlier on Friday, Fox Business anchor Maria Bartiromo published an X post about Monaco joining Microsoft. The appointment happened in July, according to Monaco’s LinkedIn profile. The post contained a link to a July article on the University of Chicago law school’s website.

On Thursday, Microsoft said it would cut off cloud-based storage and artificial intelligence subscriptions to a unit of the Israeli military, after investigating a claim that the division had built a system to track Palestinians’ phone calls.

On Monday, Trump is set to meet with Benjamin Netanyahu, Israel’s prime minister, NBC News reported.

Microsoft CEO Satya Nadella attended a dinner alongside other technology executives at the White House earlier this month.

