
Get ready for a revolution. Crypto is on the verge of transforming finance, just like the internet revolutionized information.

"There's a lot of good that's bound to come to the world once we break the walled gardens of finance," said Jason Yanowitz, co-founder of Blockworks, the media brand that's driving forward the most important conversations in crypto.

In anticipation of crypto becoming the underlying technology of the entire financial market and the largest asset class in the world, Blockworks' team, led by Yanowitz and co-founder Michael Ippolito, raised $12 million at a $135 million valuation.


This round puts the company in a stronger position to expand and compete with the likes of The Wall Street Journal. Yanowitz explained that, unlike other media companies, his team is not using the money to buy trust; instead, it is building products at the bottom of the funnel.

Yanowitz's unconventional leadership tactics date back to 2017. While visiting Hungary, a region historically plagued by tyranny, he wanted to learn more about self-sovereign money like Bitcoin BTC/USD but could not find good sources.

Aiming to mend these information asymmetries, Yanowitz co-founded Blockworks in 2018. The startup initially focused on hosting conferences before moving into digital media, the opposite of how media companies are traditionally built, Yanowitz explained, noting that the first events they hosted were happy hours for the institutional and crypto crowds.

Eventually, Blockworks launched its Digital Asset Summit, one of the few crypto events where attendees still adhere to formal dress codes.

"The podcast network was our first step into digital media, and when the pandemic happened, we realized we needed to double down on the media side of the business to address certain gaps," Yanowitz said. "We added reporters from Bloomberg, CNBC, and The Wall Street Journal to help us create a world-class media company, launching the new media site in 2021."

The newsletter and podcast businesses, as well as partnerships with the likes of Bankless, fueled Blockworks' exponential growth throughout 2021. In anticipation of a broader slowdown heading into 2022, Yanowitz's team began building a platform for professional users. The recent raise, which included participation from 10T, Framework Ventures, and Santiago Santos, is helping accelerate growth while keeping the culture intact, Yanowitz said, noting many new users are likely to come from Asia, where there has been a reversal in regulation and an inflow of capital and investment.

"We don't want 100 million people to read Blockworks. We aim to reach a million of the most influential crypto executives and investors globally," Yanowitz said, noting he is willing to sacrifice page views and conference attendance to fulfill Blockworks' mission.

"Good content will win out in the long term," he said. "We're playing a very long-term game."


How a ‘nudify’ site turned a group of friends into key figures in a fight against AI-generated porn


The alarming rise of AI ‘nudify’ apps that create explicit images of real people

In June of last year, Jessica Guistolise received a text message that would change her life.

While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.

After a nearly two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos were used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.

Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago. 

CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.

Guistolise said that after talking to Jenny, she was desperate to cut her trip short and rush home.

In Minneapolis, the women's experiences would soon spark growing opposition to AI deepfake tools and those who use them.

One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.  

“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.

CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.

“It’s not something that I would wish for on anybody,” Guistolise said.

Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.

Jordan Wyatt | CNBC

Nudify apps represent a small but rapidly growing corner of the new AI universe, which exploded following the arrival of OpenAI’s ChatGPT in late 2022. Since then, Meta, Alphabet, Microsoft, Amazon and others have collectively spent hundreds of billions of dollars investing in AI and pursuing artificial general intelligence, or AGI — technology that could rival and even surpass the capabilities of humans. 

For consumers, most of the excitement to date has been around chatbots and image generators that allow users to perform complex tasks with simple text prompts. There’s also the burgeoning market of AI companions, and a host of agents designed to enhance productivity. 

But victims of nudify apps are experiencing the flip side of the AI boom. Thanks to generative AI, products such as DeepSwap are so easy to use — requiring no coding ability or technical expertise — that they can be accessed by just about anyone. Guistolise and others said they worry that it’s only a matter of time before the technology spreads widely, leaving many more people to suffer the consequences.

Guistolise filed a police report about the case and obtained a restraining order against Ben. But she and her friends quickly realized there was a problem with that strategy.

Ben’s actions may have been legal. 

The women involved weren’t underage. And as far as they were aware, the deepfakes hadn’t been distributed, existing only on Ben’s computer. While they feared that the videos and images were on a server somewhere and could end up in the hands of bad actors, there was nothing of that sort that they could pin on Ben. 

One of the other women involved was Molly Kelley, a law student who would spend the ensuing year helping the group navigate AI’s uncharted legal maze. 

“He did not break any laws that we’re aware of,” Kelley said, referring to Ben’s behavior. “And that is problematic.”

Ben admitted to creating the deepfakes, and told CNBC by email that he feels guilty and ashamed of his behavior.

Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable,” in an emailed statement.

“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not an issue that will resolve itself. We need stronger laws to ensure accountability — not only for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”

Readily available

Experts say that, like other new and simple-to-use AI tools, many apps offering nudify services advertise on Facebook and are available to download from the Apple App Store and Google Play Store.

Haley McNamara, senior vice president at the National Center on Sexual Exploitation, said nudify apps and sites have made it “very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”

Two photos of Molly Kelley’s face and one of Megan Hurley’s appear on a screenshot taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to make fake pornographic images and videos using the AI site DeepSwap, July 11, 2025.

A spokesperson from Meta, Facebook’s owner, said in a statement that the company has strict rules barring ads that contain nudity and sexual activity and that it shares information it learns about nudify services with other companies through an industrywide child-safety initiative. Meta characterized the nudify ecosystem as an adversarial space and said it’s improving its technology to try to prevent bad actors from running ads. 

Apple told CNBC that it regularly removes and rejects apps that violate its app store guidelines related to content deemed offensive, misleading and overtly sexual and pornographic. 

Google declined to comment.

The issue extends well beyond the U.S.

In June 2024, around the same time the women in Minnesota discovered what was happening, an Australian man was sentenced to nine years in prison for creating deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly created and distributed deepfake content of nearly 50 female classmates.

“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” said Mary Anne Franks, professor at the George Washington University Law School.

Security researchers from the University of Florida and Georgetown University wrote in a research paper presented in August that nudify tools are taking design cues from popular consumer apps and using familiar subscription models. DeepSwap charges users $19.99 a month for “premium” benefits, which include credits for AI video generation, faster processing and higher-quality images.

The researchers said the “nudification platforms have gone fully mainstream” and are “advertised on Instagram and hosted in app stores.”

Guistolise said she knew that people could use AI to create nonconsensual porn, but she didn’t realize how easy and accessible the apps were until she saw a synthetic version of herself participating in raunchy, explicit activity. 

According to the screenshots of Ben’s DeepSwap page, the faces of Guistolise and the other Minnesota women sit neatly in rows of eight, like in a school yearbook. Clicking on the photos, Jenny’s pictures show, leads to a collection of computer-generated clones engaged in a variety of sexual acts. The women’s faces had been merged with the nude bodies of other women.

DeepSwap’s privacy policy states that users have seven days to look at the content from the time they upload it to the site, and that the data is stored for that period on servers in Ireland. DeepSwap’s site says it deletes the data at that point, but users can download it in the interim onto their own computer. 

The site also has a terms of service page, which says users shouldn’t upload any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who provided no consent, it’s unclear whether DeepSwap has any enforcement mechanism. 

DeepSwap provides little publicly by way of contact information and didn’t reply to multiple CNBC requests for comment.

CNBC reporting found AI site DeepSwap, shown here, was used by a Minneapolis man to create fake pornographic images and videos depicting the faces of more than 80 of his friends and acquaintances.

In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote attributed to a person the release identified as CEO and co-founder Penyne Wu. The media contact on the release was listed as marketing manager Shawn Banks. 

CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response. 

DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong. 

Psychological trauma

Kelley, 42, found out about her inclusion in Ben’s AI portfolio after receiving a text message from Jenny. She invited Jenny over that afternoon.

After learning what happened, Kelley, who was six months pregnant at the time, said it took her hours to muster the strength to view the photos captured from Jenny’s phone. Kelley said what she saw was her face “very realistically on someone else’s body, in images and videos.” 

Kelley said her stress level spiked to a degree that it soon started to affect her health. Her doctor warned her that too much cortisol, brought on by stress, would cause her body not “to make any insulin,” Kelley recalled. 

“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report on the matter.

Kelley said that in Jenny’s photos she recognized some of her good friends, including many she knew from the service industry in Minneapolis. She said she then notified the women and purchased facial-recognition software to help identify the other victims so they could be informed. About half a dozen victims have yet to be identified, she said.

“It was incredibly time consuming and really stressful because I was trying to work,” she said. 

Victims of nudify tools can experience significant trauma, leading to suicidal thoughts, self-harm and difficulty trusting others, said Ari Ezra Waldman, a law professor at the University of California, Irvine, who testified at a 2024 House committee hearing on the harms of deepfakes.

Waldman said even when nudified images haven’t been posted publicly, subjects can fear that the images may eventually be shared, and “now someone has this dangling over their head like a sword of Damocles.” 

“Everyone is subject to being objectified or pornographied by everyone else,” he said. 

Three victims showed CNBC explicit, AI-created deepfake images depicting their faces as well as those of other women, during an interview in Minneapolis, Minnesota, on July 11, 2025.

Megan Hurley, 42, said she was trying to enjoy a cruise last summer off the western coast of Canada when she received an urgent text message from Kelley. Her vacation was ruined. 

Hurley described instant feelings of deep paranoia after returning home to Minneapolis. She said she had awkward conversations with an ex-boyfriend and other male friends, asking them to take screenshots if they ever saw AI-generated porn online that looked like her. 

“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the kinds of messages she sent at the time. “Because we’d be able to prove dissemination at that point.”

Hurley said she contacted the FBI but never heard back. She also filled out an online FBI crime report, which she shared with CNBC. The FBI confirmed that it received CNBC’s request for comment, but didn’t provide a response.

The group of women began searching for help from lawmakers. They were led to Minnesota state Sen. Erin Maye Quade, a Democrat who had previously sponsored a bill that became a state statute criminalizing the “nonconsensual dissemination of a deep fake depicting intimate parts or sexual acts.”  

Kelley landed a video call with the senator in early August 2024. 

In the virtual meeting, several women from the group told their stories, and explained their frustrations about the limited legal recourse available. Maye Quade went to work on a new bill, which she announced in February, that would compel AI companies to shut down apps using their technology to create nudify services. 

The bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.

Maye Quade told CNBC in an interview that the bill is the modern equivalent of longstanding laws that make it illegal for a person to peep into someone else’s window and snap explicit photos without consent. 

“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.

Minnesota state Sen. Erin Maye Quade, at left, talks to CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, about her efforts to pass state legislation that would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake image they generate in her state.

Jordan Wyatt | CNBC

But Maye Quade acknowledged that enforcing the law against companies based overseas presents a significant challenge. 

“This is why I think a federal response is more appropriate,” she said. “Because actually having a federal government, a country could take far more actions with companies that are based in other countries.”

Kelley, who gave birth to her son in September 2024, characterized one of her late October meetings with Maye Quade and the group as a “blur,” because she said she was “mentally and physically unwell due to sleep deprivation and stress.”

She said she now avoids social media. 

“I never announced the birth of my second child,” Kelley said. “There’s plenty of people out there who have no idea that I had a baby. I just didn’t want to put it online.”

The early days of deepfake pornography

The rise of deepfakes can be traced back to 2018. That’s when videos showing former President Barack Obama giving speeches that never existed and actor Jim Carrey, instead of Jack Nicholson, appearing in “The Shining” started going viral. 

Lawmakers sounded the alarm. Sites such as Pornhub and Reddit responded by pledging to take down nonconsensual content from their platforms. Reddit said at the time that it removed a large deepfake-related subreddit as part of an enforcement of a policy banning “involuntary pornography.”

The community congregated elsewhere. One popular place was MrDeepFakes, which hosted explicit AI-generated videos and served as an online discussion forum. 

By 2023, MrDeepFakes became the top deepfake site on the web, hosting 43,000 sexualized videos containing nearly 4,000 individuals, according to a 2025 study of the site by researchers from Stanford University and the University of California San Diego.

MrDeepFakes claimed to host only “celebrity” deepfakes, but the researchers found “that hundreds of targeted individuals have little to no online or public presence.” The researchers also discovered a burgeoning economy, with some users agreeing to create custom deepfakes for others at an average cost of $87.50 per video, the paper said.

Some ads for nudify services have gone to more mainstream locations. Alexios Mantzarlis, an AI security expert at Cornell Tech, earlier this year discovered more than 8,000 ads on the Meta ad library across Facebook and Instagram for a nudify service called CrushAI. 

AI apps and sites like Undress, DeepNude and CrushAI are some of the “nudify” tools that can be used to create fake pornographic images and videos depicting real people’s faces pulled from innocuous online photos.

Emily Park | CNBC

At least one DeepSwap ad ran on Instagram in October, according to the social media company’s ad library. The account associated with running the ad does not appear to be officially tied to DeepSwap, but Mantzarlis said he suspects the account could have been an affiliate partner of the nudify service.

Meta said it reviewed ads associated with the Instagram account in question and didn’t find any violations.

Top nudify services are often found on third-party affiliate sites such as ThePornDude that earn money by mentioning them, Mantzarlis said. 

In July, Mantzarlis co-authored a report analyzing 85 nudify services. The report found that the services receive 18.6 million monthly unique visitors in aggregate, though Mantzarlis said that figure doesn’t take into account people who share the content in places such as Discord and Telegram.

As a business, nudify services are a small part of the generative AI market. Mantzarlis estimates annual revenue of about $36 million, but he said that’s a conservative estimate that includes only AI-generated content from sites specifically promoting nudify services.

MrDeepFakes abruptly shut down in May, shortly after its key operator was publicly identified in a joint investigative report from Canada’s CBC News, Danish news sites Politiken and Tjekdet, and online investigative outlet Bellingcat.

CNBC reached out by email to the address that was associated with the person named as the operator in some materials from the CBC report, but received no reply. 

With MrDeepFakes going dark, Discord has emerged as an increasingly popular meeting spot, experts said. Known mostly for its use in the online gaming community, Discord has roughly 200 million global monthly active users who access its servers to discuss shared interests. 

CNBC identified several public Discord servers, including one associated with DeepSwap, where users appeared to be asking others in the forum to create sexualized deepfakes based on photos they shared. 

Leigh Cassidy Gibson, a researcher at the University of Florida, co-authored the 2025 paper that looked at “20 popular and easy-to-find nudification websites.” She confirmed to CNBC that while DeepSwap wasn’t named, it was one of the sites she and her colleagues studied to understand the market. More recently, she said, they’ve turned their attention to various Discord servers where users seek tutorials and how-to guides on creating AI-generated, sexual content.

Discord declined to comment.

‘It’s insane to me that this is legal right now’

At the federal level, the government has at least taken note. 

In May, President Donald Trump signed the “Take It Down Act” into law. The law bans online publication of nonconsensual sexual images and videos, including those that are inauthentic and generated by AI.

“A person who violates one of the publication offenses pertaining to depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the law’s text.

Experts told CNBC that the law still doesn’t address the central issue facing the Minnesota women, because there’s no evidence that the material was distributed online. 

Maye Quade’s bill in Minnesota emphasizes that the creation of the material is the core problem and requires a legal response. 

Some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.” 

Trump’s proposed spending bill earlier this year included a provision that would have deterred states from regulating AI for a 10-year period, at the risk of losing certain government subsidies related to AI infrastructure. The Senate struck the provision down in July, keeping it out of the bill Trump signed in August.

“I would not put it past them trying to resurrect the moratorium,” said Waldman, of UC Irvine, regarding the tech industry’s continued influence on AI policy.

A White House official told CNBC that the Take It Down Act, which was supported by the Trump administration and signed months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The official said the AI Action Plan encourages states to allow federal laws to override individual state laws.

San Francisco, home to OpenAI and other highly valued AI startups, can pursue civil cases against nudify services under California consumer protection laws. Last year the city sued 16 companies associated with nudify apps.

The San Francisco City Attorney’s office said in June that an investigation related to the lawsuits had led to 10 of the most-visited nudify websites being taken offline or no longer being accessible in California. One of the companies that was sued, Briver LLC, settled with the city and has agreed to pay $100,000 in civil penalties. Additionally, Briver no longer operates websites that can create nonconsensual deepfake pornography, the city attorney’s office said.

Jordan Wyatt | CNBC

Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the company behind CrushAI. Meta said that Joy Timeline attempted to “circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”

Still, Mantzarlis, who has been publishing his research on Indicator, said he continues to find nudify-related ads on Meta’s platforms. 

Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that ran on Facebook and Instagram since June 11, they wrote in a joint report on Sept. 10. Mantzarlis said Meta eventually removed the ads, some of which were more subtle than others in implying nudifying capabilities.  

Meta told CNBC earlier this month that it removed thousands of ads linked to companies offering nudify services and sent the entities cease-and-desist letters for violating the company’s ad guidelines.

In Minnesota, the group of friends is trying to get on with life while continuing to advocate for change.

Guistolise said she wants people to realize that AI is potentially being used to harm them in ways they never imagined.

“It’s so important that people know that this really is out there and it’s really accessible and it’s really easy to do, and it really needs to stop,” Guistolise said. “So here we are.”

Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.

NASA’s Astrobee Robots Gain New Capabilities via Arkisys Partnership


NASA has partnered with Arkisys to extend the Astrobee robotic mission aboard the ISS. The free-flying robots are set to support future exploration by performing spacecraft maintenance and assisting astronauts. The collaboration sustains a platform vital for testing new technologies in microgravity as NASA prepares missions to the Moon and Mars.

Deal time: Mercedes EQB gets the axe – and major markdowns


Mercedes-Benz is saying goodbye to its capable, seven-passenger EQB electric vehicle – but that doesn’t mean it’s over. If you’ve been eyeing a new, quasi-affordable SUV with nationwide dealer support and a luxury logo, the time is now.

German-language Mercedes fansite JESMB reports that Mercedes-Benz has removed the EQB from its dealer configurator page, and that the company’s Hungarian plant in Kecskemét will build only EQBs that have already been ordered until production of the new-look Mercedes GLB “with EQ technology” begins in 2026.

A quick search reveals that dealers are pushing hard to unload their existing stock of Mercedes EQBs. Mercedes-Benz of North Olmsted in Ohio (home of Benzs and Bowties’ Doug Horner), for example, recently advertised a new EQB with an MSRP of $59,300, offering a $9,000 manufacturer incentive plus a $4,744 dealer discount. That’s more than 23% off the EV’s original sticker price and, at $45,556, well below the $48,841 average transaction price for new vehicles in July.

MBZNO sold that car, and they’re not alone. CarsDirect has reported up to $14,500 in total Mercedes-Benz lease incentives for some EQB lease programs in select markets, while TrueCar reports an average savings of 15.6% (!) off MSRP.


For that money, Mercedes’ EQB customers get a capable, mid-sized SUV with room for five adults and two kids in (what my family has come to call) “the wayback” seats, 251 miles of EPA-rated range, and a 30-minute 10-80% charge time on a 100 kW DC fast charger. 0-60 mph and highway acceleration are adequate, ranging from a 6.0-second sprint in the EQB 350 models to 7-8 seconds in the 250+ and 300 models.

It’s still a tough sell


Mercedes EQB slasher sale; via ChatGPT.

Even with the discounts, there’s no escaping the fact that EVs from brands like Chevy, Ford, Hyundai, and Kia have objectively eclipsed the EQB in terms of range, performance, and charging speeds.

That said, the three-pointed star still means something to a lot of buyers. If they can look beyond the specs and take the EQB for a test drive, they might find that the signature Mercedes-Benz feel indeed lives in this well-rounded electric SUV, one that will probably handle everything they throw at it. Plus, with the $7,500 Federal EV Tax Credit set to expire on September 30th, the current deals on this electric SUV might be as good as it gets!

SOURCES: CarsDirect, TrueCar, JESMB; featured image by MBUSA.


