ChatGPT users are grumbling that the popular chatbot has gotten lazier, refusing to complete some complex requests, and its creator OpenAI is investigating the strange behavior.
Social media platforms such as X, Reddit and even OpenAI's developer forum are riddled with reports that ChatGPT, a large language model trained on massive troves of internet data, is resisting labor-intensive prompts, such as requests to help the user write code or transcribe blocks of text.
The strange trend has emerged as Microsoft-backed OpenAI faces stiff competition from other firms pursuing generative artificial intelligence products, including Google, which recently released its own Gemini chatbot tool.
Late last month, X user and startup executive Matt Wensing posted screenshots of an exchange in which he asked ChatGPT to list all weeks between now and May 5, 2024.
The chatbot initially refused, claiming that it can't generate an exhaustive list of each individual week, but eventually complied when Wensing insisted.
"GPT has definitely gotten more resistant to doing tedious work," Wensing wrote on Nov. 27. "Essentially giving you part of the answer and then telling you to do the rest."
In a second example, Wensing asked ChatGPT to generate a few dozen lines of code. Instead, the chatbot provided him with a template and told him to follow that pattern to complete the task himself.
In a follow-up post, Wensing added: "Another example where I asked it to extend some code. It would have had to generate perhaps 50 lines. It told me to do it instead."
In another viral instance, ChatGPT refused a user's request to transcribe text included in a photo from a page of a book.
OpenAI officials confirmed last Thursday that they were examining the situation.
"We've heard all your feedback about GPT4 getting lazier! We haven't updated the model since Nov 11th, and this certainly isn't intentional. Model behavior can be unpredictable, and we're looking into fixing it," the company wrote.
The company later clarified its point, stating that "the idea is not that the model has somehow changed itself since Nov 11th."
"It's just that differences in model behavior can be subtle — only a subset of prompts may be degraded, and it may take a long time for customers and employees to notice and fix these patterns," the company said.
ChatGPT and other chatbots have swelled in popularity since last year. However, experts have raised concerns about their tendency to hallucinate, or spit out false and inaccurate information, such as a recent incident in which ChatGPT and Google's Bard chatbot each falsely claimed that Israel and Hamas had reached a ceasefire agreement when no deal had occurred at the time.
A recent study performed by researchers at Long Island University found that ChatGPT incorrectly responded to some 75% of questions about prescription drug usage and gave some responses that would have caused harm to patients.
OpenAI has contended with internal turmoil following the surprise firing and rehiring of Sam Altman as its CEO. A report last week said senior OpenAI employees had raised concerns that Altman was psychologically abusive to staffers before his initial firing.
ChatGPT remains the most popular chatbot of its kind, reaching more than 100 million active weekly users as of November, according to Altman.

Technology
How a 'nudify' site turned a group of friends into key figures in a fight against AI-generated porn
September 27, 2025
In June of last year, Jessica Guistolise received a text message that would change her life.
While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.
After a nearly two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos were used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.
Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago.
CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.
Guistolise said that after talking to Jenny, she was desperate to cut her trip short and rush home.
In Minneapolis the women’s experiences would soon spark a growing opposition to AI deepfake tools and those who use them.
One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.
“The first time I saw the actual images, I think something inside me shifted, like fundamentally changed,” said Guistolise, 42.
CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.
“It’s not something that I would wish for on anybody,” Guistolise said.
Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.
Jordan Wyatt | CNBC
Nudify apps represent a small but rapidly growing corner of the new AI universe, which exploded following the arrival of OpenAI’s ChatGPT in late 2022. Since then, Meta, Alphabet, Microsoft, Amazon and others have collectively spent hundreds of billions of dollars investing in AI and pursuing artificial general intelligence, or AGI — technology that could rival and even surpass the capabilities of humans.
For consumers, most of the excitement to date has been around chatbots and image generators that allow users to perform complex tasks with simple text prompts. There’s also the burgeoning market of AI companions, and a host of agents designed to enhance productivity.
But victims of nudify apps are experiencing the flip side of the AI boom. Thanks to generative AI, products such as DeepSwap are so easy to use — requiring no coding ability or technical expertise — that they can be accessed by just about anyone. Guistolise and others said they worry that it’s only a matter of time before the technology spreads widely, leaving many more people to suffer the consequences.
Guistolise filed a police report about the case and obtained a restraining order against Ben. But she and her friends quickly realized there was a problem with that strategy.
Ben’s actions may have been legal.
The women involved weren’t underage. And as far as they were aware, the deepfakes hadn’t been distributed, existing only on Ben’s computer. While they feared that the videos and images were on a server somewhere and could end up in the hands of bad actors, there was nothing of that sort that they could pin on Ben.
One of the other women involved was Molly Kelley, a law student who would spend the ensuing year helping the group navigate AI’s uncharted legal maze.
“He did not break any laws that we’re aware of,” Kelley said, referring to Ben’s behavior. “And that is problematic.”
Ben admitted to creating the deepfakes, and told CNBC by email that he feels guilty and ashamed of his behavior.
Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable,” in an emailed statement.
“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not an issue that will resolve itself. We need stronger laws to ensure accountability — not only for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”
Readily available
Experts say that, like other new and simple-to-use AI tools, many apps offering nudify services are advertised on Facebook and are available to download from the Apple App Store and Google Play Store.
Haley McNamara, senior vice president at the National Center on Sexual Exploitation, said nudify apps and sites have made it “very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”
Two photos of Molly Kelley’s face and one of Megan Hurley’s appear on a screenshot taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to make fake pornographic images and videos using the AI site DeepSwap, July 11, 2025.
A spokesperson from Meta, Facebook’s owner, said in a statement that the company has strict rules barring ads that contain nudity and sexual activity and that it shares information it learns about nudify services with other companies through an industrywide child-safety initiative. Meta characterized the nudify ecosystem as an adversarial space and said it’s improving its technology to try to prevent bad actors from running ads.
Apple told CNBC that it regularly removes and rejects apps that violate its app store guidelines related to content deemed offensive, misleading and overtly sexual and pornographic.
Google declined to comment.
The issue extends well beyond the U.S.
In June 2024, around the same time the women in Minnesota discovered what was happening, an Australian man was sentenced to nine years in prison for creating deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly created and distributed deepfake content of nearly 50 female classmates.
“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” said Mary Anne Franks, professor at the George Washington University Law School.
Security researchers from the University of Florida and Georgetown University wrote in a research paper presented in August that nudify tools are taking design cues from popular consumer apps and using familiar subscription models. DeepSwap charges users $19.99 a month to access "premium" benefits, which include credits that can be used for AI video generation, faster processing and higher-quality images.
The researchers said the “nudification platforms have gone fully mainstream” and are “advertised on Instagram and hosted in app stores.”
Guistolise said she knew that people could use AI to create nonconsensual porn, but she didn’t realize how easy and accessible the apps were until she saw a synthetic version of herself participating in raunchy, explicit activity.
According to the screenshots of Ben’s DeepSwap page, the faces of Guistolise and the other Minnesota women sit neatly in rows of eight, like in a school yearbook. Clicking on the photos, Jenny’s pictures show, leads to a collection of computer-generated clones engaged in a variety of sexual acts. The women’s faces had been merged with the nude bodies of other women.
DeepSwap’s privacy policy states that users have seven days to look at the content from the time they upload it to the site, and that the data is stored for that period on servers in Ireland. DeepSwap’s site says it deletes the data at that point, but users can download it in the interim onto their own computer.
The site also has a terms of service page, which says users shouldn’t upload any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who provided no consent, it’s unclear whether DeepSwap has any enforcement mechanism.
DeepSwap provides little publicly by way of contact information and didn’t reply to multiple CNBC requests for comment.
CNBC reporting found AI site DeepSwap, shown here, was used by a Minneapolis man to create fake pornographic images and videos depicting the faces of more than 80 of his friends and acquaintances.
In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote attributed to a person the release identified as CEO and co-founder Penyne Wu. The media contact on the release was listed as marketing manager Shawn Banks.
CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”
However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong.
Psychological trauma
Kelley, 42, found out about her inclusion in Ben’s AI portfolio after receiving a text message from Jenny. She invited Jenny over that afternoon.
After learning what happened, Kelley, who was six months pregnant at the time, said it took her hours to muster the strength to view the photos captured from Jenny’s phone. Kelley said what she saw was her face “very realistically on someone else’s body, in images and videos.”
Kelley said her stress level spiked to a degree that it soon started to affect her health. Her doctor warned her that too much cortisol, brought on by stress, would cause her body not “to make any insulin,” Kelley recalled.
“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report on the matter.
Kelley said that in Jenny's photos she recognized some of her good friends, including many she knew from the service industry in Minneapolis. She said she then notified those women and purchased facial-recognition software to help identify the other victims so they could be informed. About half a dozen victims have yet to be identified, she said.
“It was incredibly time consuming and really stressful because I was trying to work,” she said.
Victims of nudify tools can experience significant trauma, leading to suicidal thoughts, self-harm and difficulty trusting others, said Ari Ezra Waldman, a law professor at the University of California, Irvine, who testified at a 2024 House committee hearing on the harms of deepfakes.
Waldman said even when nudified images haven’t been posted publicly, subjects can fear that the images may eventually be shared, and “now someone has this dangling over their head like a sword of Damocles.”
“Everyone is subject to being objectified or pornographied by everyone else,” he said.
Three victims showed CNBC explicit, AI-created deepfake images depicting their faces as well as those of other women, during an interview in Minneapolis, Minnesota, on July 11, 2025.
Megan Hurley, 42, said she was trying to enjoy a cruise last summer off the western coast of Canada when she received an urgent text message from Kelley. Her vacation was ruined.
Hurley described instant feelings of deep paranoia after returning home to Minneapolis. She said she had awkward conversations with an ex-boyfriend and other male friends, asking them to take screenshots if they ever saw AI-generated porn online that looked like her.
“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the kinds of messages she sent at the time. “Because we’d be able to prove dissemination at that point.”
Hurley said she contacted the FBI but never heard back. She also filled out an online FBI crime report, which she shared with CNBC. The FBI confirmed that it received CNBC’s request for comment, but didn’t provide a response.
The group of women began searching for help from lawmakers. They were led to Minnesota state Sen. Erin Maye Quade, a Democrat who had previously sponsored a bill that became a state statute criminalizing the “nonconsensual dissemination of a deep fake depicting intimate parts or sexual acts.”
Kelley landed a video call with the senator in early August 2024.
In the virtual meeting, several women from the group told their stories, and explained their frustrations about the limited legal recourse available. Maye Quade went to work on a new bill, which she announced in February, that would compel AI companies to shut down apps using their technology to create nudify services.
The bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.
Maye Quade told CNBC in an interview that the bill is the modern equivalent of longstanding laws that make it illegal for a person to peep into someone else’s window and snap explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.
Minnesota state Sen. Erin Maye Quade, at left, talks to CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, about her efforts to pass state legislation that would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake image they generate in her state.
Jordan Wyatt | CNBC
But Maye Quade acknowledged that enforcing the law against companies based overseas presents a significant challenge.
“This is why I think a federal response is more appropriate,” she said. “Because actually having a federal government, a country could take far more actions with companies that are based in other countries.”
Kelley, who gave birth to her son in September 2024, characterized one of her late October meetings with Maye Quade and the group as a “blur,” because she said she was “mentally and physically unwell due to sleep deprivation and stress.”
She said she now avoids social media.
“I never announced the birth of my second child,” Kelley said. “There’s plenty of people out there who have no idea that I had a baby. I just didn’t want to put it online.”
The early days of deepfake pornography
The rise of deepfakes can be traced back to 2018. That’s when videos showing former President Barack Obama giving speeches that never existed and actor Jim Carrey, instead of Jack Nicholson, appearing in “The Shining” started going viral.
Lawmakers sounded the alarm. Sites such as Pornhub and Reddit responded by pledging to take down nonconsensual content from their platforms. Reddit said at the time that it removed a large deepfake-related subreddit as part of an enforcement of a policy banning “involuntary pornography.”
The community congregated elsewhere. One popular place was MrDeepFakes, which hosted explicit AI-generated videos and served as an online discussion forum.
By 2023, MrDeepFakes had become the top deepfake site on the web, hosting 43,000 sexualized videos depicting nearly 4,000 individuals, according to a 2025 study of the site by researchers from Stanford University and the University of California San Diego.
MrDeepFakes claimed to host only “celebrity” deepfakes, but the researchers found “that hundreds of targeted individuals have little to no online or public presence.” The researchers also discovered a burgeoning economy, with some users agreeing to create custom deepfakes for others at an average cost of $87.50 per video, the paper said.
Some ads for nudify services have gone to more mainstream locations. Alexios Mantzarlis, an AI security expert at Cornell Tech, earlier this year discovered more than 8,000 ads on the Meta ad library across Facebook and Instagram for a nudify service called CrushAI.
AI apps and sites like Undress, DeepNude and CrushAI are some of the “nudify” tools that can be used to create fake pornographic images and videos depicting real people’s faces pulled from innocuous online photos.
Emily Park | CNBC
At least one DeepSwap ad ran on Instagram in October, according to the social media company’s ad library. The account associated with running the ad does not appear to be officially tied to DeepSwap, but Mantzarlis said he suspects the account could have been an affiliate partner of the nudify service.
Meta said it reviewed ads associated with the Instagram account in question and didn’t find any violations.
Top nudify services are often found on third-party affiliate sites such as ThePornDude that earn money by mentioning them, Mantzarlis said.
In July, Mantzarlis co-authored a report analyzing 85 nudify services. The report found that the services receive 18.6 million monthly unique visitors in aggregate, though Mantzarlis said that figure doesn’t take into account people who share the content in places such as Discord and Telegram.
As a business, nudify services are a small part of the generative AI market. Mantzarlis estimates annual revenue of about $36 million, but he said that’s a conservative prediction and includes only AI-generated content from sites that specifically promote nudify services.
MrDeepFakes abruptly shut down in May, shortly after its key operator was publicly identified in a joint investigative report from Canada’s CBC News, Danish news sites Politiken and Tjekdet, and online investigative outlet Bellingcat.
CNBC reached out by email to the address that was associated with the person named as the operator in some materials from the CBC report, but received no reply.
With MrDeepFakes going dark, Discord has emerged as an increasingly popular meeting spot, experts said. Known mostly for its use in the online gaming community, Discord has roughly 200 million global monthly active users who access its servers to discuss shared interests.
CNBC identified several public Discord servers, including one associated with DeepSwap, where users appeared to be asking others in the forum to create sexualized deepfakes based on photos they shared.
Leigh Cassidy Gibson, a researcher at the University of Florida, co-authored the 2025 paper that looked at “20 popular and easy-to-find nudification websites.” She confirmed to CNBC that while DeepSwap wasn’t named, it was one of the sites she and her colleagues studied to understand the market. More recently, she said, they’ve turned their attention to various Discord servers where users seek tutorials and how-to guides on creating AI-generated, sexual content.
Discord declined to comment.
‘It’s insane to me that this is legal right now’
At the federal level, the government has at least taken note.
In May, President Donald Trump signed the "Take It Down Act" into law; it goes fully into effect in May 2026. The law bans online publication of nonconsensual sexual images and videos, including those that are inauthentic and generated by AI.
“A person who violates one of the publication offenses pertaining to depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the law’s text.
Experts told CNBC that the law still doesn’t address the central issue facing the Minnesota women, because there’s no evidence that the material was distributed online.
Maye Quade’s bill in Minnesota emphasizes that the creation of the material is the core problem and requires a legal response.
Some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.”
As part of Trump’s proposed spending bill earlier this year, states would have been deterred from regulating AI for a 10-year period or risk losing certain government subsidies related to AI infrastructure. The Senate struck down that provision in July, keeping it out of the bill Trump signed in August.
“I would not put it past them trying to resurrect the moratorium,” said Waldman, of UC Irvine, regarding the tech industry’s continued influence on AI policy.
A White House official told CNBC that the Take It Down Act, which was supported by the Trump administration and signed months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The official said the AI Action Plan encourages states to allow federal laws to override individual state laws.
In San Francisco, home to OpenAI and other highly valued AI startups, the city can pursue civil cases against nudify services under California consumer protection laws. Last year San Francisco sued 16 companies associated with nudify apps.
The San Francisco City Attorney’s office said in June that an investigation related to the lawsuits had led to 10 of the most-visited nudify websites being taken offline or no longer being accessible in California. One of the companies that was sued, Briver LLC, settled with the city and has agreed to pay $100,000 in civil penalties. Additionally, Briver no longer operates websites that can create nonconsensual deepfake pornography, the city attorney’s office said.
Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the company behind CrushAI. Meta said that Joy Timeline attempted to “circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”
Still, Mantzarlis, who has been publishing his research on Indicator, said he continues to find nudify-related ads on Meta’s platforms.
Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that ran on Facebook and Instagram since June 11, they wrote in a joint report on Sept. 10. Mantzarlis said Meta eventually removed the ads, some of which were more subtle than others in implying nudifying capabilities.
Meta told CNBC earlier this month that it removed thousands of ads linked to companies offering nudify services and sent the entities cease-and-desist letters for violating the company's ad guidelines.
In Minnesota, the friends are trying to get on with their lives while continuing to advocate for change.
Guistolise said she wants people to realize that AI is potentially being used to harm them in ways they never imagined.
“It’s so important that people know that this really is out there and it’s really accessible and it’s really easy to do, and it really needs to stop,” Guistolise said. “So here we are.”
Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.
Science
NASA's Astrobee Robots Gain New Capabilities via Arkisys Partnership
September 27, 2025
Environment
Deal time: Mercedes EQB gets the axe – and major markdowns
September 27, 2025
Mercedes-Benz is saying goodbye to its capable, seven-passenger EQB electric vehicle – but that doesn’t mean it’s over. If you’ve been eyeing a new, quasi-affordable SUV with nationwide dealer support and a luxury logo, the time is now.
German-language Mercedes fansite JESMB reports that Mercedes-Benz has removed the EQB from its dealer configurator page, and that the company's Hungarian plant in Kecskemét will build only EQBs that have already been ordered until production of the new-look Mercedes GLB "with EQ technology" begins in 2026.
A quick search reveals that dealers are pushing hard to unload their existing stock of Mercedes EQBs. Mercedes-Benz of North Olmsted in Ohio (home of Benzs and Bowties’ Doug Horner), for example, recently advertised a new EQB with an MSRP of $59,300 with a $9,000 manufacturer incentive plus a $4,744 dealer discount. That’s more than 23% off the EV’s original sticker price and, at $45,556, is well below the $48,841 average transaction price for new vehicles in July.
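For readers who want to double-check those numbers, here is a minimal sketch of the discount arithmetic, assuming only the figures quoted above (the variable names are illustrative, not official Mercedes pricing data):

```python
# Worked example of the EQB deal math quoted above (illustrative only).
msrp = 59_300                  # advertised MSRP
manufacturer_incentive = 9_000
dealer_discount = 4_744

net_price = msrp - manufacturer_incentive - dealer_discount
percent_off = (msrp - net_price) / msrp * 100

print(f"Net price: ${net_price:,}")              # $45,556
print(f"Discount: {percent_off:.1f}% off MSRP")  # roughly 23.2%
```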
MBZNO sold that car, and they're not alone. CarsDirect has reported up to $14,500 in total Mercedes-Benz lease incentives for some EQB lease programs in select markets, while TrueCar reports an average 15.6% savings (!) off MSRP.
For that money, Mercedes' EQB customers get a capable, mid-sized SUV with room for five adults and two kids in (what my family has come to call) "the wayback" seats, 251 miles of EPA-rated range and a 30-minute 10-80% charge time on a 100 kW DC fast charger. 0-60 mph performance and highway acceleration are adequate, ranging from a 6.0-second sprint in the EQB 350 models to 7-8 seconds in the 250+ and 300 models.
It’s still a tough sell

Even with the discounts, there’s no escaping the fact that EVs from brands like Chevy, Ford, Hyundai, and Kia have objectively eclipsed the EQB in terms of range, performance, and charging speeds.
That said, the three-pointed star still means something to a lot of buyers. If they can look beyond the specs and take the EQB for a test drive, they might find that the signature Mercedes-Benz feel indeed lives in this well-rounded electric SUV, and that it will probably be able to handle everything they throw at it. Plus, with the $7,500 Federal EV Tax Credit set to expire on September 30th, the current deals on this electric SUV might be as good as it gets!
SOURCES: CarsDirect, TrueCar, JESMB; featured image by MBUSA.
