Mark Zuckerberg, chief executive officer of Meta Platforms Inc., left, arrives at federal court in San Jose, California, US, on Tuesday, Dec. 20, 2022.
David Paul Morris | Bloomberg | Getty Images
Toward the end of 2022, engineers on Meta’s team combating misinformation were ready to debut a key fact-checking tool that had taken half a year to build. The company needed all the reputational help it could get after a string of crises had badly damaged the credibility of Facebook and Instagram and given regulators additional ammunition to bear down on the platforms.
The new product would let third-party fact-checkers like The Associated Press and Reuters, as well as credible experts, add comments at the top of questionable articles on Facebook as a way to verify their trustworthiness.
But CEO Mark Zuckerberg’s commitment to make 2023 the “year of efficiency” spelled the end of the ambitious effort, according to three people familiar with the matter who asked not to be named due to confidentiality agreements.
Over multiple rounds of layoffs, Meta announced plans to eliminate roughly 21,000 jobs, a mass downsizing that had an outsized effect on the company’s trust and safety work. The fact-checking tool, which had initial buy-in from executives and was still in a testing phase early this year, was completely dissolved, the sources said.
A Meta spokesperson did not respond to questions related to job cuts in specific areas and said in an emailed statement that “we remain focused on advancing our industry-leading integrity efforts and continue to invest in teams and technologies to protect our community.”
Across the tech industry, as companies tighten their belts and impose hefty layoffs to address macroeconomic pressures and slowing revenue growth, wide swaths of people tasked with protecting the internet’s most-populous playgrounds are being shown the exits. The cuts come at a time of increased cyberbullying, which has been linked to higher rates of adolescent self-harm, and as the spread of misinformation and violent content collides with the exploding use of artificial intelligence.
In their most recent earnings calls, tech executives highlighted their commitment to “do more with less,” boosting productivity with fewer resources. Meta, Alphabet, Amazon and Microsoft have all cut thousands of jobs after staffing up rapidly before and during the Covid pandemic. Microsoft CEO Satya Nadella recently said his company would suspend salary increases for full-time employees.
The slashing of teams tasked with trust and safety and AI ethics is a sign of how far companies are willing to go to meet Wall Street demands for efficiency, even with the 2024 U.S. election season — and the online chaos that’s expected to ensue — just months away from kickoff. AI ethics and trust and safety are different departments within tech companies but are aligned on goals related to limiting real-life harm that can stem from use of their companies’ products and services.
“Abuse actors are usually ahead of the game; it’s cat and mouse,” said Arjun Narayan, who previously served as a trust and safety lead at Google and TikTok parent ByteDance, and is now head of trust and safety at news aggregator app Smart News. “You’re always playing catch-up.”
For now, tech companies seem to view both trust and safety and AI ethics as cost centers.
Twitter effectively disbanded its ethical AI team in November and laid off all but one of its members, along with 15% of its trust and safety department, according to reports. In February, Google cut about one-third of a unit that aims to protect society from misinformation, radicalization, toxicity and censorship. Meta reportedly ended the contracts of about 200 content moderators in early January. It also laid off at least 16 members of Instagram’s well-being group and more than 100 positions related to trust, integrity and responsibility, according to documents filed with the U.S. Department of Labor.
Andy Jassy, chief executive officer of Amazon.Com Inc., during the GeekWire Summit in Seattle, Washington, U.S., on Tuesday, Oct. 5, 2021.
David Ryder | Bloomberg | Getty Images
In March, Amazon downsized its responsible AI team and Microsoft laid off its entire ethics and society team – the second of two layoff rounds that reportedly took the team from 30 members to zero. Amazon didn’t respond to a request for comment, and Microsoft pointed to a blog post regarding its job cuts.
At Amazon’s game streaming unit Twitch, staffers learned of their fate in March from an ill-timed internal post from Amazon CEO Andy Jassy.
Jassy’s announcement that 9,000 jobs would be cut companywide included 400 employees at Twitch. Of those, about 50 were part of the team responsible for monitoring abusive, illegal or harmful behavior, according to people familiar with the matter who spoke on the condition of anonymity because the details were private.
The trust and safety team, or T&S as it’s known internally, was losing about 15% of its staff just as content moderation was seemingly more important than ever.
In an email to employees, Twitch CEO Dan Clancy didn’t call out the T&S department specifically, but he confirmed the broader cuts among his staffers, who had just learned about the layoffs from Jassy’s post on a message board.
“I’m disappointed to share the news this way before we’re able to communicate directly to those who will be impacted,” Clancy wrote in the email, which was viewed by CNBC.
‘Hard to win back consumer trust’
A current member of Twitch’s T&S team said the remaining employees in the unit are feeling “whiplash” and worry about a potential second round of layoffs. The person said the cuts caused a big hit to institutional knowledge, adding that there was a significant reduction in Twitch’s law enforcement response team, which deals with physical threats, violence, terrorism groups and self-harm.
A Twitch spokesperson did not provide a comment for this story, instead directing CNBC to a blog post from March announcing the layoffs. The post didn’t include any mention of trust and safety or content moderation.
Narayan of Smart News said that with a lack of investment in safety at the major platforms, companies lose their ability to scale in a way that keeps pace with malicious activity. As more problematic content spreads, there’s an “erosion of trust,” he said.
“In the long run, it’s really hard to win back consumer trust,” Narayan added.
While layoffs at Meta and Amazon followed demands from investors and a dramatic slump in ad revenue and share prices, Twitter’s cuts resulted from a change in ownership.
Almost immediately after Elon Musk closed his $44 billion purchase of Twitter in October, he began eliminating thousands of jobs. That included all but one member of the company’s 17-person AI ethics team, according to Rumman Chowdhury, who served as director of Twitter’s machine learning ethics, transparency and accountability team. The last remaining person ended up quitting.
The team members learned of their status when their laptops were turned off remotely, Chowdhury said. Hours later, they received email notifications.
“I had just recently gotten head count to build out my AI red team, so these would be the people who would adversarially hack our models from an ethical perspective and try to do that work,” Chowdhury told CNBC. She added, “It really just felt like the rug was pulled as my team was getting into our stride.”
Part of that stride involved working on “algorithmic amplification monitoring,” Chowdhury said, or tracking elections and political parties to see if “content was being amplified in a way that it shouldn’t.”
Chowdhury referenced an initiative in July 2021, when Twitter’s AI ethics team led what was billed as the industry’s first-ever algorithmic bias bounty competition. The company invited outsiders to audit the platform for bias, and made the results public.
Chowdhury said she worries that now Musk “is actively seeking to undo all the work we have done.”
“There is no internal accountability,” she said. “We served two of the product teams to make sure that what’s happening behind the scenes was serving the people on the platform equitably.”
Twitter did not provide a comment for this story.
Advertisers are pulling back in places where they see increased reputational risk.
According to Sensor Tower, six of the top 10 categories of U.S. advertisers on Twitter spent much less in the first quarter of this year compared with a year earlier, with that group collectively slashing its spending by 53%. The site has recently come under fire for allowing the spread of violent images and videos.
The rapid rise in popularity of chatbots is only complicating matters. The types of AI models created by OpenAI, the company behind ChatGPT, and others make it easier to populate fake accounts with content. Researchers from the Allen Institute for AI, Princeton University and Georgia Tech ran tests using ChatGPT’s application programming interface (API) and found up to a sixfold increase in toxicity, depending on which type of functional identity, such as a customer service agent or virtual assistant, a company assigned to the chatbot.
Regulators are paying close attention to AI’s growing influence and the simultaneous downsizing of groups dedicated to AI ethics and trust and safety. Michael Atleson, an attorney at the Federal Trade Commission’s division of advertising practices, called out the paradox in a blog post earlier this month.
“Given these many concerns about the use of new AI tools, it’s perhaps not the best time for firms building or deploying them to remove or fire personnel devoted to ethics and responsibility for AI and engineering,” Atleson wrote. “If the FTC comes calling and you want to convince us that you adequately assessed risks and mitigated harms, these reductions might not be a good look.”
Meta as a bellwether
For years, as the tech industry was enjoying an extended bull market and the top internet platforms were flush with cash, Meta was viewed by many experts as a leader in prioritizing ethics and safety.
The company spent years hiring trust and safety workers, including many with academic backgrounds in the social sciences, to help avoid a repeat of the 2016 presidential election cycle, when disinformation campaigns, often operated by foreign actors, ran rampant on Facebook. The embarrassment culminated in the 2018 Cambridge Analytica scandal, which exposed how a third party was illicitly using personal data from Facebook.
But following a brutal 2022 for Meta’s ad business — and its stock price — Zuckerberg went into cutting mode, winning plaudits along the way from investors who had complained of the company’s bloat.
Beyond the fact-checking project, the layoffs hit researchers, engineers, user design experts and others who worked on issues pertaining to societal concerns. The company’s dedicated team focused on combating misinformation suffered numerous losses, four former Meta employees said.
Prior to Meta’s first round of layoffs in November, the company had already taken steps to consolidate members of its integrity team into a single unit. In September, Meta merged its central integrity team, which handles social matters, with its business integrity group tasked with addressing ads and business-related issues like spam and fake accounts, ex-employees said.
In the ensuing months, as broader cuts swept across the company, former trust and safety employees described working under the fear of looming layoffs and for managers who sometimes failed to see how their work affected Meta’s bottom line.
For example, projects that required fewer resources, such as improving spam filters, could get clearance over long-term safety projects that would entail policy changes, such as initiatives involving misinformation. Employees felt incentivized to take on more manageable tasks because they could show results in their six-month performance reviews, ex-staffers said.
Ravi Iyer, a former Meta project manager who left the company before the layoffs, said that the cuts across content moderation are less bothersome than the fact that many of the people he knows who lost their jobs were performing critical roles on design and policy changes.
“I don’t think we should reflexively think that having fewer trust and safety workers means platforms will necessarily be worse,” said Iyer, who’s now the managing director of the Psychology of Technology Institute at University of Southern California’s Neely Center. “However, many of the people I’ve seen laid off are amongst the most thoughtful in rethinking the fundamental designs of these platforms, and if platforms are not going to invest in reconsidering design choices that have been proven to be harmful — then yes, we should all be worried.”
A Meta spokesperson previously downplayed the significance of the job cuts in the misinformation unit, tweeting that the “team has been integrated into the broader content integrity team, which is substantially larger and focused on integrity work across the company.”
Still, sources familiar with the matter said that following the layoffs, the company has fewer people working on misinformation issues.
For those who’ve gained expertise in AI ethics, trust and safety and related content moderation, the employment picture looks grim.
Newly unemployed workers in those fields from across the social media landscape told CNBC that there aren’t many job openings in their area of specialization as companies continue to trim costs. One former Meta employee said that after interviewing for trust and safety roles at Microsoft and Google, those positions were suddenly axed.
An ex-Meta staffer said the company’s retreat from trust and safety is likely to filter down to smaller peers and startups that appear to be “following Meta in terms of their layoff strategy.”
Chowdhury, Twitter’s former AI ethics lead, said these types of jobs are a natural place for cuts because “they’re not seen as driving profit in product.”
“My perspective is that it’s completely the wrong framing,” she said. “But it’s hard to demonstrate value when your value is that you’re not being sued or someone is not being harmed. We don’t have a shiny widget or a fancy model at the end of what we do; what we have is a community that’s safe and protected. That is a long-term financial benefit, but in the quarter over quarter, it’s really hard to measure what that means.”
At Twitch, the T&S team included people who knew where to look to spot dangerous activity, according to a former employee in the group. That’s particularly important in gaming, which is “its own unique beast,” the person said.
Now, there are fewer people checking in on the “dark, scary places” where offenders hide and abusive activity gets groomed, the ex-employee added.
More importantly, nobody knows how bad it can get.
French accounting software firm Pennylane has doubled its valuation to 2 billion euros ($2.16 billion) in a new 75 million euro funding round.
Pennylane told CNBC that it raised the fresh funds from a host of venture funds, with Sequoia Capital leading the round and Alphabet’s CapitalG, Meritech and DST Global also participating.
Founded in 2020, Pennylane sells what it calls an “all-in-one” accounting platform that’s used by accountants and other financial professionals.
The platform is primarily targeted toward small to medium-sized firms, offering tools for functions spanning expensing, invoicing, cash flow management and financial forecasting.
“We came in tailoring a product that looks a bit like [Intuit’s] QuickBooks or Xero but adapting it to the needs of continental accountants, starting with France,” Pennylane’s CEO and co-founder Arthur Waller told CNBC.
Pennylane currently serves around 4,500 accounting firms and more than 350,000 small and medium-sized enterprises. The startup was previously valued at 1 billion euros in a 2024 investment round.
European expansion
For now, Pennylane only operates in France. However, after the new fundraise, the startup now plans to expand its services across Europe — starting with Germany in the summer.
“It’s going to be a lot of work. It took us approximately five years to have a product mature in France,” Waller said, adding that he hopes to reach product maturity in Germany in a shorter period of about two years.
Pennylane plans to end the year on about 100 million euros of annual recurring revenue — a measure of annual revenue generated from subscriptions that renew each year.
“We are going to get breakeven by end of the year,” Waller said, adding that Pennylane runs on lower customer acquisition costs than other fintechs. “75% of our costs are R&D [research and development],” he added.
Pennylane also plans to boost hiring after the new funding round. It is looking to grow to 800 employees by the end of 2025, up from 550 currently.
‘Co-pilot’ for accountants
Like many other fintechs, Pennylane is embracing artificial intelligence. Waller said the startup is using the technology to help clients automate bookkeeping and free up time for other things like advisory services.
“Because we have a modern tech stack, we’re able to embed all kinds of AI, but also GenAI, into the product,” Waller told CNBC. “We’re really trying to build a ‘co-pilot’ for the accountant.”
He added that new electronic invoicing regulations coming into force across Europe are pushing more and more firms to consider new digital products to serve their accounting needs.
“Every business in France within a year from now will have to choose a product operator to issue and receive invoices,” Waller said, calling e-invoicing a “huge market.”
Luciana Lixandru, a partner at Sequoia who sits on the board of Pennylane, said the reforms represent a “massive market opportunity” as the accounting industry is still catching up in terms of digitization.
“The reality is the market is very fragmented,” Lixandru told CNBC via email. “In each country there are one or two decades-old incumbents, and few options that serve both SMBs and their accountants.”
In this photo illustration, the logo of TikTok is displayed on a smartphone screen on April 5, 2025 in Shanghai, China.
Vcg | Visual China Group | Getty Images
Apple will keep ByteDance-owned TikTok on its App Store for at least 75 more days after receiving assurances from Attorney General Pam Bondi, according to a report from Bloomberg News.
This comes after President Donald Trump signed an executive order Friday to extend the TikTok ban deadline for the second time. TikTok will be banned in the U.S. unless China’s ByteDance sells its U.S. operations under a national security law signed by former President Joe Biden in April 2024.
AG Bondi wrote in a letter to Apple that the company should act in accordance with Trump’s deadline extension and that it would not be penalized for hosting the platform, according to unnamed sources cited in the report.
Apple did not respond to a request for comment.
After TikTok went briefly offline for U.S. users in January following the initial ban deadline, it remained unavailable for download in the App Store until Feb. 13. Apple had reinstated TikTok to its App Store after receiving a similar letter of assurance from Bondi.
The extension comes days after Trump announced cumulative tariffs of 54% on China. Prior to the additional tariff rollout on April 2, the president said he could reduce duties on the country to help facilitate a deal for ByteDance to sell its U.S. operations of TikTok.
“Maybe I’ll give them a little reduction in tariffs or something to get it done,” Trump said during a press conference in March. “TikTok is big, but every point in tariffs is worth more than TikTok.”
Whether to buy cryptocurrency as a long-term holding may be the biggest decision an investor interested in digital assets has to make, but where to store crypto like bitcoin can become the most consequential.
Following the wildfires in California earlier this year, social media posts began to appear claiming bitcoin losses: some users showed metal plates, meant to protect seed phrases, burnt and illegible, while others described the complexity of recovering crypto keys stored in a safe deposit box at a bank affected by the fires. While it is impossible to verify individual claims about fires consuming hard drives, laptops and other storage devices containing so-called hard and cold storage crypto wallets and seed phrases, what is certain is that bitcoin self-custody presents a unique set of security issues. And those risks are growing.
Holders of crypto typically use some form of what can be called a “wallet,” and the main distinctions are whether that wallet is connected to the internet and how much control for trades and transfers is embedded directly in the wallet. There is also the underlying issue of whether a crypto investor uses a third party for custody at all, or maintains total custody and trading control over their holdings.
The standard third-party platform “hot wallet” – think of an offering from a Coinbase or Blockchain.com – is constantly connected to the internet. Cold storage and “cold wallets,” on the other hand, include hardware devices (like a USB stick) that hold private keys offline, or even just a seed phrase (a master recovery code, a collection of 12 to 24 words used to recover access to a crypto wallet) on paper or metal. Hardware wallets or offline backups of seed phrases can be used to access crypto when connected to the internet through another device.
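Why a 12-to-24-word phrase can recover an entire wallet comes down to deterministic key derivation: the words are fed through a standardized hash to produce a master seed, from which all of the wallet’s private keys are derived. As a minimal sketch (the mnemonic here is illustrative, not a real wallet), the mnemonic-to-seed step defined in the widely used BIP-39 standard looks like this:

```python
import hashlib
import unicodedata

def mnemonic_to_seed(mnemonic: str, passphrase: str = "") -> bytes:
    """Derive a 64-byte wallet seed from a mnemonic phrase (BIP-39 scheme)."""
    # BIP-39 normalizes both strings to NFKD before hashing.
    mnemonic_bytes = unicodedata.normalize("NFKD", mnemonic).encode("utf-8")
    salt = unicodedata.normalize("NFKD", "mnemonic" + passphrase).encode("utf-8")
    # 2048 rounds of PBKDF2-HMAC-SHA512, per the BIP-39 specification.
    return hashlib.pbkdf2_hmac("sha512", mnemonic_bytes, salt, 2048)

seed = mnemonic_to_seed(
    "legal winner thank year wave sausage worth useful legal winner thank yellow"
)
print(len(seed))  # 64 — every private key in the wallet is derived from this seed
```

This is why losing the written phrase is equivalent to losing the keys themselves: anyone holding those words can regenerate the same seed, and no one without them can.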
With third-party custodial options, there are steps to help owners remain vigilant against the threat posed by cybercriminals who can gain access to an internet-connected platform, including the use of two-factor authentication, and strong passwords. The U.S. Marshals Service within the Department of Justice, which is responsible for asset forfeiture from U.S. law enforcement, uses Coinbase Prime to provide custody for its seized digital assets.
Many crypto bulls prefer to self-custody digital assets like bitcoin for some of the same reasons they are interested in cryptocurrencies to begin with: lack of faith in some forms of institutional control. Custodial wallets from crypto brokers trade convenience for the risk of exchange hacks, shutdowns or fraud, as in the case of the high-profile implosion of FTX. And the wildfires are just one of a recent string of global events raising questions about the crypto custody debate; the ongoing conflict in the Middle East and the Russia-Ukraine war have also led crypto bulls overseas to rethink their approach to self-custody.
Nick Neuman, co-founder and CEO of self-custody company Casa, said physical risks in the world like a natural disaster are an opportunity to revisit how bitcoin security works, and the common security lapses folded into most peoples’ practices. “Most people secure their bitcoin with one private key. If that key is on a single device or written down on paper as a seed phrase, it’s a single point of failure. If you lose that key, your bitcoin is gone,” he said.
It should be obvious that keeping seed phrases on paper offers the lowest level of protection against fire, yet it is common practice, Neuman said. Slipping these pieces of paper into fireproof bags or safes offers some protection, but not much, and even going the extra step of stamping seed phrases onto “indestructible” metal storage plates presents a few failure points: the plates might prove not so indestructible, and they may be impossible to locate amid the rubble.
“Logically, given the location of the fires in California and the stories being shared on X, it’s highly likely bitcoin was lost,” said Neuman. “Some of them are pretty convincing,” he said.
Some self-custody services, like Casa, offer multi-signature setups that reduce the risks of single-point failure. A multi-key crypto “vault” can include mobile phone keys, multiple hardware keys, and a recovery key that a company like Casa holds on an owner’s behalf.
The multi-sig custody approach allows an owner to hold a majority of keys while a trusted partner holds a minority of keys. John Haar, managing director at Swan Bitcoin, says that in such a setup, the owner would need to lose all the physical devices and all copies of the seed phrases at the same time. As long as the owner can access at least one device or one seed phrase, they would be able to recover their bitcoin. This approach should significantly limit the potential for all of the devices to be lost in an event like a natural disaster, Haar said.
“You can spread these keys across multiple regions or even countries, and you need any three of the five keys to approve a bitcoin transaction,” Neuman said of Casa’s five-key approach.
Jordan Baltazor, chief administrative officer at Fortress Trust, a regulated crypto custodian, says best practices from other areas of personal life should apply to cryptocurrency: for one, diversifying storage approaches and weighing the risks of each. Digital assets are no different, he says, than personal and sensitive data that gets backed up to the cloud to protect it against loss or corruption.
Companies including Coinbase and Jack Dorsey’s Block offer products that try to merge some of these ideas, creating a more secure version of a crypto wallet that remains convenient to use. There is Coinbase Vault, which adds enhanced security steps before a user can access crypto holdings for trading. And there are Coinbase Wallet and Block’s Bitkey, whose mobile apps work like a traditional wallet, making it easy to move bitcoin around, but can pair with hardware wallets for the added security more commonly associated with cold storage.
Bitkey hardware requires multiple authorizations for transactions for added security, similar to “multi-sig wallets.” Bitkey also offers recovery tools so one of the biggest risks of self-custody — losing codes or phrases needed to recover a cold wallet — is less of an issue.
Solutions like Dorsey’s may help to solve the tension between convenience and security; at minimum, they underline that this tension exists and will likely be something of a roadblock to more widespread crypto adoption. Beyond the risks out there in the form of wildfires, all kinds of natural disasters, and wars, bitcoin self-custody can be vulnerable to the biggest personal risk of all: unexpected death of the bitcoin owner. There is arguably nothing more complicated than inheritance when it comes to unlocking the crypto chain of custody.
Coinbase requires probate court documents and specific will designations before releasing funds from custody, while physical wallets offer little to no support, potentially leaving all that digital value stuck on a private key. Bitkey rolled out its inheritance solution in February for what a Bitkey executive called “kind of a multibillion-dollar problem waiting to happen.”
“People who have a material investment in bitcoin absolutely need to be thinking differently about how to protect it,” Neuman said. He says that after disasters like the California wildfires, or when exchanges go bust like FTX, the industry does see more crypto holders taking action to move to more secure storage setups. “I suppose it’s human nature to wait until ‘bad things happen’ to spur action to improve your own personal situation,” he said. “But I think people would be better off if they were more proactive. Otherwise, they risk having that ‘bad thing’ happen to them, and then it’s too late,” he said.