Voters cast ballots on election day at the Fairfax County Government Center polling location in Fairfax, Virginia, on November 2, 2021.
Andrew Caballero-Reynolds | AFP | Getty Images
Social media platforms including Meta’s Facebook and Instagram, Twitter, TikTok and Google’s YouTube are readying themselves for another heated Election Day this week.
The companies now regularly come under close scrutiny around election time, something that accelerated following findings that Russian agents used social media to sow division in the run-up to the 2016 election. During the last presidential election in 2020, the platforms faced the challenge of moderating election denialism as an outgoing president stoked the false claims himself, leading several of them to at least temporarily suspend him after the Jan. 6 insurrection.
This year, the platforms are using all of those experiences to prepare for threats to democracy and safety as voters decide who will represent them in Congress, governor’s offices and state legislatures.
Here’s how all the major platforms are planning to police their services on Election Day.
Meta
Onur Dogman | Lightrocket | Getty Images
Meta’s Facebook has been one of the most scrutinized platforms when it comes to misinformation. In response to years of criticism, it has bolstered its approach to election integrity. It has said it will use many of the same policies and safeguards this year that it used in 2020.
Meta has stood up its Elections Operations Center, which it likened to a command center, to bring together different teams throughout the company to monitor and quickly address threats they see on the platform. It’s used this model dozens of times worldwide since 2018.
Facebook and Instagram also share reliable information with users about how to vote (including in languages other than English). The company said it’s already sent more than 80 million election notifications this year on the two platforms.
The company uses third-party fact-checkers to help label false posts so they can be demoted in the algorithm before they go viral. Meta said it’s investing an additional $5 million in fact-checking and media literacy efforts before Election Day.
Meta said it’s prepared to seek out threats and coordinated harassment against election officials and poll workers, who were the subject of misinformation campaigns and threats during the last election.
The company is once again banning new political ads in the week before the election, as it did in 2020. While ads submitted before the blackout period can still run, political advertisers have expressed frustration about the policy since it’s often helpful to respond to last-minute attacks and polling with fresh messaging. Facebook already has extra screening for those who sign up as political advertisers and maintains information about political ads in a database available to the public.
Meta has pledged to remove posts that seek to suppress voting, like misinformation about how and when to vote. It also said it would reject ads that discourage voting or question the legitimacy of the upcoming election.
In a study by New York University’s Cybersecurity for Democracy and international NGO Global Witness testing election integrity ad screens across social media platforms, the groups found Facebook was mostly successful in blocking ads they submitted with election disinformation. Still, 20% to 50% of the ads tested were approved, depending on what language they were in and whether they were submitted from inside or outside the U.S.
The researchers also violated Facebook’s rules about who is allowed to place ads, submitting some from a test account based in the U.K., and they skipped Facebook’s authorization process, which is supposed to apply extra scrutiny to political advertisers.
The researchers did not run the ads once they were approved, so it’s not clear whether Facebook would have blocked them during that step.
A Meta spokesperson said in a statement published with the study that it was “based on a very small sample of ads, and are not representative given the number of political ads we review daily across the world.”
“We invest significant resources to protect elections, from our industry-leading transparency efforts to our enforcement of strict protocols on ads about social issues, elections, or politics – and we will continue to do so,” a Meta spokesperson said in a statement to CNBC. The New York Times first reported the statement.
TikTok
TikTok owner ByteDance has launched a women’s fashion website called If Yooou. Pinduoduo launched an e-commerce site in the U.S. called Temu. The two companies are the latest Chinese tech giants to look to crack the international e-commerce market dominated by Amazon.
Mike Kemp | In Pictures | Getty Images
TikTok has become an increasingly important platform for all sorts of discussion, but it’s tried to keep its service at arm’s length from the most heated political discussions.
TikTok does not allow political ads and has stated its desire for the service to be “a fun, positive and joyful experience.”
“TikTok is first and foremost an entertainment platform,” the company said in a September blog post. It added that it wants to “foster and promote a positive environment that brings people together, not divide them.”
Still, the NYU and Global Witness study found TikTok performed the worst out of the platforms it tested in blocking election-related misinformation in ads. Only one ad it submitted in both English and Spanish falsely claiming Covid vaccines were required to vote was rejected, while ads promoting the wrong date for the election or encouraging voters to vote twice were approved.
TikTok did not provide a comment on the report but told the researchers in a statement that it values “feedback from NGOs, academics, and other experts which helps us continually strengthen our processes and policies.”
The service said that while it doesn’t “proactively encourage politicians or political parties to join TikTok,” it welcomes them to do so. The company announced in September that it would try out mandatory verification for government, politician and political party accounts in the U.S. through the midterms and disable those types of accounts from running ads.
TikTok said it would allow those accounts to run ads in limited circumstances, like public health and safety campaigns, but that they’d have to work with a TikTok representative to do so.
TikTok also barred these accounts from other ways to make money on the platform, like through tipping and e-commerce. Politician and political party accounts are also not allowed to solicit campaign donations on their pages.
TikTok has said it’s committed to stemming the spread of misinformation, including by working with experts to strengthen its policies and outside fact-checkers to verify election-related posts.
It’s also sought to build on its experiences from the last election, like by surfacing its election center with information about how to vote earlier in the cycle. It’s also tried to do more to educate creators on the platform about what kinds of paid partnerships are and are not allowed and how to disclose them.
Twitter
A video grab taken from a video posted on the Twitter account of billionaire Tesla chief Elon Musk on October 26, 2022 shows himself carrying a sink as he enters the Twitter headquarters in San Francisco. Elon Musk changed his Twitter profile to “Chief Twit” and posted video of himself walking into the social network’s California headquarters carrying a sink, days before his contentious takeover of the company must be finalized.
– | AFP | Getty Images
Twitter is in a unique position this Election Day, after billionaire Elon Musk bought the platform and took it private less than two weeks before voters headed to the polls.
Musk has expressed a desire to loosen Twitter’s content moderation policies. He has said decisions on whether to reinstate banned users, a group that includes former President Donald Trump, would take at least a few weeks.
But shortly after the deal, Bloomberg reported the team responsible for content moderation lost access to some of their tools. Twitter’s head of safety and integrity, Yoel Roth, characterized that move as a normal measure for a recently acquired company to take and said Twitter’s rules were still being enforced at scale.
But the timing shortly before the election is particularly stark. Musk said teams would have access to all the necessary tools by the end of the week before the election, according to a civil society group leader who was on a call with Musk earlier in the week.
Before Musk’s takeover, Twitter laid out its election integrity plans in an August blog post. Those included activating its civic integrity policy, which allows it to label and demote misleading information about the election, sharing “prebunks,” or proactively debunked false claims about the election, and surfacing relevant news and voting information in a dedicated tab. Twitter has not allowed political ads since 2019.
Google/YouTube
People walk past a billboard advertisement for YouTube on September 27, 2019 in Berlin, Germany.
Sean Gallup | Getty Images
Google and its video platform YouTube are also important venues, alongside Facebook, where advertisers seek to get their campaign messages out.
The platforms require advertisers running election messages to become verified and disclose who paid for the ad. Political ads, including information on how much money was spent on them and how often they were viewed, are included in the company’s public transparency report.
Prior to the last election, Google restricted how narrowly political ads can be targeted, limiting advertisers to certain general demographic categories.
The NYU and Global Witness study found YouTube performed the best out of the platforms it tested in blocking ads with election misinformation. The site ultimately blocked all the misinformation-packed ads the researchers submitted through an account that hadn’t gone through its advertiser verification process. The platform also blocked the YouTube channel hosting the ads, though a Google Ads account remained active.
Like other platforms, Google and YouTube highlight authoritative sources and election information high up in related searches. The company said it would remove content that violates its policies by misleading viewers about the voting process or encouraging interference in the democratic process.
YouTube also has sought to help users learn how to spot manipulative messages on their own using education content.
Google said it’s helped train campaign and election officials on security practices.
Clarification: This story has been updated to clarify that Meta’s statement on the study was first reported by The New York Times.
In this photo illustration, the Bluesky Social logo is displayed on a cell phone in Rio de Janeiro, Brazil, on September 4, 2024.
Mauro Pimentel | AFP | Getty Images
Micro-blogging startup Bluesky has gained over 1.25 million new users in the past week, indicating some social media users are changing their habits following the U.S. presidential election.
Bluesky’s influx of users shows that the app has been able to pitch itself as an alternative to X, formerly Twitter, which is owned by Elon Musk, as well as Meta’s Threads. The bulk of the new users are coming from the U.S., Canada and the United Kingdom, the company said Wednesday.
“We’re excited to welcome everyone looking for a better social media experience,” Bluesky CEO Jay Graber told CNBC in a statement.
Despite the surge of users, Bluesky’s total base remains a fraction of its rivals’. The Seattle startup claims 15.2 million total users. Meta CEO Mark Zuckerberg in October said Threads had nearly 275 million monthly users. Musk in May claimed that X had 600 million monthly users, but market intelligence firm Sensor Tower pegged X’s monthly base at 318 million users in October.
Created in 2019 as a project inside Twitter, when Jack Dorsey was still CEO, Bluesky doesn’t show ads and has yet to develop a business model. It became an independent company in 2021. Dorsey said in May of this year that he’s no longer a member of Bluesky’s board.
“Journalists, politicians, and news junkies have also been talking up Bluesky as a better X alternative than Threads,” wrote Similarweb, the internet traffic and monitoring service, in a Tuesday blog.
Some users with new Bluesky accounts posted that they had moved to the service due to Musk and his support for President-elect Donald Trump.
“It’s appalling that Elon Musk has transformed Twitter into a Trump propaganda machine, rife with disinformation and misinformation,” one user posted on Bluesky.
This is Bluesky’s second notable surge in the last couple of months.
Bluesky said it picked up 2 million new users in September after the Brazilian Supreme Court suspended X in the country for failing to comply with regional content moderation policies and not appointing a local representative.
Cisco CEO Chuck Robbins speaks at The Wall Street Journal’s Future of Everything Festival in New York on May 21, 2024.
Dia Dipasupil | Getty Images
Cisco reported a fourth straight quarter of declining revenue even as results topped analysts’ estimates. The stock slipped 2.5% in extended trading.
Here’s how the company did in comparison with LSEG consensus:
Earnings per share: 91 cents adjusted vs. 87 cents expected
Revenue: $13.84 billion vs. $13.77 billion expected
Cisco’s revenue dropped 6% in the quarter ended Oct. 26, from $14.7 billion a year earlier, according to a statement. Net income fell to $2.71 billion, or 68 cents per share, from $3.64 billion, or 89 cents per share, in the same quarter a year ago.
Networking revenue plunged 23% to $6.75 billion, slightly below the $6.8 billion consensus of analysts surveyed by StreetAccount.
Security revenue doubled to $2.02 billion, topping the StreetAccount consensus of $1.93 billion. Cisco’s revenue from collaboration was $1.09 billion, a bit above the $1.04 billion consensus estimate.
Cisco CEO Chuck Robbins said on the earnings call on Wednesday that orders from large-scale clients for artificial intelligence infrastructure exceeded $300 million in the quarter. Server makers such as Dell and HPE have also focused on sales of hardware that can help clients implement generative AI.
“We have earned more design wins and remain confident that we will exceed our target of $1 billion of AI orders this fiscal year from web-scale customers,” Robbins said.
Cisco has announced hardware containing Nvidia’s graphics processing units, which are widely used for training AI models, Robbins said.
“Over time, you’ll see us support other GPUs as the market demands,” he said. “But that partnership is still going fine. It’s still early. And I think 2025 is when we’ll start to see enterprise real deployment of some of these technologies.”
For now, enterprises are updating data center infrastructure to prepare for AI and the widespread deployment of AI applications, Robbins said.
U.S. government agencies have delayed deals with Cisco, rather than scrapping them altogether. The Fiscal Responsibility Act of 2023, which became law in June of last year, has limited U.S. government spending, said Scott Herren, Cisco’s finance chief.
Herren said that with Republicans poised to control the White House and both houses of Congress, he expects “to get a budget in place relatively soon.”
During the quarter, Cisco acquired security startups DeepFactor and Robust Intelligence.
Cisco lifted its full-year guidance to $3.60 to $3.66 in adjusted earnings per share on $55.3 billion to $56.3 billion in revenue, up from a prior forecast of $3.52 to $3.58 in EPS and $55 billion to $56.2 billion in revenue. The guidance implies projected revenue growth of 3.3% at the midpoint of the range.
Analysts expected adjusted earnings for the year of $3.58 per share on $55.89 billion in revenue.
As of Wednesday’s close, Cisco’s stock was up 17% year to date, while the S&P 500 index is up around 26% over that stretch.
Republican presidential nominee, former U.S. President Donald Trump, (C) greets attendees during a campaign stop to address Pennsylvanians who are concerned about the threat of Communist China to U.S. agriculture at the Smith Family Farm September 23, 2024 in Smithton, Pennsylvania.
Win McNamee | Getty Images
After Donald Trump won the U.S. presidency last week, tech CEOs including Apple‘s Tim Cook, Meta‘s Mark Zuckerberg and Amazon‘s Jeff Bezos publicly praised the president-elect.
One name was conspicuously missing: TikTok CEO Shou Zi Chew.
His absence was notable considering that of all the top tech companies, TikTok faces the most immediate and existential threat from the U.S. government. In April, President Joe Biden signed a law that requires China’s ByteDance to sell TikTok by Jan. 19. If ByteDance fails to comply, internet hosting companies and app store owners such as Apple and Google will be prohibited from supporting TikTok, effectively banning it in the U.S.
Trump’s return to the White House, though, may provide a lifeline for Chew and TikTok.
Although both Republicans and Democrats supported the Biden TikTok ban in April, Trump voiced opposition to the ban during his candidacy. Trump acknowledged the national security and data privacy concerns with TikTok in a March interview with CNBC’s “Squawk Box,” but he also said “there’s a lot of good and there’s a lot of bad” with the app.
Trump also leveraged TikTok’s shaky future in the U.S. as a reason for people to vote against Democratic Vice President Kamala Harris.
“We’re not doing anything with TikTok, but the other side is going to close it up, so if you like TikTok, go out and vote for Trump,” the president-elect said in a September post on his Truth Social service.
Since his election, Trump hasn’t publicly discussed his plans for TikTok, but Trump-Vance transition spokeswoman Karoline Leavitt told CNBC that the president-elect “will deliver.”
“The American people re-elected President Trump by a resounding margin giving him a mandate to implement the promises he made on the campaign trail,” Leavitt said in a statement.
Trump’s rhetoric on TikTok began to turn after the president-elect met in February with billionaire Jeff Yass, a Republican megadonor and a major investor in the Chinese-owned social media app.
Yass’s trading firm Susquehanna International Group owns a 15% stake in ByteDance while Yass maintains a 7% stake in the company, equating to about $21 billion, NBC and CNBC reported in March. That month it was also reported that Yass was a part owner of the business that merged with the parent company of Trump’s Truth Social.
TikTok’s CEO Shou Zi Chew testifies during the Senate Judiciary Committee hearing on online child sexual exploitation, at the U.S. Capitol, in Washington, U.S., January 31, 2024.
Nathan Howard | Reuters
If ByteDance doesn’t sell TikTok by the January deadline, Trump could potentially call on Congress to repeal the law or he can introduce a more “selective enforcement” of the law that would essentially allow TikTok to continue operating in the U.S. without facing penalties, said Sarah Kreps, a Cornell University professor of government. “Selective enforcement” would be akin to police officers not always enforcing every single instance of jaywalking, she said.
At TikTok, meanwhile, Chew has remained quiet since Trump’s victory, just as he had been in the lead-up to Election Day.
The Chinese-owned company may be taking a neutral approach and a wait-and-see strategy for now, said Long Le, a China business expert and Santa Clara University associate teaching professor.
Le said it’s hard to foresee what Trump will do.
“He’s also a contrarian; that’s what makes him unpredictable,” Le said. “He can say one thing, and the next year he’ll change his mind.”
TikTok didn’t respond to requests for comment.
Mark Zuckerberg, CEO of Meta testifies before the Senate Judiciary Committee at the Dirksen Senate Office Building on January 31, 2024 in Washington, DC.
Alex Wong | Getty Images
‘Facebook has been very bad for our country’
When it comes to social media apps, Trump’s campaign comments suggest he’s more concerned with TikTok rival Meta.
In his March interview with “Squawk Box,” Trump said Meta, which owns Facebook and Instagram, posed a much bigger problem than TikTok. He also said a TikTok ban would only benefit Meta, which he labeled “an enemy of the people.”
“Facebook has been very bad for our country, especially when it comes to elections,” Trump said.
But Trump’s negative views on Meta may have changed after comments by CEO Mark Zuckerberg over the past few months, Cornell’s Kreps said.
Zuckerberg described the photo of Trump raising his fist following a failed assassination attempt in July as “one of the most badass things I’ve ever seen in my life.” And after Trump’s win, Zuckerberg congratulated him, saying he was looking forward to working with the president-elect and his administration.
“My sense as an armchair psychologist of Trump is that he really likes people who sing his praises, and so his view on Zuckerberg and Meta, I would imagine, has changed,” Kreps said. “He might then just revert to his American economic nationalism here and say, ‘Let’s protect American industry and continue with the Chinese ban.'”
Meta didn’t respond to a request for comment.
Maintaining support of the TikTok ban could also win Trump political favor with lawmakers concerned about China’s global political and business influence, said Milton Mueller, a professor at Georgia Tech’s School of Public Policy.
“I don’t see him scoring big points politically by standing up for TikTok,” Mueller said, noting that only a few lawmakers, such as Sen. Rand Paul, R-Ky., have opposed the ban.
Even if Trump does provide a lifeline for TikTok, it’s unclear how much damage that would do to his administration, since many politicians are reluctant to publicly criticize him, Le said.
“They’re not going to challenge him because he just got so much power,” Le said.
Since launching his TikTok account in June, Trump has amassed over 14 million followers. Given his social media savvy, Trump may not want to make a decision that results in him losing the public attention and influence he’s gained on TikTok, Le said.