BARCELONA, SPAIN – MARCH 01: A view of the MasterCard company logo on their stand during the Mobile World Congress on March 1, 2017 in Barcelona, Spain. (Photo by Joan Cros Garcia/Corbis via Getty Images)
Mastercard said Tuesday that it’s agreed to acquire Minna Technologies, a software firm that makes it easier for consumers to manage their subscriptions.
The move comes as Mastercard and its primary payment network rival Visa are rapidly attempting to expand beyond their core credit and debit card businesses into technology services, such as cybersecurity, fraud prevention, and pay-by-bank payments.
Mastercard declined to disclose financial details of the transaction, which is currently subject to regulatory review.
The payments giant said that the deal, along with other initiatives it’s committed to around subscriptions, will allow it to give consumers a way to access all their subscriptions in a single view, whether inside their banking app or in a central “hub.”
Minna Technologies, which is based in Gothenburg, Sweden, develops technology that helps consumers manage subscriptions within their banking apps and websites, regardless of which payment method they used for their subscriptions.
The company said it works with some of the world’s largest financial institutions. It already counts Mastercard, as well as Mastercard’s rival Visa, among its key partners.
“These teams and technologies will add to the broader set of tools that help manage the merchant-consumer relationship and minimize any disruption in their experience,” Mastercard said in a blog post Tuesday.
Consumers today often have a slew of subscriptions to manage across services such as Netflix, Amazon and Disney Plus. Holding multiple subscriptions can make them difficult to cancel, as consumers can end up losing track of which ones they’re paying for and when.
Mastercard noted that this can have a negative impact on merchants because consumers who aren’t able to easily cancel their subscriptions end up calling on their banks to request a block on payments being taken.
According to Juniper Research data, there are 6.8 billion subscriptions globally, a number that’s expected to jump to 9.3 billion by 2028.
Financial services incumbents such as Mastercard have been rapidly growing their product suites to remain competitive with emerging fintech players that offer more convenient, digitally native ways to meet consumers’ money management needs.
In 2020, Mastercard acquired Finicity, a U.S. fintech firm that enables third parties — such as fintechs or other banks — to gain access to consumers’ banking information and make payments on their behalf.
Earlier this year, the company announced that by 2030, it would tokenize all cards issued on its network in Europe — in other words, as a consumer, you wouldn’t need to enter your card details manually anymore and would only have to use your thumbprint to authenticate your identity when you pay.
Visa, meanwhile, is also trying to remain competitive with fintech challengers. Last month, the company launched a new service called Visa A2A, which makes it easier for consumers to set up and manage direct debits — payments which are taken directly from your bank account rather than by card.
Palo Alto Networks CEO Nikesh Arora attends the 9th edition of the VivaTech trade show at the Parc des Expositions de la Porte de Versailles on June 11, 2025, in Paris.
Earnings per share: 93 cents adjusted vs. 89 cents expected
Revenue: $2.47 billion vs. $2.46 billion expected
Revenues grew 16% from $2.1 billion a year ago. Net income fell to $334 million, or 47 cents per share, from $351 million, or 49 cents per share in the year-ago period.
Palo Alto’s Chronosphere deal is slated to close in the second half of its fiscal 2026. The cybersecurity provider is also in the process of buying Israeli identity security firm CyberArk for $25 billion as part of CEO Nikesh Arora’s acquisition spree.
He told investors on an earnings call that Palo Alto is pursuing the acquisitions simultaneously to address the fast-moving AI cycle.
“This large surge towards building AI compute is causing a lot of the AI players to think about newer models for software stacks and infrastructure stacks in the future,” he said.
Palo Alto guided for revenues between $2.57 billion and $2.59 billion in the second quarter, the midpoint of which was in line with a $2.58 billion estimate. For the full year, the company expects $10.50 billion to $10.54 billion, versus a $10.51 billion estimate.
Capital expenditures during the period came in at $84 million, well above the $58.1 million StreetAccount estimate. Remaining performance obligations, a measure of backlog, grew to $15.5 billion and topped the $15.43 billion estimate.
The rise of artificial intelligence has also stirred up increasingly sophisticated cyberattacks, while contributing to new defensive tools for customers. The Santa Clara, California-based company has infused AI into its products and in October launched automated AI agents to help fend off attacks.
Tesla CEO Elon Musk (L) talks with Nvidia CEO Jensen Huang during the U.S.-Saudi Investment Forum at the Kennedy Center on Nov. 19, 2025 in Washington, DC.
Win McNamee | Getty Images
Nvidia and xAI said on Wednesday that a large data center facility being built in Saudi Arabia and equipped with hundreds of thousands of Nvidia chips will count Elon Musk’s artificial intelligence startup as its first customer.
Musk and Nvidia CEO Jensen Huang were both in attendance at the U.S.-Saudi Investment Forum in Washington, D.C.
The announcement builds on a partnership from May, when Nvidia said it would provide Saudi Arabia’s Humain with chips that use 500 megawatts of power. On Wednesday, Humain said the project would include about 600,000 Nvidia graphics processing units.
Humain was launched earlier this year and is owned by the Saudi Public Investment Fund. The plan to build the data center was initially announced when Huang visited Saudi Arabia alongside President Donald Trump.
“Could you imagine, a startup company, approximately zero billion dollars in revenues, now going to build a data center for Elon,” Huang said.
The facility is one of the most prominent examples of what Nvidia calls “sovereign AI.” The chipmaker has said that nations will increasingly need to build data centers for AI in order to protect national security and their culture. It’s also a potentially massive market for Nvidia’s pricey AI chips beyond a handful of hyperscalers.
Huang’s appearance at an event supported by President Trump is another sign of the administration’s focus on AI. Huang has become friendly with the president as Nvidia lobbies to gain licenses to ship future AI chips to China.
When announcing the agreement, Musk, who was a major figure in the early days of the second Trump administration, briefly mixed up the size of the data center, which is measured in megawatts, a unit of power. He joked that plans for a data center that would be 1,000 times larger would have to wait.
“That will be eight bazillion, trillion dollars,” Musk joked.
AMD will provide its Instinct MI450 GPUs for AI, chips that may require as much as 1 gigawatt of power by 2030. Cisco will provide additional infrastructure for the data center, AMD said.
Qualcomm will sell Humain its new data center chips that were first revealed in October, called the AI200 and AI250. Humain will deploy 200 megawatts of Qualcomm chips, the company said.
Yann LeCun, known as one of the godfathers of modern artificial intelligence and one of the first AI visionaries to join the company then known as Facebook, is leaving Meta.
LeCun said in a LinkedIn post on Wednesday that he plans to create a startup specializing in a kind of AI technology that researchers have described as world models, which analyze information beyond web data in order to better represent the physical world and its properties.
“I am creating a startup company to continue the Advanced Machine Intelligence research program (AMI) I have been pursuing over the last several years with colleagues at FAIR, at NYU, and beyond,” LeCun wrote. “The goal of the startup is to bring about the next big revolution in AI: systems that understand the physical world, have persistent memory, can reason, and can plan complex action sequences.”
Meta will partner with LeCun’s startup.
The departure comes at a time of disarray within Meta’s AI unit, which was dramatically overhauled this year after the company released the fourth version of its Llama open-source large language model to a disappointing response from developers. That spurred CEO Mark Zuckerberg to spend billions of dollars recruiting top AI talent, including a $14.5 billion investment in Scale AI in June to lure the startup’s 28-year-old CEO Alexandr Wang, now Meta’s chief AI officer.
LeCun, 65, joined Facebook in 2013 to be director of the FAIR AI research division while maintaining a part-time professorial position at New York University. He said in the LinkedIn post that the “creation of FAIR is my proudest non-technical accomplishment.”
“I am extremely grateful to Mark Zuckerberg, Andrew Bosworth, Chris Cox, and Mike Schroepfer for their support of FAIR, and for their support of the AMI program over the last few years,” LeCun said. “Because of their continued interest and support, Meta will be a partner of the new company.”
At the time, Facebook and Google were heavily recruiting high-level academics like LeCun to spearhead their efforts to produce cutting-edge computer science research that could potentially benefit their core businesses and products.
LeCun, along with other AI luminaries like Yoshua Bengio and Geoffrey Hinton, centered his academic research on a kind of AI technique known as deep learning, which involves training enormous software systems called neural networks so they can discover patterns within reams of data. The researchers helped popularize the deep learning approach, and in 2019 they won the prestigious Turing Award, presented by the Association for Computing Machinery.
Since then, LeCun’s approach to AI development has drifted from the direction taken by Meta and the rest of Silicon Valley.
Meta and other tech companies like OpenAI have spent billions of dollars developing so-called foundation models, particularly LLMs, as part of their efforts to advance state-of-the-art computing. However, LeCun and other deep-learning experts have said that these current AI models, while powerful, have a limited understanding of the world, and that new computing architectures are needed for researchers to create software that is on par with or surpasses humans on certain tasks, a notion known as artificial general intelligence.
“As I envision it, AMI will have far-ranging applications in many sectors of the economy, some of which overlap with Meta’s commercial interests, but many of which do not,” LeCun said in the post. “Pursuing the goal of AMI in an independent entity is a way to maximize its broad impact.”
Besides Wang, other notable recent hires Zuckerberg brought in to revamp Meta’s AI unit include former GitHub CEO Nat Friedman, who heads the unit’s product team, and ChatGPT co-creator Shengjia Zhao, the group’s chief scientist.
In October, Meta laid off 600 employees from its Superintelligence Labs division, including some who were part of the FAIR unit that LeCun helped get off the ground. Those layoffs and other cuts to FAIR over the years, coupled with a new AI leadership team, played a major role in LeCun’s decision to leave, according to people familiar with the matter who asked not to be named because they weren’t authorized to speak publicly.
Additionally, LeCun rarely interacted with Wang or the TBD Labs unit, which comprises many of the headline-grabbing hires Zuckerberg made over the summer. TBD Labs oversees the development of Meta’s Llama AI models, which were originally developed within FAIR, the people said.
While LeCun was always a champion of sharing AI research and related technologies with the open-source community, Wang and his team favor a more closed approach amid intense competition from rivals like OpenAI and Google, the people said.