
Meta has built custom computer chips to help with its artificial intelligence and video-processing tasks, and is talking about them in public for the first time.

The social networking giant disclosed its internal silicon chip projects for the first time to reporters earlier this week, ahead of a Thursday virtual event discussing its AI technical infrastructure investments.


Investors have been closely watching Meta’s investments in AI and related data center hardware as the company embarks on a “year of efficiency” that includes at least 21,000 layoffs and major cost cutting.

Although it’s expensive for a company to design and build its own computer chips, vice president of infrastructure Alexis Bjorlin told CNBC that Meta believes that the improved performance will justify the investment. The company has also been overhauling its data center designs to focus more on energy-efficient techniques, like liquid cooling, to reduce excess heat.

One of the new computer chips, the Meta Scalable Video Processor (MSVP), is used to process and transmit video to users while cutting down on energy requirements. Bjorlin said “there was nothing commercially available” that could handle the task of processing and delivering 4 billion videos a day as efficiently as Meta wanted.

The other processor is the first in the company’s Meta Training and Inference Accelerator (MTIA) family of chips intended to help with various AI-specific tasks. The new MTIA chip specifically handles “inference,” which is when an already-trained AI model makes a prediction or takes an action.
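
As a rough, hypothetical illustration of that distinction (not a depiction of Meta’s actual systems), the short PyTorch sketch below shows what inference looks like in code: a model whose training is already finished is frozen and simply asked to score a batch of inputs. All layer sizes, feature counts and names here are invented for the example.

```python
import torch
import torch.nn as nn

# A tiny, already-"trained" scorer standing in for a recommendation model.
model = nn.Sequential(
    nn.Linear(64, 32),  # 64 made-up input features per candidate item
    nn.ReLU(),
    nn.Linear(32, 1),   # one relevance score per item
)
model.eval()  # inference mode: disables training-only behavior like dropout

features = torch.randn(8, 64)   # a batch of 8 candidate items
with torch.no_grad():           # predicting only, so no gradients are tracked
    scores = model(features).squeeze(-1)

print(scores.argsort(descending=True))  # rank the items by predicted score
```

Training, by contrast, would compute gradients and update those weights, the workload Meta says later chips in its roadmap are intended to handle.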

Bjorlin said that the new AI inference chip helps power some of Meta’s recommendation algorithms used to show content and ads in people’s news feeds. She declined to say who is manufacturing the chip, but a blog post said that the processor is “fabricated in TSMC 7nm process,” indicating that chip giant Taiwan Semiconductor Manufacturing is producing the technology.

She said that Meta has a “multi-generational roadmap” for its family of AI chips that includes processors used for training AI models, but declined to offer details beyond the new inference chip. Reuters previously reported that Meta canceled one AI inference chip project and started another that was supposed to roll out around 2025, but Bjorlin declined to comment on that report.

Because Meta isn’t in the business of selling cloud computing services, unlike Google parent Alphabet or Microsoft, the company didn’t feel compelled to publicly talk about its internal data center chip projects, she said.

“If you look at what we’re sharing—our first two chips that we developed—it’s definitely giving a little bit of a view into what are we doing internally,” Bjorlin said. “We haven’t had to advertise this, and we don’t need to advertise this, but you know, the world is interested.”

Meta vice president of engineering Aparna Ramani said the company’s new hardware was developed to work effectively with its home-grown PyTorch software, which has become one of the most popular tools used by third-party developers to create AI apps.

The new hardware will eventually be used to power tasks related to the metaverse, such as virtual reality and augmented reality, as well as the burgeoning field of generative AI, which generally refers to AI software that can create compelling text, images, and videos.

Ramani also said that Meta has developed a generative AI-powered coding assistant for the company’s developers to help them more easily create and operate software. The new assistant is similar to Microsoft’s GitHub Copilot tool, which it released in 2021 with help from the AI startup OpenAI.

In addition, Meta said it completed the second and final phase of the buildout of its supercomputer, dubbed Research SuperCluster (RSC), which the company detailed last year. Meta used the supercomputer, which contains 16,000 Nvidia A100 GPUs, to train the company’s LLaMA language model, among other uses.

Ramani said that Meta continues to act on its belief that it should contribute to open-source technologies and AI research in order to push the field forward. The company has disclosed that its biggest LLaMA language model, LLaMA 65B, contains 65 billion parameters and was trained on 1.4 trillion tokens, the units of data used for AI training.

Companies like OpenAI and Google have not publicly disclosed similar metrics for their competing large language models, although CNBC reported this week that Google’s PaLM 2 model was trained on 3.6 trillion tokens and contains 340 billion parameters.

Unlike other tech companies, Meta released its LLaMA language model to researchers so they can learn from the technology. However, the LLaMA model was then leaked to the wider public, leading many developers to build apps incorporating the technology.

Ramani said that Meta is “still thinking through all of our open source collaborations, and certainly, I want to reiterate that our philosophy is still open science and cross collaboration.”

Watch: A.I. is a big driver of sentiment for big tech


Google hires Windsurf CEO Varun Mohan, others in latest AI talent deal


Chief executive officer of Google Sundar Pichai.


Google on Friday made the latest splash in the AI talent wars, announcing an agreement to bring in Varun Mohan, co-founder and CEO of artificial intelligence coding startup Windsurf.

As part of the deal, Google will also hire other senior Windsurf research and development employees. Google is not investing in Windsurf, but the search giant will take a nonexclusive license to certain Windsurf technology, according to a person familiar with the matter. Windsurf remains free to license its technology to others.

“We’re excited to welcome some top AI coding talent from Windsurf’s team to Google DeepMind to advance our work in agentic coding,” a Google spokesperson wrote in an email. “We’re excited to continue bringing the benefits of Gemini to software developers everywhere.”

The deal between Google and Windsurf comes after the AI coding startup had been in talks with OpenAI for a $3 billion acquisition deal, CNBC reported in April. OpenAI did not immediately respond to a request for comment.

The move ratchets up the talent war in AI, particularly among prominent companies. Meta has made lucrative job offers to several employees at OpenAI in recent weeks. Most notably, the Facebook parent added Scale AI founder Alexandr Wang to lead its AI strategy as part of a $14.3 billion investment into his startup.

Douglas Chen, another Windsurf co-founder, will be among those joining Google in the deal, Jeff Wang, the startup’s new interim CEO and its head of business for the past two years, wrote in a post on X.

“Most of Windsurf’s world-class team will continue to build the Windsurf product with the goal of maximizing its impact in the enterprise,” Wang wrote.

Windsurf has become more popular this year as an option for so-called vibe coding, the process of using new-age AI tools to write code. Developers and non-developers have embraced the concept, leading to more revenue for Windsurf and competitors such as Cursor, which OpenAI also looked at buying. All the interest has led investors to assign higher valuations to the startups.

This isn’t the first time Google has hired select people out of a startup. It did the same with Character.AI last summer. Amazon and Microsoft have also absorbed AI talent in this fashion, with the Adept and Inflection deals, respectively.

Microsoft is pushing an agent mode in its Visual Studio Code editor for vibe coding. In April, Microsoft CEO Satya Nadella said AI is composing as much as 30% of his company’s code.

The Verge reported the Google-Windsurf deal earlier on Friday.

WATCH: Google pushes “AI Mode” on homepage


Nvidia’s Jensen Huang sells more than $36 million in stock, catches Warren Buffett in net worth


Jensen Huang, CEO of Nvidia, holds a motherboard as he speaks during the Viva Technology conference dedicated to innovation and startups at Porte de Versailles exhibition center in Paris, France, on June 11, 2025.


Nvidia CEO Jensen Huang unloaded roughly $36.4 million worth of stock in the leading artificial intelligence chipmaker, according to a U.S. Securities and Exchange Commission filing.

The sale, which totals 225,000 shares, is part of a plan Huang adopted in March to unload up to 6 million shares of Nvidia through the end of the year. He sold his first batch of stock under the plan in June, worth about $15 million.

Last year, the tech executive sold about $700 million worth of shares as part of a prearranged plan. Nvidia stock climbed about 1% Friday.

Huang’s net worth has skyrocketed as investors bet on Nvidia’s AI dominance and graphics processing units powering large language models.

The 62-year-old’s wealth has grown by more than a quarter, or about $29 billion, since the start of 2025 alone, based on Bloomberg’s Billionaires Index. His net worth last stood at $143 billion in the index, putting him neck-and-neck with Berkshire Hathaway‘s Warren Buffett at $144 billion.

Shortly after the market opened Friday, Fortune‘s analysis of net worth had Huang ahead of Buffett, with the Nvidia CEO at $143.7 billion and the Oracle of Omaha at $142.1 billion.


The company has also achieved its own notable milestones this year, as it prospers off the AI boom.

On Wednesday, the Santa Clara, California-based chipmaker became the first company to top a $4 trillion market capitalization, beating out both Microsoft and Apple. The chipmaker closed above that milestone Thursday as CNBC reported that the technology titan met with President Donald Trump.

Brooke Seawell, venture partner at New Enterprise Associates, sold about $24 million worth of Nvidia shares, according to an SEC filing. Seawell has been on Nvidia’s board since 1997, according to the company.

Huang still holds more than 858 million shares of Nvidia, both directly and indirectly, in different partnerships and trusts.

WATCH: Nvidia hits $4 trillion in market cap milestone despite curbs on chip exports


Tesla to officially launch in India with planned showroom opening


Elon Musk meets with Indian Prime Minister Narendra Modi at Blair House in Washington DC, USA on February 13, 2025.


Tesla will open a showroom in Mumbai, India, next week, marking the U.S. electric carmaker’s first official foray into the country.

The one-and-a-half-hour launch event for the Tesla “Experience Center” will take place on July 15 at the Maker Maxity Mall in the Bandra Kurla Complex in Mumbai, according to an event invitation seen by CNBC.

Along with the showroom display, which will feature the company’s cars, Tesla is also likely to officially launch direct sales to Indian customers.

The automaker has had its eye on India for a while and now appears to have stepped up efforts to launch locally.

In April, Tesla boss Elon Musk spoke with Indian Prime Minister Narendra Modi to discuss collaboration in areas including technology and innovation. That same month, the EV-maker’s finance chief said the company has been “very careful” in trying to figure out when to enter the market.

Tesla has no manufacturing operations in India, even though the country’s government is likely keen for the company to establish a factory. Instead, the cars sold in India will need to be imported from Tesla’s other manufacturing locations in places like Shanghai, China, and Berlin, Germany.

As Tesla begins sales in India, it will come up against challenges from long-time Chinese rival BYD, as well as local player Tata Motors.

One potential challenge for Tesla comes by way of India’s import duties on electric vehicles, which stand at around 70%. India has tried to entice investment in the country by offering companies a reduced duty of 15% if they commit to invest $500 million and set up manufacturing locally.

HD Kumaraswamy, India’s minister for heavy industries, told reporters in June that Tesla is “not interested” in manufacturing in the country, according to a Reuters report.

Tesla is looking to fill a number of roles in Mumbai, according to job listings posted on LinkedIn. These include advisors working in showrooms, security, vehicle operators to collect data for its Autopilot feature and service technicians.

There are also roles being advertised in the Indian capital of New Delhi, including for store managers. It’s unclear if Tesla is planning to launch a showroom in the city.
