The once high-flying tech sector has endured a heavy selloff this year amid concerns that the sector’s growth could be curtailed by rising interest rates. The tech-heavy Nasdaq Composite is down more than 14%.

Chris Hondros | Newsmakers | Getty Images

A lot has changed in technology since the dot-com boom and bust.

The internet went mobile. The data center went to the cloud. Cars are now driving themselves. Chatbots have gotten pretty smart.


But one thing has remained. When the economy turns, investors rush for the exits. Despite a furious rally on Thursday, the tech-laden Nasdaq finished in the red for a fourth straight quarter, marking the longest such streak since the dot-bomb period of 2000 to 2001. The only other negative four-quarter stretch in the Nasdaq’s five-decade history was in 1983-84, when the video game market crashed.

This year marks the first time the Nasdaq has fallen in all four quarters of a calendar year. It dropped 9.1% in the first three months of the year, followed by a second-quarter plunge of 22% and a third-quarter decline of 4.1%. It fell 1% in the fourth quarter because of an 8.7% drop in December.

For the full year, the Nasdaq slid 33%, its steepest decline since 2008 and the third-worst year on record. The drop 14 years ago came during the financial meltdown caused by the housing crisis.

“It’s really hard to be positive on tech right now,” Gene Munster, managing partner of Loup Ventures, told CNBC’s Brian Sullivan on Wednesday. “You feel like you’re missing something. You feel like you’re not getting the joke.”


Other than 2008, the only other year worse for the Nasdaq was 2000, when the dot-com bubble burst and the index sank 39%. Early dreams of the internet taking over the world were vaporized. Pets.com, infamous for the sock puppet, went public in February of that year and shut down nine months later. EToys, which held its IPO in 1999 and saw its market cap grow to almost $8 billion, sank in 2000, losing almost all its value before going bankrupt early the next year. Delivery company Kozmo.com never got its IPO off the ground, filing in March 2000 and withdrawing its offering in August.

Amazon had its worst year ever in 2000, dropping 80%. Cisco fell 29% and then another 53% the next year. Microsoft plummeted by more than 60% and Apple by over 70%.

The parallels to today are quite stark.

In 2022, the company formerly known as Facebook lost roughly two-thirds of its value as investors balked at a future in the metaverse. Tesla fell by a similar amount, as the carmaker long valued like a tech company crashed into reality. Amazon dropped by half.

The IPO market this year was non-existent, but many of the companies that went public last year at astronomical valuations lost 80% or more of their value.

Perhaps the closest analogy to 2000 was the crypto market this year. Digital currencies Bitcoin and ether plunged by more than 60%. Over $2 trillion in value was wiped out as speculators fled crypto. Numerous companies went bankrupt, most notably crypto exchange FTX, which collapsed after reaching a $32 billion valuation earlier in the year. Founder Sam Bankman-Fried now faces criminal fraud charges.

The only major crypto company traded on the Nasdaq is Coinbase, which went public last year. In 2022, its shares fell 86%, eliminating more than $45 billion in market cap. In total, Nasdaq companies have shed close to $9 trillion in value this year, according to FactSet.

At its peak in 2000, Nasdaq companies were worth about $6.6 trillion in total, and proceeded to lose about $5 trillion of that by the time the market bottomed in October 2002.

Don’t fight the Fed

Despite the similarities, things are different today.

For the most part, the collapse of 2022 was less about businesses vanishing overnight and more about investors and executives waking up to reality.

Companies are downsizing and getting revalued after a decade of growth fueled by cheap money. With the Fed raising rates to try to get inflation under control, investors have stopped putting a premium on rapid, unprofitable growth and started demanding cash generation.

“If you’re looking solely at future cash flows without profitability, those are the companies that did really well in 2020, and those are not as defensible today,” Shannon Saccocia, chief investment officer of SVB Private, told CNBC’s “Closing Bell: Overtime” on Tuesday. “The tech is dead narrative is probably in place for the next couple of quarters,” Saccocia said, adding that some parts of the sector “will have light at the end of this tunnel.”


The tunnel she’s describing is the continuing rate increases by the Fed, which may only end if the economy enters a recession. Either scenario is troubling for much of technology, which tends to thrive when the economy is in growth mode.

In mid-December, the Fed raised its benchmark interest rate to the highest in 15 years, lifting it to a target range of 4.25% to 4.5%. The rate was anchored near zero through the pandemic as well as in the years that followed the financial crisis.

Tech investor Chamath Palihapitiya told CNBC in late October that more than a decade of zero interest rates “perverted the market” and “allowed manias and asset bubbles to build in every single part of the economy.”

Palihapitiya took as much advantage as anyone of the cheap money available, pioneering investments in special purpose acquisition companies (SPACs), blank-check entities that hunt for companies to take public through a reverse merger.

With no yield available in fixed income and with tech attracting stratospheric valuations, SPACs took off, raising more than $160 billion on U.S. exchanges in 2021, nearly double the prior year, according to data from SPAC Research. That number sank to $13.4 billion this year. CNBC’s Post-SPAC index, made up of the largest companies that have debuted via SPACs in the last two years, lost two-thirds of its value in 2022.

SPACs slumped in 2022

CNBC

‘Bargain basement’ shopping

The IPO market is as bad as it was in 2001, and quick improvement is unlikely, says Bullpen's Davidson


Nvidia refutes report that China’s DeepSeek is using its banned Blackwell AI chips


Jensen Huang, chief executive officer of Nvidia Corp., outside the US Capitol in Washington, DC, US, on Wednesday, Dec. 3, 2025.

Bloomberg | Bloomberg | Getty Images

Nvidia on Wednesday refuted a report that the Chinese artificial intelligence startup DeepSeek has been using smuggled Blackwell chips to develop its upcoming model.

The U.S. has banned the export of Nvidia’s Blackwell chips, which are considered the company’s most advanced offerings, to China in an effort to stay ahead in the AI race.

DeepSeek is using chips that were smuggled into the country without authorization, according to The Information.

“We haven’t seen any substantiation or received tips of ‘phantom datacenters’ constructed to deceive us and our OEM partners, then deconstructed, smuggled, and reconstructed somewhere else,” an Nvidia spokesperson said in a statement. “While such smuggling seems farfetched, we pursue any tip we receive.”


Nvidia has been one of the biggest winners of the AI boom so far because it develops the graphics processing units (GPUs) that are key for training models and running large workloads.

Since the hardware is so crucial for advancing AI technology, Nvidia’s relationship with China has become a political flashpoint among U.S. lawmakers.

President Donald Trump on Monday said Nvidia can ship its H200 chips to “approved customers” in China and elsewhere on the condition that the U.S. will get 25% of those sales.

The announcement was met with pushback from some Republicans.

DeepSeek spooked the U.S. tech sector in January when it released a reasoning model, called R1, that rocketed to the top of app stores and industry leaderboards. R1 was also created at a fraction of the cost of other models in the U.S., according to some analyst estimates.

In August, DeepSeek hinted that China will soon have its own “next generation” chips to support its AI models.

WATCH: Nvidia selling H200 AI chips to China is net positive, says Patrick Moorhead


– CNBC’s Kristina Partsinevelos contributed to this report.


‘Greetings, earthlings’: Nvidia-backed Starcloud trains first AI model in space as orbital data center race heats up


The Starcloud-1 satellite is launched into space from a SpaceX rocket on November 2, 2025.

Courtesy: SpaceX | Starcloud

Nvidia-backed startup Starcloud trained an artificial intelligence model from space for the first time, signaling a new era for orbital data centers that could alleviate Earth’s escalating digital infrastructure crisis.

Last month, the Washington-based company launched a satellite carrying an Nvidia H100 graphics processing unit, sending into orbit a chip 100 times more powerful than any GPU previously flown in space. Now, the company’s Starcloud-1 satellite is running Gemma, an open large language model from Google, and querying it for responses in orbit, marking the first time an LLM has run on a high-powered Nvidia GPU in outer space, CNBC has learned.

“Greetings, Earthlings! Or, as I prefer to think of you — a fascinating collection of blue and green,” reads a message from the recently launched satellite. “Let’s see what wonders this view of your world holds. I’m Gemma, and I’m here to observe, analyze, and perhaps, occasionally offer a slightly unsettlingly insightful commentary. Let’s begin!” the model wrote.

Starcloud’s output from Gemma in space. Gemma is a family of open models built from the same technology used to create Google’s Gemini AI models.

Starcloud

Starcloud wants to show outer space can be a hospitable environment for data centers, particularly as Earth-based facilities strain power grids, consume billions of gallons of water annually and produce hefty greenhouse gas emissions. The electricity consumption of data centers is projected to more than double by 2030, according to data from the International Energy Agency.

Starcloud CEO Philip Johnston told CNBC that the company’s orbital data centers will have 10 times lower energy costs than terrestrial data centers.

“Anything you can do in a terrestrial data center, I’m expecting to be able to be done in space. And the reason we would do it is purely because of the constraints we’re facing on energy terrestrially,” Johnston said in an interview.

Johnston, who co-founded the startup in 2024, said Starcloud-1’s operation of Gemma is proof that space-based data centers can exist and operate a variety of AI models in the future, particularly those that require large compute clusters.

“This very powerful, very parameter-dense model is living on our satellite,” Johnston said. “We can query it, and it will respond in the same way that when you query a chat from a database on Earth, it will give you a very sophisticated response. We can do that with our satellite.”
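
Starcloud has not published the software stack running on Starcloud-1, but the open Gemma weights make it easy to reproduce the basic pattern on any Nvidia GPU. The sketch below is purely illustrative, assuming the Hugging Face transformers library and the public google/gemma-2b-it checkpoint rather than anything Starcloud has confirmed:

```python
# Illustrative sketch only: Starcloud's onboard serving stack is not public.
# This shows a common way to load an open Gemma checkpoint on an Nvidia GPU
# and query it via Hugging Face transformers (the checkpoint name is an example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "google/gemma-2b-it"  # an open, instruction-tuned Gemma model

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # half-precision weights fit easily on an H100
    device_map="cuda",
)

# Ask the model a question, much as Johnston describes querying the satellite.
prompt = "Where are you now?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```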

In a statement to CNBC, Google DeepMind product director Tris Warkentin said that “seeing Gemma run in the harsh environment of space is a testament to the flexibility and robustness of open models.”

In addition to Gemma, Starcloud was able to train NanoGPT, an LLM created by OpenAI founding member Andrej Karpathy, on the H100 chip using the complete works of Shakespeare. This led the model to speak in Shakespearean English.

Starcloud — a member of the Nvidia Inception program and a graduate of Y Combinator and the Google for Startups Cloud AI Accelerator — plans to build a 5-gigawatt orbital data center with solar and cooling panels measuring roughly 4 kilometers on each side. A solar array of that gigawatt scale would produce more power than the largest power plant in the U.S., and it would be substantially smaller and cheaper than a terrestrial solar farm of the same capacity, according to Starcloud’s white paper.

These data centers in space would capture constant solar energy to power next-generation AI models, unhindered by the Earth’s day and night cycles and weather changes. Starcloud’s satellites should have a five-year lifespan given the expected lifetime of the Nvidia chips on its architecture, Johnston said.

Orbital data centers would have real-world commercial and military use cases. Already, Starcloud’s systems can enable real-time intelligence and, for example, spot the thermal signature of a wildfire the moment it ignites and immediately alert first responders, Johnston said.

“We’ve linked in the telemetry of the satellite, so we linked in the vital signs that it’s drawing from the sensors — things like altitude, orientation, location, speed,” Johnston said. “You can ask it, ‘Where are you now?’ and it will say ‘I’m above Africa and in 20 minutes, I’ll be above the Middle East.’ And you could also say, ‘What does it feel like to be a satellite?’ And it will say, ‘It’s kind of a bit weird’ … It’ll give you an interesting answer that you could only have with a very high-powered model.”

Starcloud is working on customer workloads by running inference on satellite imagery from observation company Capella Space, which could help spot lifeboats from capsized vessels at sea or forest fires at a given location. The company will include several Nvidia H100 chips and integrate Nvidia’s Blackwell platform on its next satellite, launching in October 2026, to offer greater AI performance. That satellite will also feature a module running a cloud platform from cloud infrastructure startup Crusoe, allowing customers to deploy and operate AI workloads from space.

“Running advanced AI from space solves the critical bottlenecks facing data centers on Earth,” Johnston told CNBC.

“Orbital compute offers a way forward that respects both technological ambition and environmental responsibility. When Starcloud-1 looked down, it saw a world of blue and green. Our responsibility is to keep it that way,” he added.

The risks

Risks in operating orbital data centers remain, however. Analysts from Morgan Stanley have noted that orbital data centers could face hurdles such as harsh radiation, difficulty of in-orbit maintenance, debris hazards and regulatory issues tied to data governance and space traffic.

Still, tech giants are pursuing orbital data centers given the prospect of nearly limitless solar energy and greater, gigawatt-sized operations in space.

Along with Starcloud and Nvidia’s efforts, several companies have announced space-based data center missions. On Nov. 4, Google unveiled a “moonshot” initiative titled Project Suncatcher, which aims to put solar-powered satellites into space with Google’s tensor processing units. Privately owned Lonestar Data Holdings is working to put the first-ever commercial lunar data center on the moon’s surface.

OpenAI CEO Sam Altman has explored an acquisition or partnership with a rocket maker, suggesting a desire to compete against Elon Musk’s SpaceX, according to The Wall Street Journal. SpaceX is a key launch partner for Starcloud.

Referring to Starcloud’s launch in early November, Nvidia senior director of AI infrastructure Dion Harris said: “From one small data center, we’ve taken a giant leap toward a future where orbital computing harnesses the infinite power of the sun.”


Former GitLab CEO raises money for Kilo to compete in crowded AI coding market


Investors are betting there’s room for another startup using artificial intelligence to help software engineers write code faster. The difference with Kilo Code is that it counts former GitLab CEO Sid Sijbrandij among its founders.

On Wednesday, Kilo Code announced $8 million in seed funding, with backing from Breakers, Cota Capital, General Catalyst, Quiet Capital and Tokyo Black.

Sijbrandij is a self-taught developer who helped popularize GitLab’s tools for source code collaboration, deployment and testing. GitLab went public in 2021 and is valued at more than $6 billion. Sijbrandij stepped down as CEO last year to focus on cancer treatment but continued as board chair.

Since then, the technology industry has become obsessed with having large language models write and update software, a practice commonly known in Silicon Valley as vibe coding.

OpenAI co-founder Andrej Karpathy is credited with coining the term in February. OpenAI looked at buying AI coding startup Windsurf for around $3 billion, but scrapped the plan before Google hired senior Windsurf employees in a $2.4 billion transaction in July. Rival Cursor announced a $2.3 billion funding round in November at a $29.3 billion valuation.

At Microsoft, vibe coding already makes up 30% of the company’s code, CEO Satya Nadella said in April.

Sijbrandij witnessed the action and became fascinated by what AI could do for software development. In September, an acquaintance introduced him to Scott Breitenother, who started and later sold consultancy Brooklyn Data.

“I thought we were just kind of having a meet and greet, and then 25 minutes in, Sid’s like, ‘Hey, can you start next week?'” Breitenother said.

Sijbrandij contributed early capital for the startup, which now employs about 34 people across continents. Breitenother is in charge, but he talks with Sijbrandij many times a day.

Kilo Code’s software plugs into coding applications such as Cursor and Microsoft’s Visual Studio Code. It is the most widely used service on startup OpenRouter’s application programming interface, which gives developers access to a variety of AI models, including Grok Code Fast 1 from Elon Musk’s xAI. Kilo Code has processed more than 3 trillion tokens in the past month, according to OpenRouter. A single token represents about three-quarters of a word.
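
Kilo Code has not detailed its integration, but OpenRouter exposes an OpenAI-style chat completions endpoint that routes a request to whichever model the caller names. The snippet below is a minimal illustrative sketch of such a call; the API key placeholder and model slug are assumptions for demonstration, not Kilo Code’s actual configuration:

```python
# Minimal sketch of a chat completion request against OpenRouter's
# OpenAI-compatible API. The key and model slug below are placeholders,
# not anything Kilo Code has published.
import requests

OPENROUTER_API_KEY = "<your-key-here>"  # placeholder credential

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {OPENROUTER_API_KEY}"},
    json={
        "model": "x-ai/grok-code-fast-1",  # example model slug
        "messages": [
            {"role": "user", "content": "Write a SQL query joining orders to customers."}
        ],
    },
    timeout=60,
)
reply = response.json()["choices"][0]["message"]["content"]
print(reply)

# Usage is billed per token; at roughly three-quarters of a word per token,
# a 1,000-token reply works out to about 750 words.
```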

Daniël Langezaal, a software engineer at Dutch e-commerce startup Plug&Pay, said he has used Kilo Code for months after trying products from Anthropic, Cursor and Microsoft, among others. He said he appreciates Kilo Code’s support for both premium and affordable models, and he likes that people publicly contribute to the Kilo Code extension under an open-source license.

Langezaal has spread the word. About 80% of Plug&Pay’s developers now use Kilo Code, he said. It helped save time for one teammate who recently assembled a complex SQL query.

“With Kilo, it took him a day,” Langezaal said. “If he didn’t have access to Kilo, it would have taken him a few days to implement.”

GitLab, which has been testing a platform for AI agents to perform tasks, is paying attention, and was interested in what Kilo was building.

“I talked to the board,” Sijbrandij said. “We ended up deciding to do it outside of GitLab.”

GitLab included a reference to Kilo in a filing last month. The company said that it paid Kilo $1,000 in exchange for a right of first refusal for 10 business days should the startup receive an acquisition proposal before August 2026.

The market is rapidly evolving. Design software company Figma and a slew of startups now offer vibe coding options for less technical people. It’s a category Kilo Code won’t be ignoring for much longer.

“We also want to be the place for people just getting started with code,” Sijbrandij said. “We are working on an app builder that’s more like the Lovable or Bolt experience,” he said, referring to two popular startups.

Lovable, based in Sweden, announced funding at a $1.8 billion valuation in July.

WATCH: Google’s vibe-coding play
