Dr. Joshua Bederson places Precision Neuroscience’s electrodes onto a brain.
Ashley Capoot
As the lights dimmed in an operating room at The Mount Sinai Hospital in New York City, Dr. Joshua Bederson prepared to make history.
Bederson, system chair for the Department of Neurosurgery at Mount Sinai Health System, is no stranger to long hours in an operating room. The former competitive gymnast has completed more than 6,500 procedures in his career, and he said he visualizes the steps for each one as if he’s rehearsing for a routine.
On this particular morning in April, Bederson was readying for a meningioma resection case, which meant he would be removing a benign brain tumor. Bederson said his primary focus is always on caring for the patient, but in some cases, he also gets to help advance science.
This procedure was one such case.
A small crowd gathered as Bederson took his seat in the operating room, his silhouette aglow from the bright white light shining on the patient in front of him. Health-care workers, scientists and CNBC craned forward – some peering through windows – to watch as Bederson placed four electrode arrays from Precision Neuroscience onto the surface of the patient’s brain for the first time.
An electrode is a small sensor that can detect and carry an electrical signal, and an array is a grid of electrodes. Neurosurgeons use electrodes during some procedures to help monitor and avoid important parts of the brain, like areas that control speech and movement.
Precision is a three-year-old startup building a brain-computer interface, or a BCI. A BCI is a system that decodes neural signals and translates them into commands for external technologies. Perhaps the best-known company in the field is Neuralink, which is owned by Tesla and SpaceX CEO Elon Musk.
Other companies like Synchron and Paradromics have also developed BCI systems, though their goals and designs all vary. The first application of Precision’s system will be to help patients with severe paralysis restore functions like speech and movement, according to its website.
Stephanie Rider of Precision Neuroscience inspects the company’s microelectrode array
Source: Precision Neuroscience
Precision’s flagship BCI is called the Layer 7 Cortical Interface. It’s a microelectrode array that’s thinner than a human hair, and it resembles a piece of yellow Scotch tape. Each array is made up of 1,024 electrodes, and Precision says it can conform to the brain’s surface without damaging any tissue.
When Bederson used four of the company’s arrays during the surgery in April, he set a record for the most electrodes placed on the brain in real time, according to Precision. But perhaps more importantly, the arrays were able to detect signals from the patient’s individual fingers, far more detail than standard electrodes can capture.
Using Precision’s electrode array is like turning a pixelated, low-resolution image into a 4K image, said Ignacio Saez, an associate professor of neuroscience, neurosurgery and neurology at the Icahn School of Medicine at Mount Sinai. Saez and his team oversee Precision’s work with Mount Sinai.
“Instead of having 10 electrodes, you’re giving me 1,000 electrodes,” Saez told CNBC in an interview. “The depth and the resolution and the detail that you’re going to get are completely different, even though they somehow reflect the same underlying neurological activity.”
Bederson said accessing this level of detail could help doctors be more delicate with their surgeries and other interventions in the future. For Precision, the ability to record and decode signals from individual fingers will be crucial as the company works to eventually help patients restore fine motor control.
The data marks a milestone for Precision, but there’s a long road ahead before it achieves some of its loftier goals. The company is still working toward approval from the U.S. Food and Drug Administration, and it has yet to implant a patient with a more permanent version of its technology.
“I think these are little baby steps towards the ultimate goal of brain-computer interface,” Bederson told CNBC in an interview.
Inside the operating room
Dr. Joshua Bederson prepares for surgery at The Mount Sinai Hospital.
Ashley Capoot
Bederson’s surgery in April was not Precision’s first rodeo. In fact, it marked the 14th time that the company has placed its array on a human patient’s brain.
Precision has been partnering with academic medical centers and health systems to perform a series of first-in-human clinical studies. The goal of each study varies, and the company announced its collaboration with Mount Sinai in March.
At Mount Sinai, Precision is exploring different applications for its array in clinical settings, like how it can be used to help monitor the brain during surgery. In these procedures, surgeons like Bederson temporarily place Precision’s array onto patients who are already undergoing brain surgery for a medical reason.
Patients give their consent to participate beforehand.
It’s routine for neurosurgeons to map brain signals with electrodes during these types of procedures. Bederson said the current accepted practice is to use anywhere from four to almost 100 electrodes – a far cry from the 4,096 electrodes he was preparing to test.
Electrode arrays from Precision Neuroscience displayed on a table.
Ashley Capoot
Precision’s arrays are in use for a short portion of these surgeries, so CNBC joined the operating room in April once the procedure was already underway.
The patient, who asked to remain anonymous, was asleep. Bederson’s team had already removed part of their skull, which left an opening about the size of a credit card. Four of Precision’s arrays were carefully laid out on a table nearby.
Once the patient was stabilized, Precision’s employees trickled into the operating room. They helped affix the arrays in an arc around the opening on the patient’s head, and connected bundles of long blue wires at the other end to a cart full of equipment and monitors.
Dr. Benjamin Rapoport, Precision’s co-founder and chief scientific officer, quietly looked on. Every major procedure presents some risks, but the soft-spoken neurosurgeon’s calm demeanor never wavered. He told CNBC that each new case is just as exciting as the last, especially since the company is still learning.
Experts help set up the wiring for Precision Neuroscience’s technology.
Ashley Capoot
Bederson entered the operating room as Precision’s preparations neared their end. He helped make some final tweaks to the setup, and the overhead lights in the operating room were turned off.
Ongoing chatter quieted to hushed whispers. Bederson was ready to get started.
He began by carefully pulling back a fibrous membrane called the dura to reveal the surface of the brain. He laid a standard strip of electrodes onto the tissue for a few minutes, and then it was time to test Precision’s technology.
Using a pair of yellow tweezers called long bayonet forceps, Bederson began placing all four of Precision’s electrode arrays onto the patient’s brain. He positioned the first two arrays with ease, but the last two proved slightly more challenging.
Bederson was working with a small section of brain tissue, which meant the arrays needed to be angled just right to lay flat. For reference, imagine arranging the ends of four separate tape measures within a surface area roughly the size of a rubber band. It took a little reconfiguring, but after a couple of minutes, Bederson made it happen.
Real-time renderings of the patient’s brain activity swept across Precision’s monitors in the operating room. All four arrays were working.
In an interview after the surgery, Bederson said it was “complicated” and “a little bit awkward” to place all four arrays at once. From a design perspective, he said, two arrays with twice as many points of contact, or longer arrays with greater spacing, would have been helpful.
Bederson compared the arrays to spaghetti, and the description was apt. From where CNBC was watching, it was hard to tell where one stopped and the next began.
Once all the arrays were placed and actively detecting signals, Precision’s Rapoport stood with his team by the monitors to help oversee data collection. He said the research is the product of a true team effort from the company, the health system and the patient, who often doesn’t get to see the benefits of the technology at this stage.
“It takes a village to make this sort of thing move forward,” Rapoport said.
CNBC left the operating room as Bederson began removing the tumor, but he said the case went well. The patient woke up afterward with some weakness in their foot, since the surgery took place in the part of the brain that controls it, but Bederson said he expected the foot to recover in three to four weeks.
Employees from Precision Neuroscience collecting data.
Ashley Capoot
Rapoport was present at this particular surgery because of his role with Precision, but he’s well acquainted with the operating rooms at Mount Sinai.
Rapoport is a practicing surgeon and serves as an assistant professor of neurosurgery at the Icahn School of Medicine at Mount Sinai. Rapoport reports to Bederson, and Bederson said the pair have known one another since Rapoport was in residency at Weill Cornell Medicine.
Dr. Thomas Oxley, the CEO of the competing BCI company Synchron, is also a faculty member under Bederson. Synchron has built a stent-like BCI that can be inserted through a patient’s blood vessels. As of early February, the company had implanted its system into 10 human patients. It is also working toward FDA approval.
Bederson has an equity stake in Synchron, but he told CNBC he hadn’t realized how much it would prevent him from participating in research with the Synchron team. He has no financial stake in Precision.
“I really did not want to have any financial interest in Precision because I think it has an equally promising future and wanted to advance the science as fast as I could,” Bederson said.
Rapoport also co-founded Musk’s Neuralink in 2017, though he departed the company the following year. Neuralink is building a BCI designed to be inserted directly into the brain tissue, and the company recently received approval to implant its system in a second human patient, according to a report from The Wall Street Journal on Monday.
As the BCI industry heats up, Bederson said the amount that scientists understand about the brain is poised to “explode” over the next several years. Companies like Precision are just getting started.
Dr. Joshua Bederson helps set up Precision Neuroscience’s electrode arrays.
Ashley Capoot
“I really feel like the future is where the excitement is,” Bederson said.
Rapoport said Precision is hoping to receive FDA approval for the wired version of its system “within a few months.” This version, which is what CNBC saw in the operating room, would be for use in a hospital setting or monitored care unit for up to 30 days at a time, he said.
Precision’s permanent implant, which will transmit signals wirelessly, will go through a separate approval process with the FDA.
Rapoport said Precision hopes to implant “a few dozen” patients with the wired version of its technology by the end of the year. That data collection would give the company a “very high level of confidence” in its ability to decode movement and speech signals in real-time, he said.
“Within a few years, we’ll have a much more advanced version of the technology out,” Rapoport said.
Okta on Tuesday topped Wall Street’s third-quarter estimates and issued an upbeat outlook, but shares fell as the company did not provide guidance for fiscal 2027.
Shares of the identity management provider fell more than 3% in after-hours trading on Tuesday.
Here’s how the company did versus LSEG estimates:
Earnings per share: 82 cents adjusted vs. 76 cents expected
Revenue: $742 million vs. $730 million expected
Unlike in previous third-quarter reports, Okta refrained from offering preliminary guidance for the upcoming fiscal year. Finance chief Brett Tighe cited seasonality in the fourth quarter, and said providing guidance would require “some conservatism.”
Okta released a capability that allows businesses to build AI agents and automate tasks during the third quarter.
CEO Todd McKinnon told CNBC that upside from AI agents hasn’t been fully baked into results and could exceed Okta’s core total addressable market over the next five years.
“It’s not in the results yet, but we’re investing, and we’re capitalizing on the opportunity like it will be a big part of the future,” he said in a Tuesday interview.
Revenues increased almost 12% from $665 million in the year-ago period. Net income increased 169% to $43 million, or 24 cents per share, from $16 million, or breakeven, a year ago. Subscription revenues grew 11% to $724 million, ahead of a $715 million estimate.
For the current quarter, the cybersecurity company expects revenues between $748 million and $750 million and adjusted earnings of 84 cents to 85 cents per share. Analysts forecast $738 million in revenues and EPS of 84 cents for the fourth quarter.
Remaining performance obligations, or the company’s subscription backlog, rose 17% from a year ago to $4.29 billion, surpassing a $4.17 billion estimate from StreetAccount.
This year has been a blockbuster period for cybersecurity companies, with major acquisition deals from the likes of Palo Alto Networks and Google and a raft of new initial public offerings from the sector.
Marvell Technology Group Ltd. headquarters in Santa Clara, California, on Sept. 6, 2024.
David Paul Morris | Bloomberg | Getty Images
Semiconductor company Marvell on Tuesday announced that it will acquire Celestial AI for at least $3.25 billion in cash and stock.
The purchase price could increase to $5.5 billion if Celestial hits revenue milestones, Marvell said.
Marvell shares rose 13% in extended trading Tuesday as the company reported third-quarter earnings that beat expectations and said on the earnings call that it expected data center revenue to rise 25% next year.
The deal is an aggressive move by Marvell to acquire technology complementary to its semiconductor networking business. Adding Celestial could enable Marvell to sell more chips and parts to companies that are committing to spend hundreds of billions of dollars on AI infrastructure.
Marvell stock is down 18% so far in 2025 even as semiconductor rivals like Broadcom have seen big valuation increases driven by excitement around artificial intelligence.
Celestial is a startup focused on developing optical interconnect hardware, which it calls a “photonic fabric,” to connect high-performance computers. Celestial was reportedly valued at $2.5 billion in March in a funding round, and Intel CEO Lip-Bu Tan joined the startup’s board in January.
Optical connections are becoming increasingly important because the most advanced AI systems need those parts to tie together dozens or hundreds of chips so they can work as one to train and run the biggest large language models.
Currently, many AI chip connections are done using copper wires, but newer systems are increasingly using optical connections because they can transfer more data faster and enable physically longer cables. Optical connections also cost more.
“This builds on our technology leadership, broadens our addressable market in scale-up connectivity, and accelerates our roadmap to deliver the industry’s most complete connectivity platform for AI and cloud customers,” Marvell CEO Matt Murphy said in a statement.
Marvell said that the first application of Celestial technology would be to connect a system based on “large XPUs,” which are custom AI chips usually made by the companies investing billions in AI infrastructure.
Marvell also said Tuesday that, based on customer traction, it expects Celestial’s optical technology to soon be integrated into custom AI chips and related parts called switches.
Amazon Web Services Vice President Dave Brown said in a statement that Marvell’s acquisition of Celestial will “help further accelerate optical scale-up innovation for next-generation AI deployments.”
The maximum payout for the deal will be triggered if Celestial can record $2 billion in cumulative revenue by the end of fiscal 2029. The deal is expected to close early next year.
In its third-quarter report on Tuesday, Marvell posted earnings of 76 cents per share on $2.08 billion in sales, versus LSEG expectations of 73 cents on $2.07 billion in sales. Marvell said it expects fourth-quarter revenue of $2.2 billion, slightly above LSEG’s forecast of $2.18 billion.
Amazon Web Services’ two-track approach to artificial intelligence came into better focus Tuesday as the world’s biggest cloud provider pushed forward with its own custom chips and moved closer to Nvidia.

During Amazon’s annual AWS re:Invent 2025 conference in Las Vegas, AWS CEO Matt Garman unveiled Trainium3, the latest version of the company’s in-house custom chip. It has four times the compute performance, energy efficiency and memory bandwidth of previous generations, and AWS said early customer tests of Trainium3 have cut AI training and inference costs by up to 50%.

Custom chips like Trainium are becoming more and more popular among the big tech companies that can afford to make them, and their use cases are broadening. Google’s tensor processing units (TPUs), co-designed by Broadcom, have been getting a lot of attention since last month’s launch of the well-received Gemini 3 artificial intelligence model, which is powered by TPUs. There was even a report that Meta Platforms was considering TPUs in addition to Nvidia’s graphics processing units (GPUs), the gold standard for all-purpose AI workloads.

At the same time, Amazon announced that it’s deepening its work with Nvidia. In Tuesday’s keynote, Garman introduced AWS Factories, which provides on-premises AI infrastructure for customers to use in their own data centers. The service combines Trainium accelerators with Nvidia GPUs, giving customers access to Nvidia’s accelerated computing platform, full-stack AI software and GPU-accelerated applications.

By offering both options, Amazon aims to keep expanding AWS cloud capacity and, in turn, revenue growth, to stay on top amid intense competition from Microsoft’s Azure and Alphabet’s Google Cloud, the second- and third-place horses in the AI race by revenue.
Earlier this year, investors were concerned when second-quarter AWS revenue growth failed to keep pace with its closest competitors. With late October’s release of third-quarter results, Amazon went a long way toward putting those worries to rest. “AWS is growing at a pace we haven’t seen since 2022, re-accelerating to 20.2% YoY,” Amazon CEO Andy Jassy said at the time, adding, “We’ve been focused on accelerating capacity — adding more than 3.8 gigawatts (GW) in the past 12 months.”

Tuesday’s announcements come at a pivotal time for AWS as it tries to rapidly expand its computing capacity after a year of supply constraints that put a lid on cloud growth. As welcome as more efficient chips are, they don’t make up for the capacity shortfall the company faces as AI adoption ramps up, which is why added gigawatts of capacity are what Wall Street is laser-focused on.

Fortunately, Wall Street argues that the capacity headwind should flip to a tailwind. Wells Fargo said Trainium3 is “critical to supplementing Nvidia GPUs and CPUs in this capacity build” to close the gap with rivals. In a note to investors on Monday, the firm’s analysts estimated Amazon will add more than 12 gigawatts of compute by year-end 2027, boosting total AWS capacity enough to support as much as $150 billion in incremental annual AWS revenue if demand remains strong. In a separate note Monday, Oppenheimer said AWS has already proven its ability to expand capacity, which has doubled since 2022, and Amazon plans to double it again by 2027. Those analysts said such an expansion could translate to 14% upside to 2026 AWS revenue and 22% upside in 2027, noting that each incremental gigawatt of compute added in recent quarters has translated to roughly $3 billion of annual cloud revenue.
Bottom line

While new chips are welcome news that helps AWS push deeper into the AI chip race, investors are more focused on Amazon’s investment in capacity and when that capacity will come online, because that is how the company will fulfill demand. The issue is not demand; it’s supply. We are confident in AWS’ ability to add the capacity. In fact, there may be no company in the world better suited to a logistics problem of this scale than Amazon.

Amazon shares surged nearly 14% to $254 each in the two sessions following the cloud and e-commerce giant’s Oct. 30 earnings print. The stock has since given back those gains and then some. As of Tuesday’s close, shares were up 6.5% year to date, a laggard among its “Magnificent Seven” peers and behind the S&P 500’s roughly 16% advance in 2025.

(Jim Cramer’s Charitable Trust is long AMZN and NVDA. See here for a full list of the stocks.)