Elon Musk’s Neuralink received approval last week from the US Food and Drug Administration to conduct human clinical trials, which one former FDA official called “really a big deal.” I do not disagree, but I am skeptical that this technology will “change everything.” Not every profound technological advance has broad social and economic implications.

With Neuralink’s system, a robot surgically implants a device in the brain that can then decode some brain activity and connect the brain signals to computers and other machines. A person paralyzed from the neck down, for example, could use the interface to manipulate her physical environment, as well as to write and communicate.

This would indeed be a breakthrough — for people with paralysis or traumatic brain injuries. For others, I am not so sure. For purposes of argument, as there are many companies working in this space, assume this technology works as advertised. Who exactly will want to use it?

One fear is that the brain-machine connections will be expensive and that only the wealthy will be able to afford them. These people will become a new class of “super-thinkers,” lording over us with their superior intellects.

I do not think that this scenario is likely. If I were offered $100 million for a permanent brain-computer connection, I would not accept it, if only because of fear of side effects and possible neurological damage. And I would want to know for sure that the nexus of control goes from me to the computer, not vice versa.

Besides, there are other ways of augmenting my intelligence with computers, most notably the recent AI innovations. It is true that I can think faster than I can speak or type, but I am just not in that much of a hurry. I would rather learn how to type on my phone as fast as a teenager does.

A related vision of direct brain-computer interface is that computers will be able to rapidly inject useful knowledge into our brains. Imagine going to bed, turning on your brain device, and waking up knowing Chinese. Sounds amazing — yet if that were possible, so would all sorts of other scenarios, not all of them benign, where a computer can alter or control our brains.

I also view this scenario as remote — unlike using your brain to manipulate objects, it seems true science fiction. Current technologies read brain signals but do not control them.

Another vision for this technology is that the owners of computers will want to “rent out” the powers of human brains, much the way companies rent out space today in the cloud. Software programs are not good at some skills, such as identifying unacceptable speech or images. In this scenario, the connected brains come largely from low-wage laborers, just as both social media companies and OpenAI have used low-wage labor in Kenya to grade the quality of output or to help make content decisions.

Those investments may be good for raising the wages of those people. Many observers may object, however, that a new and more insidious class distinction will have been created — between those who have to hook up to machines to make a living, and those who do not.

Might there be scenarios where higher-wage workers wish to be hooked up to the machine? Wouldn’t it be helpful for a spy or a corporate negotiator to receive computer intelligence in real-time while making decisions? Would professional sports allow such brain-computer interfaces? They might be useful in telling a baseball player when to swing and when not to.

The more I ponder these options, the more skeptical I become about large-scale uses of brain-computer interfaces for the non-disabled. Artificial intelligence has been progressing at an amazing pace, and it doesn’t require any intrusion into our bodies, much less our brains. There are always earplugs and some future version of Google Glass.

The main advantage of the direct brain-computer interface seems to be speed. But extreme speed is important in only a limited class of circumstances, many of them competitions and zero-sum endeavors, such as sports and games.

Of course, companies such as Neuralink may prove me wrong. But for the moment I am keeping my bets on artificial intelligence and large language models, which sit a comfortable few inches away from me as I write this. 

© 2023 Bloomberg LP



Science

Scientists Chase Falling Satellite to Study Atmospheric Pollution from Spacecraft Reentries


Scientists took advantage of the spectacular airborne chase of a falling satellite to gather rare data on atmospheric pollution from burnt-up spacecraft. In September 2024, a group of European researchers boarded an aeroplane outfitted with 26 cameras and flew into the night sky to watch the satellite Cluster Salsa make its fiery return to Earth over the Pacific Ocean. The mission, which departed from Easter Island, sought the chemical byproducts released during the brief, meteor-like reentry. Despite glare from bright natural light that impeded a clear view, the researchers captured the first-ever images of a satellite fracturing and releasing chemicals as it fell to Earth.

Satellite Reentries May Impact Ozone and Climate, Scientists Warn

As per the report presented at the European Conference on Space Debris, the reentry produced lithium, potassium, and aluminium emissions — elements with the potential to affect the ozone layer and Earth’s climate. Stefan Löhle of the University of Stuttgart noted that the satellite’s faint trail indicated that pieces splintered off and burned less fiercely than predicted. The satellite began to disintegrate at about 80 kilometres above sea level, and observations ceased at a height of around 40 kilometres, where the trail faded from view.

Such events are increasingly important to monitor as satellite reentries grow in frequency. Although spacecraft such as those in SpaceX’s Starlink fleet are made to burn up completely, surviving debris and dust particles could still affect the upper atmosphere, scientists caution. The aluminum oxide from the melting satellites, for example, could be involved in long-term atmospheric effects, such as changes in thermal balance and ozone destruction.

This mission marks only the fifth time a spacecraft reentry has been observed from the air. Researchers hope to align their collected data with computer models to estimate how much mass satellites lose during disintegration and how that mass interacts chemically with the atmosphere. The data also suggest that some titanium components from the 550-kilogram Cluster Salsa may have survived reentry and landed in the Pacific Ocean.

As more satellites return to Earth, researchers plan to repeat the chase with Salsa’s sister satellites—Rumba, Tango, and Samba—expected to re-enter by 2026. Despite daytime limitations affecting some measurement techniques, these missions may help clarify how spacecraft pollution influences Earth’s upper atmosphere and climate.

For the latest tech news and reviews, follow Gadgets 360 on X, Facebook, WhatsApp, Threads and Google News. For the latest videos on gadgets and tech, subscribe to our YouTube channel. If you want to know everything about top influencers, follow our in-house Who’sThat360 on Instagram and YouTube.


Science

NASA Stacks Artemis 2 Second Stage While the Future of SLS Remains Uncertain


NASA’s Artemis 2 mission has reached a major milestone: the rocket’s second stage, the Interim Cryogenic Propulsion Stage (ICPS), has been stacked. Technicians at Kennedy Space Centre in Florida mounted the ICPS atop the SLS rocket inside the Vehicle Assembly Building on May 1. The upper stage will propel NASA’s Orion spacecraft and its four-person crew — three NASA astronauts and one Canadian — out of Earth orbit and onto a free-return path around the moon, marking NASA’s return to deep space exploration.

NASA Advances Artemis 2 Moon Mission as Future of SLS and Orion Faces Uncertainty

As per NASA’s announcement, the ICPS arrived at the VAB last month and was hoisted into position atop the rocket stage adapter. The stage is critical for carrying the crew past low Earth orbit during the 10-day Artemis 2 mission. Images shared by NASA show the second stage being lowered into place, while the Orion spacecraft and service module, delivered this week by Lockheed Martin, await integration. Exploration Ground Systems will process the Orion module before it is joined to the rest of the launch vehicle.

Artemis 2 follows Artemis 1, which launched uncrewed in 2022 and revealed issues with Orion’s heat shield that delayed future missions. The Artemis 2 crew will fly a lunar pass rather than enter lunar orbit. The mission’s success will be vital in opening the path for Artemis 3, currently set for 2027, in which astronauts would land on the moon using a SpaceX Starship lander.

Even as development continues, uncertainty surrounds the program’s long-term fate. A 2026 budget proposal released May 2 suggests ending the SLS and Orion programs after Artemis 3. If enacted, the mission currently under assembly may be among the final uses of the massive launch vehicle, which was designed to carry humans beyond low Earth orbit.

Artemis 2 nonetheless continues steadily towards launch readiness. Though program objectives may shift, NASA’s efforts to prepare the SLS and Orion spacecraft reflect a broader aim of maintaining a continuous lunar presence, a step towards eventual Mars exploration.

Science

What Happens in Your Brain When You Read? New Study Maps the Reading Mind


A recent study published in April 2025 in Neuroscience & Biobehavioral Reviews provides an in-depth look at how the brain understands written language. Conducted by researchers at the Max Planck Institute for Human Cognitive and Brain Sciences, the study drew on 163 neuroimaging studies to probe the neural mechanisms behind reading. This comprehensive analysis shows how different areas of the brain, mainly left-hemispheric regions and the cerebellum, work in synchronisation to process different kinds of written content.

How the Brain Handles Letters to Full Texts

Sabrina Turker, Philip Kuhnke, Gesa Hartwigsen and Beatrice Fumagalli, the researchers involved in the study, found that specific brain areas are activated depending on the type of reading. Reading single letters activated a single cluster in the left occipital cortex, whereas words, sentences and paragraphs engaged a broader network across the left hemisphere. Reading pseudowords recruited distinct areas, indicating that the brain processes unfamiliar letter strings differently from known language.

Silent vs. Aloud Reading: What’s the Difference?

A major finding of the research is the difference in brain activity between overt (aloud) and covert (silent) reading. Reading aloud engages regions linked to sound and movement, whereas silent reading recruits more complex multiple-demand areas. According to the researchers, silent reading draws on greater mental resources than reading aloud.

Explicit vs. Implicit Reading Tasks

The study also explored how the brain responds to explicit reading tasks, such as silent word reading, compared with implicit ones, such as lexical decision tasks. Explicit reading produced stronger activation in regions such as the cerebellar cortices and the left orbitofrontal cortex, whereas implicit reading activated the inferior frontal regions on both sides, together with insular regions.

Why This Matters

The insights from the study could help support individuals who face reading challenges. By understanding how silent and aloud reading engage the brain differently, educators and clinicians can better tailor interventions for disorders such as dyslexia.


