A breakthrough in solar research has been achieved using NASA’s supercomputing technology, revealing new insights into the intricate inner workings of the Sun. The simulations, developed by NASA’s Ames Research Center, showcase turbulent motions within the Sun’s upper layers, using data collected from various Sun-observing spacecraft. These findings aim to enhance understanding of solar activity and its effects on space weather.

Advanced Techniques Reveal Fine Solar Structures

The animated simulations display the vigorous twisting and churning of solar plasma, resembling the chaotic flow of boiling water. The model demonstrates how material moves within the Sun’s layers, bringing new clarity to solar dynamics. Dr. Irina Kitiashvili, a leading scientist at NASA Ames, explained that these simulations take a “realistic approach,” using advanced knowledge of solar plasma to replicate phenomena observed by NASA’s Solar Dynamics Observatory.

The research focuses on recreating detailed structures of the Sun’s subsurface layers, capturing features such as shock waves and tornado-like phenomena. These elements, spanning only a few miles, represent details previously unattainable through spacecraft observations alone. However, global models of the Sun remain beyond current computational capabilities. Instead, smaller regions are modelled to yield a deeper understanding of specific dynamics.

The Sun’s activity significantly impacts Earth, influencing seasons, weather, and space weather patterns. Accurate space weather forecasts are critical for safeguarding astronauts and spacecraft, especially during missions such as NASA’s Artemis campaign. The NASA Parker Solar Probe, set to make a record-breaking approach to the Sun in December 2024, will further support these efforts.

Exploring New Frontiers in Solar Research

The simulations were run on the Pleiades supercomputer at NASA’s Advanced Supercomputing facility, generating extensive data over several weeks. As the Sun approaches its solar maximum period, researchers anticipate uncovering additional phenomena, enhancing predictions of solar behaviour.

Earth’s Spin to Speed Up Briefly, Causing Shorter Days This Summer

Reports indicate that for three days this summer – July 9, July 22 and August 5 – Earth’s rotation will speed up slightly, trimming 1.3 to 1.5 milliseconds off each day. Imperceptible in everyday life, the shift underscores how the Moon’s position influences our planet’s spin. For reference, the shortest day on record, according to timeanddate.com, was July 5, 2024, which ran 1.66 milliseconds short of 24 hours. Over billions of years Earth’s days have gradually lengthened as its rotation slowed, but recent data show brief speed-ups. Scientists say monitoring these tiny changes matters for understanding Earth’s dynamics and for timekeeping.

Causes of Faster Spin

The acceleration is largely driven by the Moon’s gravity. On those dates (July 9, July 22 and August 5), the Moon will lie far north or south of Earth’s equator, weakening its tidal braking on our planet’s spin. As a result, Earth rotates a bit faster – like spinning a top held at its ends. Seasonal shifts in mass distribution also affect rotation: Richard Holme of the University of Liverpool notes that summer vegetation growth and melting snow in the Northern Hemisphere move mass away from Earth’s axis, slowing the spin in the same way an ice skater slows by extending her arms.
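The ice-skater comparison is conservation of angular momentum; as a rough worked relation, under an idealised rigid-body assumption (the numbers below are illustrative, not a geophysical model):

```latex
% Illustrative only: idealised rigid-body angular-momentum balance.
% With moment of inertia I and spin rate \Omega, L = I\Omega is conserved:
\[
  I\,\Omega = \text{const}
  \quad\Longrightarrow\quad
  \frac{\Delta\Omega}{\Omega} = -\frac{\Delta I}{I},
  \qquad
  \frac{\Delta T}{T} = \frac{\Delta I}{I} \quad (T = 2\pi/\Omega).
\]
% Mass moving away from the spin axis (summer foliage, meltwater) raises I
% and lengthens the day T; a 1.4 ms change in an 86,400 s day is a
% fractional change of roughly 1.6 \times 10^{-8}.
```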

Timekeeping and Technology

Shifts in day length are handled by precise timekeeping. The International Earth Rotation and Reference Systems Service (IERS) monitors Earth’s spin and adds leap seconds to keep Coordinated Universal Time (UTC) in sync with solar time. Normally a second is added when Earth’s rotation slows, but if the spin-up trend continues, scientists have floated a “negative leap second” – removing a second – to realign clocks.

Dr. Michael Wouters of Australia’s National Measurement Institute says this fix would be unprecedented, and notes that even if a few seconds accumulated over decades, it would likely go unnoticed. Dr. David Gozzard of the University of Western Australia points out that GPS satellites, communications networks and power grids rely on atomic clocks synced to nanoseconds, and that millisecond-scale changes in Earth’s rotation are easily absorbed by these systems.
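To put those millisecond figures in context, here is a minimal sketch (illustrative numbers and a simplified rule, not IERS procedure) of how daily length-of-day offsets would accumulate into the UT1 minus UTC difference on which leap-second decisions are based:

```python
# Minimal sketch (illustrative values, not IERS procedure): how small daily
# length-of-day offsets accumulate into the UT1 - UTC difference that the
# IERS monitors when deciding on leap seconds.

LEAP_SECOND_BOUND = 0.9  # seconds; IERS keeps |UT1 - UTC| below this

def ut1_minus_utc_drift(lod_excess_ms):
    """Accumulate UT1 - UTC (in seconds) from daily length-of-day excesses.

    lod_excess_ms[i] is how much longer day i ran than 86,400 SI seconds,
    in milliseconds. A shorter-than-nominal day (negative excess) pushes
    UT1 ahead of UTC, so the drift grows positive.
    """
    drift_s = 0.0
    for excess_ms in lod_excess_ms:
        drift_s -= excess_ms / 1000.0
    return drift_s

# Hypothetical scenario: a full year in which every day runs 1.4 ms short,
# roughly the speed-up reported for July 9, July 22 and August 5.
drift = ut1_minus_utc_drift([-1.4] * 365)
print(f"UT1 - UTC after one year: {drift:+.3f} s")               # about +0.511 s
print("Negative leap second needed:", drift >= LEAP_SECOND_BOUND)  # False
```

Even a whole year of days running 1.4 milliseconds short would accumulate only about half a second of drift, which is why any negative leap second would follow years of monitoring rather than a single fast summer.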

James Webb Telescope Spots Rare ‘Cosmic Owl’ Formed by Colliding Galaxies

NASA’s James Webb Space Telescope has captured the “Cosmic Owl,” a startling owl-faced pair of colliding ring galaxies. The double-ring structure is exceptionally rare: ring galaxies account for just 0.01% of known galaxies, and a pair of them caught in collision is almost unheard of. The JWST image provides a natural laboratory for studying galaxy evolution. Models suggest the galactic clash began roughly 38 million years ago, meaning the owl-like shape could persist for a long time. A team led by Ph.D. student Mingyu Li of Tsinghua University in China announced the finding.

Spotting the ‘Cosmic Owl’

Mingyu Li, the first author of the new study, says he and his team found the Owl by combing through public JWST data from the COSMOS field. The twin ring galaxies jumped out thanks to JWST’s infrared imaging. Each ring is about 26,000 light-years across (roughly a quarter of the Milky Way’s diameter), and each harbors a supermassive black hole at its core – one of the Owl’s “eyes”.

JWST images show the collision interface – the Owl’s beak – ablaze with activity. ALMA observations find a huge clump of molecular gas there – the raw fuel for new stars – being squeezed by the impact. Radio observations show a jet from one galaxy’s black hole slamming into the gas. Li notes the shockwave-plus-jet have ignited an intense starburst, turning the beak into a stellar nursery.

Rarity and Significance

Ring galaxies are extremely rare (≈0.01% of all galaxies), so finding two in collision is unheard of. Another team independently identified the same system and called it the “Infinity Galaxy”. Li says this event is an exceptional natural laboratory for studying galaxy evolution. In one view, researchers can see black holes feeding, gas compressing and starbursts happening together.

Li points out the collision’s shockwave and jet have triggered an intense starburst in the beak. He says this may be a crucial way to turn gas into stars rapidly, which could help explain how young galaxies built up their mass so quickly. Simulations will clarify the precise collision conditions needed to produce such a rare twin-ring “owl” shape.

MIT Develops Low-Resource AI System to Control Soft Robots with Just One Image

Conventional rigid robots used in industry and hazardous environments are relatively easy to model and control, but they are too stiff to operate in confined spaces or on uneven terrain. Soft, bio-inspired robots adapt better to their surroundings and can manoeuvre into otherwise inaccessible places, yet such flexibility has typically required an array of on-board sensors and bespoke models tailored to each robot design. Taking a far less resource-demanding approach, researchers at MIT have developed a simpler deep-learning control system that teaches soft, bio-inspired robots to follow commands from a single image.

Soft Robots Learn from a Single Image

As per Phys.org, the research has been published in the journal Nature. By training a deep neural network on two to three hours of multi-view images of various robots executing random commands, the scientists taught the network to reconstruct a robot’s shape and range of motion from a single image. Previous machine-learning control designs required customised and costly motion-capture systems, and the lack of a general-purpose control system limited applications and made prototyping less practical.

The method decouples robot hardware design from the need to model it manually, a requirement that has so far dictated precision manufacturing, extensive sensing, costly materials and a reliance on conventional, rigid building blocks.
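The article does not detail the network itself, so the following is only a minimal sketch of the general idea: a generic convolutional encoder that maps one camera frame to a low-dimensional estimate of the robot’s configuration, with a simple proportional controller on top. The class names, layer sizes and training details are hypothetical assumptions, not MIT’s actual architecture.

```python
# Hypothetical sketch of a vision-only control loop in the spirit described
# above: a CNN encoder maps one camera image to an estimate of the robot's
# configuration, and a simple controller turns the error into a command.
# Names, sizes and setup are illustrative assumptions, not the MIT system.

import torch
import torch.nn as nn

class SingleImageStateEstimator(nn.Module):
    """Maps a single RGB frame to a low-dimensional configuration estimate."""

    def __init__(self, state_dim: int = 16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(128, state_dim)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(image))

def control_step(model, image, target_state, gain: float = 0.5):
    """One vision-only control step: estimate the current configuration from
    a single frame, then return a proportional command toward the target."""
    with torch.no_grad():
        estimated_state = model(image.unsqueeze(0)).squeeze(0)
    return gain * (target_state - estimated_state)

# Usage with dummy data: one 128x128 RGB frame and a 16-D target pose.
model = SingleImageStateEstimator(state_dim=16)
frame = torch.rand(3, 128, 128)
target = torch.zeros(16)
command = control_step(model, frame, target)
print(command.shape)  # torch.Size([16])
```

In the reported system, the hours of multi-view footage are what let the network learn how commands actually move each robot; the sketch above only shows the inference-time, single-image control loop.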

AI Cuts Costly Sensors and Complex Models

The single-camera machine-learning approach delivered high-precision control in tests on a variety of robotic systems, including a 3D-printed pneumatic hand, a 16-DOF Allegro hand, a soft auxetic wrist and a low-cost Poppy robot arm.

Because the system relies on vision alone, it may not suit more dexterous tasks that require contact sensing and tactile dynamics, and performance may degrade where visual cues are insufficient.

The researchers suggest that adding sensors and tactile materials could enable the robots to perform more varied and complex tasks. There is also potential to automate control of a wider range of robots with minimal or no embedded sensors.
