Saturday, August 31, 2024

By Watching the Sun, Astronomers are Learning More about Exoplanets

The hammer throwers at the recent Olympics gave a wonderful demonstration of the radial velocity method that astronomers use to detect exoplanets. As the hammer spins around the athlete, their body and head bob back and forth as the weight of the hammer tugs on them. In the same way, we can detect the wobble of a star caused by the gravity of planets in orbit around it. Local variations in stars can add noise to the data, but a team of researchers has been studying the Sun to help next-generation telescopes detect more Earth-like planets.

To date, 5,288 exoplanets have been discovered: 5,288 planets in orbit around stars other than the Sun. Before 1992 we had no evidence of planetary systems around other stars. Since then, astronomers have used various methods to detect more and more of these alien worlds, ranging from monitoring starlight for tiny dips in brightness to studying the spectra of stars. Just over 1,000 exoplanets have been discovered using the radial velocity technique, making it one of the most successful methods.

“Icy and Rocky Worlds” is a new exoplanet infographic from Martin Vargic, an artist and space enthusiast from Slovakia. It’s available as a wall poster on his website. Image Credit and Copyright: Martin Vargic.

Local variations in the properties of stars have made it difficult to find smaller planets using the radial velocity technique, but a team of astronomers led by Eric B. Ford of the Department of Astronomy and Astrophysics at Penn State University has just published a report of their findings from observations of the Sun. Observations made between January 2021 and June 2024 with the NEID solar spectrograph at the WIYN Observatory informed their study.

A gas giant exoplanet [right] with the density of a marshmallow has been detected in orbit around a cool red dwarf star [left] by the NASA-funded NEID radial-velocity instrument on the 3.5-meter WIYN Telescope at Kitt Peak National Observatory, a Program of NSF’s NOIRLab. The planet, named TOI-3757 b, is the fluffiest gas giant planet ever discovered around this type of star.

Across the 3 years and 5 months of observations, the team identified 117,600 observations unlikely to have been contaminated by weather, hardware, or calibration issues, which could therefore be used for the study. Because the Earth-Sun distance, and hence the radial velocity expected from Earth's orbital motion, is precisely known, any remaining signal in the solar observations measures solar variability itself.

Impressively, the team has shown that the NEID instrumentation can measure the radial velocity of the Sun to a precision of 0.489 m/s. Using these data, the team concludes that the Scalpels algorithm, a technique that analyses the shapes of spectral lines to separate line-shape distortions from true Doppler shifts, performs particularly well: it can reduce the root mean square (a standard measure of signal amplitude) of the solar radial velocity from over 2 m/s down to 0.277 m/s.

These results are significantly better than previous efforts at removing solar variability from radial velocity observations. They suggest that the next generation of radial velocity instruments is capable, at least technically, of detecting Earth-mass planets orbiting stars like the Sun. That does of course require sufficient observing time, which the team estimates at about 103 nights of observations.
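To put those numbers in perspective, an Earth-mass planet in a one-year orbit tugs a Sun-like star back and forth at only about 0.09 m/s, roughly a third of even the post-Scalpels noise floor. The short Python sketch below evaluates the standard radial velocity semi-amplitude formula; it is an illustration of the scale of the problem, not part of the team's analysis pipeline.

```python
# Radial-velocity semi-amplitude K of a star due to an orbiting planet.
# Illustrative only -- not the NEID team's analysis code.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
M_EARTH = 5.972e24   # kg
YEAR = 3.156e7       # s

def rv_semi_amplitude(m_planet, m_star, period, e=0.0, sin_i=1.0):
    """K in m/s for a planet of mass m_planet around a star of mass m_star."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet * sin_i
            / (m_star + m_planet) ** (2 / 3)
            / math.sqrt(1 - e ** 2))

K = rv_semi_amplitude(M_EARTH, M_SUN, YEAR)
print(f"Earth around the Sun: K = {K:.3f} m/s")               # ~0.089 m/s
print(f"fraction of 0.277 m/s noise floor: {K / 0.277:.2f}")  # ~0.32
```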

Source : Earths within Reach: Evaluation of Strategies for Mitigating Solar Variability using 3.5 years of NEID Sun-as-a-Star Observations

The post By Watching the Sun, Astronomers are Learning More about Exoplanets appeared first on Universe Today.



Estimating the Basic Settings of the Universe

The Standard Model of cosmology describes how the Universe has evolved at the largest scales. Six numbers define the model, and a team of researchers has used them to build simulations of the Universe. The results of those simulations were then fed to a machine learning algorithm to train it, before it was set the task of estimating five of the cosmological parameters, a task it completed with remarkable precision.

The Standard Model incorporates a number of elements: the Big Bang, dark energy, cold dark matter, ordinary matter, and the cosmic microwave background radiation. It works well to describe the large-scale structure of the Universe, but there are gaps in our understanding. Quantum physics can describe the Universe at small scales but struggles with gravity, and there are open questions around dark matter and dark energy too. Understanding these would help us grasp the evolution and structure of the Universe.

Enlarged region of the Saraswati Supercluster, one of the largest known structures in the Universe, showing the distribution of galaxies. Credit: IUCAA

A team of researchers from the Flatiron Institute has managed to extract hidden information in the distribution of galaxies to estimate the values of five of these parameters, with an accuracy that greatly improves on previous attempts. Using AI, the team obtained less than half the uncertainty in the parameter that describes the clumpiness of the Universe compared with earlier analyses, and their estimates of the other parameters also closely matched observations. The paper was published in Nature Astronomy on 21 August.

The team generated 2,000 simulated universes, each with carefully specified cosmological parameters, including the expansion rate and the amounts and clumpiness of ordinary matter, dark matter and dark energy. The output of each simulation was compressed into manageable data sets, which were then compared against data from more than one hundred thousand real galaxies. From this, the researchers could estimate the parameters of the real Universe.
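That workflow, simulate universes with known parameters, compress each to a handful of summary numbers, then train a model to run the mapping backwards, can be sketched in a few lines. The toy below is a deliberately crude stand-in: the "summary statistic" is invented and a random forest replaces the team's neural network, but the simulate-train-estimate structure is the same.

```python
# Toy sketch of the simulate-then-train parameter-inference workflow.
# Everything here (the fake summary statistic, the forest) is illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

def simulate_summary(omega_m, sigma_8, n_gal=5000):
    """Stand-in 'universe': a mock density field whose statistics
    depend on the two input parameters."""
    field = rng.gamma(shape=2.0 / sigma_8, scale=sigma_8 * omega_m, size=n_gal)
    return [field.mean(), field.std(), np.percentile(field, 90)]

# 1. Generate a suite of simulations with known parameters.
params = rng.uniform([0.1, 0.6], [0.5, 1.0], size=(2000, 2))  # (Omega_m, sigma_8)
summaries = np.array([simulate_summary(om, s8) for om, s8 in params])

# 2. Train a model to map summary statistics back to parameters.
model = RandomForestRegressor(n_estimators=200).fit(summaries, params)

# 3. "Observe" a universe with hidden parameters and estimate them.
truth = (0.31, 0.81)
observed = simulate_summary(*truth)
print("estimated (Omega_m, sigma_8):", model.predict([observed])[0])
print("true      (Omega_m, sigma_8):", truth)
```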

The parameters the team managed to fine-tune are those that describe how the Universe operates at the largest scales. They are, essentially, the settings of the Universe, and include the amounts of ordinary matter, dark matter and dark energy, the conditions following the Big Bang, and just how clumpy matter is. Previously these settings were calculated from observations of large-scale structure such as galaxy clusters. Arriving at a more accurate set would require observations down to smaller scales, which has not been possible.

The full-sky image of the temperature fluctuations (shown as color differences) in the cosmic microwave background, made from nine years of WMAP observations. These are the seeds of galaxies, from a time when the universe was under 400,000 years old. Credit: NASA/WMAP

Instead of new observations, the team used their AI approach to extract the small-scale information hidden in the existing observational data. At the heart of the approach was an AI system that learned to correlate the parameters with the observed structure of the Universe at small scales.

In the future, the team hopes to use the new approach on other problems. The lingering uncertainty in the Hubble constant is one example where they hope AI can help pin down the value. Over the next few years, as observational data become more detailed, both the Hubble constant and the other settings of the Universe should become far better constrained, deepening our understanding of the Universe.

Source : Astrophysicists Use AI to Precisely Calculate Universe’s ‘Settings’

The post Estimating the Basic Settings of the Universe appeared first on Universe Today.



Dark Matter Could Have Driven the Growth of Early Supermassive Black Holes

The James Webb Space Telescope (JWST) keeps finding supermassive black holes (SMBHs) in the early Universe. They're in active galactic nuclei seen only 500 million years after the Big Bang, long before astronomers thought they could exist. What's going on?

Monster black holes like the ones at the hearts of galaxies take a very long time to grow so massive. They can start as smaller black holes that gobble up nearby stars and gas, or they can grow by merging with other supermassive black holes. That typically takes billions of years and a lot of material to build up to something as massive as the four-million-solar-mass black hole at the heart of our Milky Way Galaxy. It takes even longer for the really big ones, which contain tens of millions of solar masses or more.
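That timescale problem is easy to make concrete. If a black hole accretes continuously at the Eddington limit with the commonly assumed 10% radiative efficiency, its mass grows exponentially with an e-folding (Salpeter) time of roughly 50 million years, so a stellar-mass seed needs most of a billion years to reach a billion solar masses. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope: Eddington-limited growth of a black hole seed.
import math

SIGMA_T = 6.652e-29  # Thomson cross-section, m^2
C = 2.998e8          # speed of light, m/s
G = 6.674e-11        # m^3 kg^-1 s^-2
M_P = 1.673e-27      # proton mass, kg
YEAR = 3.156e7       # s

def growth_time_yr(m_seed, m_final, efficiency=0.1):
    """Time for M(t) = m_seed * exp(t / t_fold) to reach m_final.
    Masses can be in any consistent units; only their ratio matters."""
    t_fold = (efficiency / (1 - efficiency)) * SIGMA_T * C / (4 * math.pi * G * M_P)
    return t_fold * math.log(m_final / m_seed) / YEAR

t = growth_time_yr(m_seed=10, m_final=1e9)  # solar masses
print(f"10 -> 1e9 solar masses: {t / 1e6:.0f} Myr")  # ~900 Myr
# ...yet JWST sees billion-solar-mass SMBHs when the Universe was ~500 Myr old.
```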

A James Webb Telescope image shows the J0148 quasar circled in red. Two insets show, on top, the central supermassive black hole, and on bottom, the stellar emission from the host galaxy.

JWST has spotted many SMBHs that already appear “old” and massive less than a billion years after the Big Bang. It's not an observational fluke—they're really there.

“How surprising it has been to find a supermassive black hole with a billion solar masses when the universe itself is only half a billion years old,” said astrophysicist Alexander Kusenko, a professor of physics and astronomy at UCLA. “It’s like finding a modern car among dinosaur bones and wondering who built that car in the prehistoric times.”

Building Supermassive Black Holes in Ancient Times

So, what built SMBHs so early in cosmic history? One obvious process is the death of the first Population III stars, which began forming as soon as the infant Universe cooled enough for gas to coalesce. These stars were massive, essentially metal-free (containing no elements heavier than helium), and short-lived. When they died as supernovae, they left behind stellar-mass black holes. It's possible those early black holes merged and grew bigger.

Another suggestion is a so-called “gravo-thermal” collapse of self-interacting dark matter halos, in which heat flows out of the halo's core, causing it to contract in a runaway process. That can lead to collapse into a black hole, which could then have grown further.

Astronomers have also considered the participation of primordial black holes created in the moments after the Big Bang. These theoretical low-mass black holes could have formed under special conditions, when overdense regions of space collapsed quickly. But how SMBHs could form from primordial black holes isn't understood at the moment. So, is there another formation theory?

Primordial black holes, if they exist, could have formed by the collapse of overdense regions in the very early universe. Some theories suggest these played a role in forming supermassive black holes. Credit: M. Kawasaki, T.T. Yanagida.

This is where dark matter comes into play. Kusenko and his colleagues dug into the idea of dark matter-influenced collapse. They found that if dark matter decays, it can play a role in “corralling” a hydrogen gas cloud so that it does not fragment (as such clouds usually do). Eventually, that could lead to the relatively rapid formation of an SMBH. Since there is evidence of dark matter's influence in the early Universe, this could explain the monster black holes of the earliest epochs of cosmic history.

From Cloud to Black Hole Formation via Dark Matter?

Of course, the conditions have to be just right for this to happen. “How quickly the gas cools has a lot to do with the amount of molecular hydrogen,” said doctoral student Yifan Lu, the first author on a paper describing the dark matter idea. “Hydrogen atoms bonded together in a molecule dissipate energy when they encounter a loose hydrogen atom. The hydrogen molecules become cooling agents as they absorb thermal energy and radiate it away. Hydrogen clouds in the early universe had too much molecular hydrogen, and the gas cooled quickly and formed small halos instead of large clouds.”

Certain radiation can destroy molecular hydrogen, creating conditions that prevent cloud fragmentation. That radiation had to come from somewhere, and Lu and others suggest an interesting source in their paper. They state that there's a possible “parameter space” in which relic decaying particles could emit radiation that would spur the collapse. Among other things, they propose an “axion-like” dark matter particle whose decay spurs the eventual coalescence of a cloud of hydrogen into an SMBH.

Mysteries of Dark Matter and SMBH Need Answers

Dark matter itself is a mysterious “stuff” that makes up a very large part of the matter content of the Universe. We know about it from its gravitational effects on the objects we can see (made of so-called baryonic matter). The form dark matter takes isn't understood at all, however. It could be made of particles that slowly decay, or it could comprise more than one particle species, some stable and others decaying at early times. In either case, the decay products could include photons, which break up molecular hydrogen and prevent hydrogen clouds from cooling too quickly. Even a very mild rate of dark matter decay could yield enough radiation to prevent cooling, allowing large clouds to form and, eventually, supermassive black holes.

Of course, this idea hasn’t been proven. However, the team points out that the decay of such particles of dark matter can emit light in both the optical and ultraviolet spectrum. That might explain the very precise measurements of the “cosmic optical background” (COB) seen by the New Horizons LORRI instrument. The COB is a visible light background roughly analogous to the cosmic microwave background. Think of it as the sum of all emissions from objects beyond the Milky Way Galaxy. Its presence allows astronomers to diagnose and understand the emissions from all astrophysical objects. There’s still a lot to study and understand about these possible axions (if they make up dark matter).

For More Information

Dark Matter Could Have Helped Make Supermassive Black Holes in the Early Universe
Direct Collapse Supermassive Black Holes from Relic Particle Decay
Pre-print of Paper

The post Dark Matter Could Have Driven the Growth of Early Supermassive Black Holes appeared first on Universe Today.



Friday, August 30, 2024

If Gravitons Exist, this Experiment Might Find Them

There are four fundamental forces in the Universe: the strong and weak nuclear forces, electromagnetism, and gravity. Quantum theory explains three of the four through the interaction of particles, but science has yet to discover a corresponding particle for gravity. Known as the “graviton”, this hypothetical particle is thought to constitute gravitational waves, yet it has never been detected by a gravitational wave detector. A new experiment hopes to change that, using an acoustic resonator to identify individual gravitons and confirm their existence.

The four fundamental forces of nature govern the Universe. Gravity is the one most people are familiar with, yet we still do not fully understand how it works. Its effects are obvious, though, as the attraction between objects with mass: it keeps the planets in orbit around the Sun, the Moon in orbit around the Earth, and us pinned to the surface of our planet. One of the earliest attempts to describe it came from Isaac Newton, who stated that the gravitational attraction between two objects is proportional to the product of their masses and inversely proportional to the square of the distance between them. Even at the largest scales of the cosmos, gravity seems essential to the structure of the Universe.

Portrait of Newton in 1702, painted by Godfrey Kneller. Credit: National Portrait Gallery, London

One of the challenges with gravity is that, unlike the other fundamental forces, it can so far only be explained classically. Quantum physics explains the other three forces by way of particles: the electromagnetic force has the photon, the strong nuclear force has the gluon, and the weak nuclear force has the W and Z bosons, but gravity has nothing yet, other than the hypothesised graviton. The graviton can be thought of as the building block of gravity, much as bricks are the building blocks of a house or atoms the building blocks of matter.

Detectors like LIGO, the Laser Interferometer Gravitational-Wave Observatory, can detect gravitational waves from large-scale events like mergers of black holes and neutron stars, yet to date a graviton has never been detected. That may soon change. A team of researchers led by physics professor Igor Pikovski of the Stevens Institute of Technology suggests a new solution: take existing detection technology, essentially a heavy cylinder known as an acoustic resonator, and add improved energy-state detection methods known as quantum sensing.

LIGO Observatory

The proposed approach, explains Pikovski, “is similar to the photo-electric effect that led Einstein to the quantum theory of light, just with gravitational waves replacing electromagnetic waves.” The key is the discrete steps of energy exchanged between the material and the waves as single gravitons are absorbed. The team would use LIGO to confirm gravitational wave detections and cross-reference them with their own data.
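The scale of the challenge is easy to quantify: a single graviton in a LIGO-band wave carries an energy of E = hf, which is swamped by ordinary thermal vibrations unless the detector sits near its quantum ground state. A quick illustrative calculation (not taken from the team's paper):

```python
# How small is one graviton's energy at gravitational-wave frequencies?
H = 6.626e-34    # Planck constant, J s
K_B = 1.381e-23  # Boltzmann constant, J/K

f = 100.0                  # Hz, a typical LIGO-band frequency
e_graviton = H * f         # one quantum of energy, E = h*f
print(f"E_graviton ~ {e_graviton:.1e} J")  # ~6.6e-32 J

# Temperature at which a thermal quantum carries comparable energy:
print(f"equivalent temperature ~ {e_graviton / K_B:.1e} K")  # ~5e-9 K
# Hence the need for ground-state cooling and quantum-limited energy sensing.
```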

The approach was inspired by gravitational wave events already measured on Earth, such as the 2017 signal from a collision between two city-sized but super-dense neutron stars. Using that data, the team calculated the parameters that would optimise the absorption probability for a single graviton. Their analysis pointed to devices similar to the Weber bar, thick, heavy cylindrical bars weighing around a ton, as suitable graviton detectors.

The bars would be suspended in the newly designed quantum detector and cooled to the lowest possible energy state, so that the passage of a gravitational wave would set them vibrating. The team then hopes to measure the vibrations with super-sensitive energy detectors, watching for changes in discrete steps that would indicate a graviton event.

It's an exciting time for gravitational physics, and we are most definitely getting closer to unravelling gravity's mysteries. The required super-sensitive detectors are not available yet, but according to Pikovski's team they are not far away. Pikovski summed it up: “We know that quantum gravity is still unsolved, and it’s too hard to test it in its full glory but we can now take the first steps, just as scientists did over a hundred years ago with quanta of light.”

Source : New research suggests a way to capture physicists’ most wanted particle — the graviton

The post If Gravitons Exist, this Experiment Might Find Them appeared first on Universe Today.



How Vegetation Could Impact the Climate of Exoplanets

The term ‘habitable zone’ is a broad definition that serves a purpose in our age of exoplanet discovery. But the more we learn about exoplanets, the more we need a nuanced definition of habitability.

New research shows that vegetation can enlarge the habitable zone on any exoplanets that host plant life.

Every object in a solar system has an albedo, a measure of how much starlight it reflects back into space. In our Solar System, Saturn's moon Enceladus has the highest albedo because of its smooth, frozen surface: about 0.99, meaning about 99% of the Sun's energy that reaches it is reflected back into space.

There are many dark objects in space with low albedos; the dark hemisphere of another of Saturn's moons, Iapetus, is often cited as having one of the lowest.

Earth, the only planet known to host life, has an albedo of about 0.30, meaning it reflects 30% of the sunlight that reaches it back into space. Many factors affect the albedo: the amount of ice cover, clouds in the atmosphere, the balance of land and ocean, and even vegetation.

This image made of satellite data shows the regions of Earth covered by forests with trees at least five meters (16.5 ft.) tall. Image Credit: NASA/LandSat

We live in an age of exoplanet discovery. We now know of more than 5,000 confirmed exoplanets, with many more on the way. Though all planets are interesting scientifically, we’re particularly interested in exoplanets that are potentially habitable.

A team of Italian researchers is examining exoplanet habitability through the lens of vegetation and albedo. Their work is in a paper to be published in the Monthly Notices of the Royal Astronomical Society titled “Impact of vegetation albedo on the habitability of Earth-like exoplanets.” The lead author is Erica Bisesi, a Postdoctoral Researcher at the Italian National Institute for Astrophysics’ Trieste Astronomical Observatory.

“Vegetation can modify the planetary surface albedo via the Charney mechanism, as plants are usually darker than the bare surface of the continents,” the researchers write in their paper. Compared to a dead planet with bare continents, an exoplanet with vegetation cover should be warmer if they’re both the same distance from similar stars.

The Charney mechanism is named after Jule Charney, an American meteorologist considered by many to be the father of modern meteorology. It describes a feedback loop in which vegetation cover alters the surface albedo, which in turn affects rainfall.

In their work, the researchers updated the Earth-like Surface Temperature Model to include two types of dynamically competing vegetation: grasslands and forests, with forests included in the seedling and mature stages.

“With respect to a world with bare granite continents, the effect of vegetation-albedo feedback is to increase the average surface temperature,” the authors explain. “Since grasses and trees exhibit different albedos, they affect temperature to different degrees.”

On Earth, grasslands are found on every continent except Antarctica. They’re one of the largest biomes on Earth. Image Credit: NASA Earth Observatory

Since grasses and trees affect albedo differently, vegetation’s effect on planetary albedo is linked to the outcome of their dynamic competition. “The change in albedo due to vegetation extends the habitable zone and enhances the overall planetary habitability beyond its traditional outer edge,” the authors write.
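A zero-dimensional energy-balance estimate shows why darker surfaces push the zone outward. The sketch below uses assumed, typical albedo values for bare rock, grassland, and forest, and ignores greenhouse warming and everything else in the authors' full climate model:

```python
# Equilibrium temperature vs. surface albedo (no atmosphere/greenhouse).
# Albedo values are assumed typical figures, not the paper's inputs.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant at 1 au, W/m^2

def t_equilibrium(albedo, flux=S0):
    """Global-mean blackbody equilibrium temperature, in kelvin."""
    return (flux * (1 - albedo) / (4 * SIGMA)) ** 0.25

for label, a in [("bare rock (A=0.30)", 0.30),
                 ("grassland (A=0.25)", 0.25),
                 ("forest    (A=0.15)", 0.15)]:
    print(f"{label}: T_eq = {t_equilibrium(a):.1f} K")
# Dropping the albedo from 0.30 to 0.15 warms the planet by roughly 13 K.
```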

The researchers considered four situations:

  • Complete tree dominance (forest worlds).
  • Complete grass dominance (grassland worlds).
  • Tree/Grass coexistence.
  • Bi-directional worlds.

In a bi-directional world, vegetation converges to grassland or to forest, depending on the initial vegetation fractions. In these worlds, seed propagation across latitudes widens the region where forests and grasslands coexist.

The researchers found that vegetation cover lowers a planet's albedo and warms the climate, nudging the outer limit of the habitable zone outward. However, they also arrived at more specific results.

They found that the outcome of dynamic competition between trees and grasses affected how vegetation is distributed across latitudes. “The achieved temperature-vegetation state is not imposed, but it emerges from the dynamics of the vegetation-climate system,” they explain.

This figure from the research shows how Earth’s liquid water habitability index is shifted outward by different vegetation regimes. It’s based on Earth’s modern distribution of continents. Image Credit: Bisesi et al. 2024.

The researchers worked with the idea of a ‘pseudo-Earth.’ The pseudo-Earth has a constant fraction of oceans at all bands of latitude, affecting the distribution of continents and vegetated surfaces relative to the equator, where most of the Sun’s energy strikes the planet.

This figure from the research shows how a pseudo-Earth’s liquid water habitability index is shifted outward by different vegetation regimes. It’s based on an equal distribution of oceans at all bands of latitude. Image Credit: Bisesi et al. 2024.

The researchers also worked with a hypothetical dry pseudo-Earth, whose ocean cover is limited to 30%, whereas Earth and the standard pseudo-Earth both have about 70% ocean cover.

The simulated dry pseudo-Earth has less ocean coverage than Earth, meaning there’s more surface area for vegetation to cover. Image Credit: Bisesi et al. 2024.

The team reached some conclusions about vegetation cover, albedo, and habitability.

The more continents a planet has, the greater the climate warming effect from vegetation. When the simulations resulted in a grass-dominated world, the effect was weaker because grass raises albedo. When the simulations resulted in a forest-dominated world, the effect was greater.

The researchers’ key point is that none of this is static. Outcomes are driven by the competition between grasslands and forests for resources, which in turn is driven by the average temperature in each latitudinal band. “In general, thus, the achieved temperature-vegetation state is not imposed, but it emerges from the dynamics of the vegetation-climate system,” they explain.

This is especially pronounced on the dry pseudo-Earth. Because there is so much land cover, vegetation has an even stronger effect on albedo and climate. “However, the ocean fraction cannot be too small, as in this case, the whole hydrological cycle could be modified,” the researchers add.

Overall, vegetation’s effect on albedo and climate is small. But we can’t dismiss its effect on habitability. Habitability is determined by a myriad of factors.

This issue is very complex. For instance, on a planet where grasslands and forests coexist, external factors like stellar luminosity and orbital variations can be buffered, depending on where the continents sit and how strongly their vegetation affects the albedo.

The authors consider their work a basic first step. They included only certain types of grasslands and forests, and did not account for the relative availability of water or for atmospheric CO2 concentrations.

“The dynamics explored here are extremely simplified and represent only a first step in the analysis of vegetation habitability interactions,” they write. “Future work will also include a simplified carbon balance model in the study of planetary habitability.”

“This endeavour should be seen as a first step of a research program aimed at including the main climate-vegetation feedbacks known for Earth in exoplanetary habitability assessments,” they write.

The post How Vegetation Could Impact the Climate of Exoplanets appeared first on Universe Today.



A New Test Proves How to Make the Event Horizon Telescope Even Better

Want a clear view of a supermassive black hole’s environment? It’s an incredible observational challenge. The extreme gravity bends light as it passes through and blurs the details of the event horizon, the region closest to the black hole. Astronomers using the Event Horizon Telescope (EHT) just conducted test observations aimed at “deblurring” that view.

The EHT team collaborated with scientists at the Atacama Large Millimeter/submillimeter Array (ALMA) and other facilities to do the tests. The antennas detected light from the centers of distant galaxies at a radio frequency of 354 GHz, equivalent to a wavelength of 0.87 mm.

A map of the Event Horizon Telescope observatories used in recent test observations at 0.87 mm of distant galaxies, to bump up its resolution. Credit: ESO/M. Kornmesser

This pilot experiment achieved observations with detail as fine as 19 microarcseconds, the highest resolution ever achieved from Earth's surface. Although the tests produced no images, the observations “saw” strong light signals from several distant galaxies, and that was using only a few antennas. Once the team focuses the full worldwide EHT array on targets, they should be able to see objects at a resolution of 13 microarcseconds. That's about like spotting a bottle cap on the surface of the Moon from Earth's surface!

Sharpening the Event Horizon Telescope

These observational tests are a big breakthrough because they mean scientists can make images of black holes that are 50% sharper than previous observations. The EHT's groundbreaking first observations of M87's black hole and of Sagittarius A* in our galaxy were made just a few years ago, at a wavelength of 1.3 mm. Those images were amazing, but the science teams wanted to do better.

“With the Event Horizon Telescope, we saw the first images of black holes using the 1.3-mm wavelength observations, but the bright ring we saw, formed by light bending in the black hole’s gravity, still looked blurry because we were at the absolute limits of how sharp we could make the images,” said the observation’s co-lead Alexander Raymond of the Jet Propulsion Laboratory. “At 0.87 mm, our images will be sharper and more detailed, which in turn will likely reveal new properties, both those that were previously predicted and maybe some that weren’t.”

The first actual image of a black hole, released in 2019. This shows the black hole at the heart of galaxy M87. Credit: Event Horizon Telescope Collaboration

According to EHT Founding Director Sheperd “Shep” Doeleman, an astrophysicist at the CfA and co-lead on a recent paper about the observations, the recent tests will improve the view of our galaxy’s central supermassive black hole, as well as others. “Looking at changes in the surrounding gas at different wavelengths will help us solve the mystery of how black holes attract and accrete matter, and how they can launch powerful jets that stream over galactic distances,” he said. In addition, the new technique should reveal even more dim, distant black holes than the EHT has already seen.

Creating a Big Radio Eye to Study Black Holes

Think of the Event Horizon Telescope as a giant, Earth-sized virtual radio telescope. Instead of one massive dish the size of our planet, it links together multiple radio dishes across the globe using a technique called “very long baseline interferometry” (VLBI), with each observatory sending its data to a central processing center. For this test, the array consisted of six facilities, including ALMA. The experiment succeeded in extending the EHT's wavelength range. Usually, to get better resolution, astronomers have to build bigger telescopes, but this one is already Earth-sized, so shortening the observing wavelength was the only option.
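The quoted resolutions follow almost directly from the diffraction limit, θ ≈ λ/D, with the baseline D capped by the diameter of Earth. A quick illustrative check (exact figures depend on the real baselines used):

```python
# Diffraction-limited resolution of an Earth-sized interferometer.
import math

RAD_TO_UAS = 180 / math.pi * 3600 * 1e6  # radians -> microarcseconds

def resolution_uas(wavelength_m, baseline_m):
    """Approximate angular resolution, theta ~ lambda / D."""
    return wavelength_m / baseline_m * RAD_TO_UAS

D_EARTH = 1.274e7  # maximum Earth-diameter baseline, m
print(f"1.30 mm: {resolution_uas(1.30e-3, D_EARTH):.0f} microarcseconds")  # ~21
print(f"0.87 mm: {resolution_uas(0.87e-3, D_EARTH):.0f} microarcseconds")  # ~14
```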

The current locations of observatories that make up the full Event Horizon Telescope. (Courtesy EHT)

The test observations mark the first time the VLBI technique has been used successfully at a wavelength of 0.87 mm. It's a challenging measurement to make because water vapor in the atmosphere absorbs far more radiation at 0.87 mm than at 1.3 mm. To compensate, astronomers increased the bandwidth of the EHT's instrumentation and then waited for good observing conditions at all the test sites.

The improvements should allow astronomers to get high-fidelity “movies” of the event horizon around a black hole. Of course, astronomers want more upgrades to the existing EHT arrays. Planned improvements include new antennas, as well as improvements to detectors and other instruments. The result should be some pretty spectacular images and animations of material trapped in the extreme gravitational clutch of a black hole.

Revisiting Old Black Hole Friends

Future observations will include return observations of the supermassive black holes in M87 and the heart of the Milky Way Galaxy. Both are surrounded by accretion disks full of material swirling into the black hole. Once that material crosses the event horizon (the gravitational point of no return), it’s gone forever. So, it’s important to track that kind of action around a black hole. That’s where the EHT comes in handy.

Researchers using the Event Horizon Telescope hope to generate more and better images like this of supermassive black hole Sgr A*'s event horizon. Image Credit: EHT.

According to Shep Doeleman, the details should be amazing. “Consider the burst of extra detail you get going from black and white photos to color,” he said. “This new ‘color vision’ allows us to tease apart the effects of Einstein’s gravity from the hot gas and magnetic fields that feed the black holes and launch powerful jets that stream over galactic distances.”

With this in mind, he added that the Collaboration is excited to reimage M87* and Sgr A* at both 1.3 mm and 0.87 mm, moving from detecting black hole “shadows” to precisely measuring their sizes and shapes, which can help estimate a black hole's spin and orientation on the sky.

If all that happens as they hope, the 400-member EHT consortium will certainly be able to fulfill its founding aim. That’s to provide the most detailed radio images of the mysterious beasts that lurk in the hearts of most galaxies.

For More Information

EHT Scientists Make Highest-resolution Observations Yet from the Surface of Earth
Event Horizon Telescope Main Page
First Very Long Baseline Interferometry Detections at 870 µm

The post A New Test Proves How to Make the Event Horizon Telescope Even Better appeared first on Universe Today.



Thursday, August 29, 2024

NASA's New Solar Sail Extends Its Booms and Sets Sail

Solar sails are an exciting way to travel through the Solar System because they get their propulsion from the Sun. NASA has developed several solar sails, and its newest, the Advanced Composite Solar Sail System (ACS3), launched a few months ago into low-Earth orbit. After initial testing, NASA reported today that the spacecraft has extended its booms, deploying its 80-square-meter (860-square-foot) solar sail. The team will now use the sail to raise and lower the spacecraft's orbit, learning more about solar sailing.

“The Sun will continue burning for billions of years, so we have a limitless source of propulsion. Instead of launching massive fuel tanks for future missions, we can launch larger sails that use ‘fuel’ already available,” said Alan Rhodes, the mission’s lead systems engineer at NASA’s Ames Research Center, earlier this year. “We will demonstrate a system that uses this abundant resource to take those next giant steps in exploration and science.”

And for all you skywatchers out there: NASA says that given the reflectivity of the large sail and its orbit about 1,000 km (600 miles) above Earth, ACS3 should at times be easily visible in the night sky. The Heavens Above website already lists ACS3 (just enter your location to see when the solar sail will pass over your area). There should also be info and updates on social media, so follow NASA.gov and @NASAAmes on X and Instagram.

ACS3 is part of NASA's Small Spacecraft Technology program, which has the objective of rapidly deploying small missions that demonstrate unique capabilities. ACS3 launched in April 2024 aboard Rocket Lab's Electron rocket from New Zealand. The spacecraft is a twelve-unit (12U) CubeSat built by NanoAvionics that's about the size of a microwave oven. The biggest challenge was designing and creating lightweight booms small enough to fit inside the spacecraft, able to extend to about 9 meters (30 ft) per side, and strong enough to support the solar sail. The lightweight but strong composite carbon-fiber boom system unrolled from the spacecraft to form rigid tubes that support the ultra-thin, reflective polymer sail.

This video shows how the booms work and the sail deploys:

When fully deployed, the sail forms a square about half the size of a tennis court. To change direction, the spacecraft angles its sail. Now that the booms are deployed, the ACS3 team will perform maneuvers with the spacecraft, angling the sail to change the spacecraft's orbit.

The primary goal of the mission was to demonstrate boom deployment. With that now successfully achieved, the ACS3 team also hopes the mission will prove that this solar sail design can work for future sail-equipped science and exploration missions.

This image shows the ACS3 sail being unfurled at NASA's Langley Research Center. Sunlight provides a steady but very gentle push, so a sail needs a large area to propel a spacecraft effectively. The ACS3 sail is about 9 meters (30 ft) per side, requiring a strong, lightweight boom system. Image Credit: NASA

Since ACS3 is a demonstration mission, the longer-term goal is to build larger sails that can generate more thrust. With these unique composite carbon-fiber booms, the ACS3 system has the potential to support sails as large as 2,000 square meters (about 21,500 square feet), or about half the area of a soccer field.
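Sunlight's push is tiny, which is why sail area matters so much. For a perfectly reflecting sail facing the Sun at Earth's distance, the force is about twice the solar flux times the sail area divided by the speed of light. A quick sketch with those idealized assumptions (illustrative, not NASA's figures):

```python
# Photon-pressure force on an ideal, perfectly reflecting sail at 1 au.
S0 = 1361.0  # solar constant, W/m^2
C = 2.998e8  # speed of light, m/s

def sail_force_newtons(area_m2, flux=S0):
    """Factor of 2 assumes full reflection at normal incidence."""
    return 2 * flux * area_m2 / C

for area in (80, 2000):  # ACS3 today vs. the 2,000 m^2 goal
    print(f"{area:>5} m^2 sail: {sail_force_newtons(area) * 1000:.1f} mN")
# ~0.7 mN for ACS3: tiny, but continuous and completely fuel-free.
```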

“The hope is that the new technologies verified on this spacecraft will inspire others to use them in ways we haven’t even considered,” Rhodes said.

And look for photos of ACS3's fully deployed sail next week. The spacecraft has four cameras that captured a panoramic view of the reflective sail and its supporting composite booms. NASA says high-resolution imagery from these cameras will be available on Wednesday, Sept. 4.

NASA is providing updates on this mission on their Small Satellite Missions blog page.

The post NASA's New Solar Sail Extends Its Booms and Sets Sail appeared first on Universe Today.



Webb Discovers Six New “Rogue Worlds” that Provide Clues to Star Formation

Rogue Planets, or free-floating planetary-mass objects (FFPMOs), are planet-sized objects that either formed in interstellar space or were part of a planetary system before gravitational perturbations kicked them out. Since they were first observed in 2000, astronomers have detected hundreds of candidates that are untethered to any particular star and float through the interstellar medium (ISM) of our galaxy. In fact, some scientists estimate that there could be as many as 2 trillion rogue planets (or more!) wandering through the Milky Way alone.

In recent news, a team of astronomers working with the James Webb Space Telescope (JWST) announced the discovery of six rogue planet candidates in an unlikely spot. The candidates, which include the lightest rogue planet ever identified with a debris disk around it, were spotted during Webb's deepest survey of the young nebula NGC 1333, a star-forming cluster about a thousand light-years away in the constellation Perseus. These planets could teach astronomers a great deal about how stars and planets form.

The team was led by Adam Langeveld, an Assistant Research Scientist in the Department of Physics and Astronomy at Johns Hopkins University (JHU). He was joined by colleagues from the Carl Sagan Institute, the Instituto de Astrofísica e Ciências do Espaço, the Trottier Institute for Research on Exoplanets, the Mont Mégantic Observatory, the Herzberg Astronomy and Astrophysics Research Centre, the University of Texas at Austin, the University of Victoria, and the Scottish Universities Physics Alliance (SUPA) at the University of St Andrews. The paper detailing the survey's findings has been accepted for publication in The Astronomical Journal.

Most of the rogue planets detected to date were discovered using gravitational microlensing, while others were found via direct imaging. The former relies on “lensing events,” in which the gravity of a massive object curves spacetime around it and amplifies the light of more distant objects. The latter consists of spotting brown dwarfs (objects that straddle the line between planets and stars) and massive planets directly, by detecting the infrared radiation produced within their atmospheres.

In their paper, the team describes how the discovery occurred during an extremely deep spectroscopic survey of NGC 1333. Using data from Webb's Near-Infrared Imager and Slitless Spectrograph (NIRISS), the team measured the spectrum of every object in the observed portion of the star cluster. This allowed them to reanalyze spectra from 19 previously observed brown dwarfs and led to the discovery of a new brown dwarf with a planetary-mass companion, a rare find that already challenges theories of how binary systems form. But the real kicker was the detection of six planets with 5-10 times the mass of Jupiter (aka super-Jupiters).

This makes the six candidates among the lowest-mass rogue planets ever found to have formed through the same process as brown dwarfs and stars. Investigating objects not quite massive enough to become stars was the purpose of the Deep Spectroscopic Survey for Young Brown Dwarfs and Free-Floating Planets. The fact that Webb's observations revealed no objects below five Jupiter masses, which it is sensitive enough to detect, is a strong indication that stellar objects lighter than that are more likely to form the way planets do.

Said lead author Langeveld in a statement released by JHU's news outlet (the Hub):

“We are probing the very limits of the star-forming process. If you have an object that looks like a young Jupiter, is it possible that it could have become a star under the right conditions? This is important context for understanding both star and planet formation.”

New wide-field view mosaic from the James Webb Space Telescope spectroscopic survey of NGC1333 with three of the newly discovered free-floating planetary-mass objects indicated by green markers. Credit: ESA/Webb, NASA & CSA, A. Scholz, K. Muzic, A. Langeveld, R. Jayawardhana

The most intriguing of the rogue planets was also the lightest: an estimated five Jupiter masses (about 1,600 Earths). Since dust and gas generally settle into a disk during the early stages of star formation, the presence of a debris disk around this planet strongly suggests that it formed the way stars do. However, planetary systems also form from such disks (aka circumstellar disks), which suggests these objects may be able to form their own satellites. Each of these massive planets could thus host a nursery for a miniature planetary system, like our Solar System but on a much smaller scale.

Said Johns Hopkins Provost Ray Jayawardhana, an astrophysicist and senior author of the study (who also leads the survey group):

“It turns out the smallest free-floating objects that form like stars overlap in mass with giant exoplanets circling nearby stars. It’s likely that such a pair formed the way binary star systems do, from a cloud fragmenting as it contracted. The diversity of systems that nature has produced is remarkable and pushes us to refine our models of star and planet formation…

“Our observations confirm that nature produces planetary mass objects in at least two different ways—from the contraction of a cloud of gas and dust, the way stars form, and in disks of gas and dust around young stars, as Jupiter in our own solar system did.”

In the coming months, the team plans to use Webb to conduct follow-up studies of these rogue planets’ atmospheres and compare them to those of brown dwarfs and gas giants. They also plan to search the star-forming region for other objects with debris disks to investigate the possibility of mini-planetary systems. The data they obtain will also help astronomers refine their estimates on the number of rogue planets in our galaxy. The new Webb observations indicate that such bodies account for about 10% of celestial bodies in the targeted cluster.

Current estimates place the number of stars in our galaxy between 100 and 400 billion, and the number of planets between 800 billion and 3.2 trillion. If rogue worlds make up 10% of those bodies, there could be anywhere from 90 to 360 billion of them floating out there. As we have explored in previous articles, we might be able to explore some of them someday, and our Sun may even capture a few!

Further Reading: HUB

The post Webb Discovers Six New “Rogue Worlds” that Provide Clues to Star Formation appeared first on Universe Today.



A NASA Rocket Has Finally Found Earth’s Global Electric Field

Scientists have discovered that Earth has a third field. We all know about the Earth’s magnetic field. And we all know about Earth’s gravity field, though we usually just call it gravity.

Now, a team of international scientists have found Earth’s global electric field.

It’s called the ambipolar electric field, and it’s a weak electric field that surrounds the planet. It’s responsible for the polar wind, which was first detected decades ago. The polar wind is an outflow of plasma from the polar regions of Earth’s magnetosphere. Scientists hypothesized the ambipolar field’s existence decades ago, and now they finally have proof.

The discovery is in a new article in Nature titled “Earth’s ambipolar electrostatic field and its role in ion escape to space.” The lead author is Glyn Collinson from the Heliophysics Science Division at NASA Goddard Space Flight Center.

“It’s like this conveyor belt, lifting the atmosphere up into space.”

Glyn Collinson, Heliophysics Science Division, NASA Goddard Space Flight Center

The Space Age gained momentum back in the 1960s as the USA and USSR launched more and more satellites. When spacecraft passed over the Earth’s poles, they detected an outflow of particles from Earth’s atmosphere into space. Scientists named this the polar wind, but for decades, it was mysterious.

Scientists expect some particles from Earth's atmosphere to “leak” into space. Sunlight can cause this, but particles lost that way should be heated. The polar wind is mysterious because many of its particles are cold despite moving at supersonic speeds.

“Something had to be drawing these particles out of the atmosphere,” said lead author Collinson.

Collinson is also the Principal Investigator for NASA’s “Endurance” Sounding Rocket Mission. “The purpose of the Endurance mission was to make the first measurement of the magnitude and structure of the electric field generated by Earth’s ionosphere,” NASA writes in their mission description. Endurance launched on May 22nd, 2022, from Norway’s Svalbard Archipelago.

This image shows NASA's Endurance rocket launching from Ny-Ålesund, Svalbard, Norway. It flew for 19 minutes to an altitude of about 780 km (484 mi) above Earth's sunlit polar cap. It carried six science instruments and could only be launched in certain conditions to be successful. Image Credit: NASA/Brian Bonsteel.

“Svalbard is the only rocket range in the world where you can fly through the polar wind and make the measurements we needed,” said Suzie Imber, a space physicist at the University of Leicester, UK, and co-author of the paper.

Svalbard is key because there are open magnetic field lines above Earth’s polar caps. These field lines provide a pathway for ions to outflow to the magnetosphere.

This figure from the research shows Endurance's flight profile and its path over Earth. The rocket had to fly near the open magnetic field lines that exist at Svalbard's high polar latitudes. Image Credit: Collinson et al. 2024.

After it was launched, Collinson said, “We got fabulous data all through the flight, though it will be a while before we can really dig into it to see if we achieved our science objective or not.”

Now, the data is in, and the results show that Earth has a global electric field.

Prior to its discovery, scientists hypothesized that the field was weak and that its effects could only be felt over hundreds of kilometres. Even though it was first proposed 60 years ago, scientists had to wait for technology to advance before they could measure it. In 2016, Collinson and his colleagues began inventing a new instrument that could measure the elusive field.

At about 250 km (150 mi) above Earth's surface, atoms break apart into negatively charged electrons and positively charged ions. Electrons are far lighter than ions, and the tiniest energetic jolt can send them off into space. Ions are more than 1,800 times heavier, and gravity draws them back toward the surface.

If gravity were the only force at work, the two populations would separate over time and simply drift apart. But that’s not what happens.

Electrons and ions have opposite electrical charges. They’re attracted to one another and an electric field forms that keeps them together. This counteracts some of gravity’s power.

The field is called ambipolar because it works in both directions at once. As ions sink under gravity, their electrical attraction drags some of the electrons down with them. At the same time, electrons attempting to escape into space lift ions higher into the atmosphere.


The result of all this is that the ambipolar field extends the atmosphere’s height, meaning some of the ions escape with the polar wind.

After decades of hypothesizing and theorizing, the Endurance rocket measured a change in electric potential of only 0.55 volts. That’s extremely weak but enough to be measurable.

“A half a volt is almost nothing — it’s only about as strong as a watch battery,” Collinson said. “But that’s just the right amount to explain the polar wind.”

Hydrogen ions are the most plentiful particles in the polar wind. Endurance's results show that these ions experience an outward force from the ambipolar electric field that's 10.6 times stronger than gravity. “That’s more than enough to counter gravity — in fact, it’s enough to launch them upwards into space at supersonic speeds,” said Alex Glocer, Endurance project scientist at NASA Goddard and co-author of the paper.
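You can get close to that figure with a rough energy comparison: the energy a hydrogen ion gains falling through the measured 0.55-volt potential versus the energy it needs to climb the same stretch of atmosphere against gravity. The altitude span below is an assumption based on the flight profile, so the result lands near, not exactly on, the paper's 10.6:

```python
# Rough estimate: ambipolar-field energy gain vs. gravity for a hydrogen ion.
E_CHARGE = 1.602e-19  # elementary charge, C
M_H = 1.673e-27       # hydrogen ion mass, kg
GM = 3.986e14         # Earth's gravitational parameter, m^3/s^2
R_E = 6.371e6         # Earth radius, m

delta_phi = 0.55                   # measured potential drop, volts
r1, r2 = R_E + 250e3, R_E + 768e3  # assumed measurement span, m

e_electric = E_CHARGE * delta_phi         # J gained from the field
e_gravity = GM * M_H * (1 / r1 - 1 / r2)  # J needed to climb r1 -> r2

print(f"electric: {e_electric:.2e} J, gravity: {e_gravity:.2e} J")
print(f"ratio ~ {e_electric / e_gravity:.0f}x gravity")  # order 10
```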

Hydrogen ions are light, but even the heavier particles in the polar wind are lifted. Oxygen ions in the weak electric field effectively weigh half as much, so they too are boosted to greater heights. Overall, the ambipolar field makes the ionosphere denser at high altitudes than it would be without this lofting effect. “It’s like this conveyor belt, lifting the atmosphere up into space,” Collinson added.

“The measurements support the hypothesis that the ambipolar electric field is the primary driver of ionospheric H+ outflow and of the supersonic polar wind of light ions escaping from the polar caps,” the authors explain in their paper.

“We infer that this increases the supply of cold O+ ions to the magnetosphere by more than 3,800%,” the authors write. At that point, other mechanisms come into play. Wave-particle interactions can heat the ions, accelerating them to escape velocity.

These results raise other questions. How does this field affect Earth? Has the field affected the planet’s habitability? Do other planets have these fields?

Back in 2016, the European Space Agency’s Venus Express mission detected a 10-volt electric potential surrounding the planet. This means that positively charged particles would be pulled away from the planet’s surface. This could draw away oxygen.

Scientists think that Venus may once have had plentiful water. However, since sunlight splits water into hydrogen and oxygen, the electric field could have siphoned the oxygen away, eliminating the planet's water. This is theoretical, but it raises the question of why the same thing hasn't happened on Earth.

The ambipolar field is fundamental to Earth, and though its influence on the evolution of the planet's atmosphere and biosphere is yet to be understood, it must play a part.

“Any planet with an atmosphere should have an ambipolar field,” Collinson said. “Now that we’ve finally measured it, we can begin learning how it’s shaped our planet as well as others over time.”

The post A NASA Rocket Has Finally Found Earth’s Global Electric Field appeared first on Universe Today.



What Type of Excavator Is Most Suitable for Asteroids?

Digging in the ground is so commonplace on Earth that we hardly ever think of it as hard. But doing so in space is an entirely different proposition. On larger worlds, like the Moon or Mars, it would be broadly similar to digging on Earth, but the "milligravity" on the millions of asteroids in our Solar System would make the experience quite different. Given the potential economic impact of asteroid mining, plenty of methods have been suggested for digging on an asteroid, and a team from the University of Arizona recently published the latest in a series of papers about using a customized bucket wheel to do so.

Bucket wheel designs have been gaining popularity in space mining more generally. NASA's ISRU Pilot Excavator (IPEx) uses a similar design and has been advanced to Technology Readiness Level 5, according to its latest yearly report. However, it was designed for use on the Moon, where gravity is significantly stronger than on the asteroids that hold vastly more valuable materials.

According to the paper, even the lowest 10% of asteroids have higher concentrations of platinum-group metals, such as palladium and osmium, than the Moon does. Asteroids are also much more "energy accessible": getting resources off an asteroid undergoing active mining requires a delta-V of only about 5% of that needed for the Moon. Since delta-V translates into propellant mass, and propellant mass into cost, the lower delta-V makes mining these tiny bodies much more economically attractive.
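The reason delta-V dominates the economics is the Tsiolkovsky rocket equation: required propellant grows exponentially with delta-V. A quick comparison, assuming a representative chemical-rocket exhaust velocity of 3 km/s:

```python
# Tsiolkovsky rocket equation: propellant fraction vs. required delta-v.
import math

def propellant_fraction(delta_v, v_exhaust=3000.0):
    """Fraction of initial mass that must be propellant."""
    return 1 - math.exp(-delta_v / v_exhaust)

dv_moon = 2380.0              # m/s, roughly lunar escape speed
dv_asteroid = 0.05 * dv_moon  # the paper's ~5% figure, ~120 m/s

print(f"Moon escape: {propellant_fraction(dv_moon) * 100:.0f}% propellant")     # ~55%
print(f"asteroid:    {propellant_fraction(dv_asteroid) * 100:.1f}% propellant") # ~4%
```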

This video, from nine years ago, shows how long the development path for asteroid mining technology is.

But they have their own engineering challenges to face. Most asteroids are known as “rubble piles,” meaning they are made up of clumps of rock simply stuck together by whatever minimal gravity their mass gives them. Even metal-rich M-type asteroids, such as Psyche, could be primarily composed of these small chunks of material. Such an environment would not be very hospitable to traditional mining techniques.

The University of Arizona researchers, led by Dr. Jekan Thangavelautham, have taken a rapid iteration approach to solving that problem. They developed a model representing the forces expected on the surface of an asteroid and applied those forces to models of different bucket wheel designs, selecting features that best suit the environment.

They also took the next step and began 3D printing prototypes of the different designs, intending to use them to collect physical data on the mechanics of excavation. To do so, however, they needed a realistic asteroid regolith simulant, which doesn't currently exist, so they decided to make their own. A combination of styrofoam and 3D-printed resin seemed to do the trick, though they have not yet been able to make enough simulant to thoroughly test the assembly planned for this paper.

Artist’s depiction of an implementation of a bucket wheel excavator
Credit – Hansen, Muniyasamy, & Thangavelautham

One of the paper's other important findings was the impact that characteristics of the asteroid itself would have on two of the most important design parameters: the bucket volume and the cutting velocity (i.e., how fast the buckets move). Some characteristics, such as resource concentration, had little impact on those two parameters. Others, such as density, had a major impact.

The research team found that high-volume, slow-moving buckets were ideal in this environment. Part of that consideration, however, was how quickly an orbiting support craft would fill up with excavated material. To increase the throughput of material from the bucket wheel to the storage system, the researchers suggest using a screw feeder, which would also allow the bucket wheel to operate continuously, another necessity given the economic constraints of the system.
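To get a feel for the bucket-volume-versus-cutting-velocity trade-off, note that an idealized bucket wheel's mass throughput is just the volume it scoops per second times the regolith density. The sketch below is a simplified sizing model with made-up parameter values, not the team's actual force model:

```python
def excavation_rate(bucket_volume_m3: float, buckets_per_wheel: int,
                    wheel_rpm: float, fill_factor: float,
                    regolith_density: float) -> float:
    """Idealized mass flow (kg/s) from a bucket wheel excavator."""
    scoops_per_second = buckets_per_wheel * wheel_rpm / 60.0
    return bucket_volume_m3 * fill_factor * scoops_per_second * regolith_density

# All values below are illustrative guesses, not figures from the paper.
rate = excavation_rate(bucket_volume_m3=0.002,     # "high-volume" buckets
                       buckets_per_wheel=8,
                       wheel_rpm=2.0,              # "slow-moving" wheel
                       fill_factor=0.5,            # buckets only half fill in milligravity
                       regolith_density=1500.0)    # kg/m^3, loose rubble-pile material
print(f"{rate:.2f} kg/s, or about {rate * 3600:.0f} kg per hour")
```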

Additionally, they found that claws were necessary to hold onto the regolith. An extensible tubing system is also a “nice-to-have,” though it becomes more necessary if there are many buckets per wheel.

Details of this work are contained in the paper, and the researchers gave an associated presentation at the ASCEND conference at the end of July. While these milestones are a step in the right direction, the technologies are still at a relatively low readiness level. They will eventually be needed, though, if humans are to utilize some of the most easily accessible resources in the solar system. As our expansion to other worlds picks up, it's only a matter of time before a bucket-wheel excavator lands on an asteroid and gets to work.

Learn More:
Hansen, Muniyasamy, & Thangavelautham – Modified Bucket Wheel Design and Mining Techniques for Asteroid Mining
UT – Heavy Construction on the Moon
UT – A Handy Attachment Could Make Lunar Construction a Breeze
UT – Robotic asteroid mining spacecraft wins a grant from NASA

Lead Image:
Artist’s depiction of NASA’s IPEx Bucket Excavator Robot.
Credit – NASA

The post What Type of Excavator Is Most Suitable for Asteroids? appeared first on Universe Today.



The Rubin Observatory Will Unleash a Flood of NEO Detections

After about 10 years of construction, the Vera Rubin Observatory (VRO) is scheduled to see its first light in January 2025. Once it’s up and running, it will begin its Legacy Survey of Space and Time (LSST), a decade-long effort to photograph the entire visible sky every few nights. It’ll study dark energy and dark matter, map the Milky Way, and detect transient astronomical events and small Solar System objects like Near Earth Objects (NEOs).

New research shows the LSST will detect about 130 NEOs per night in the first year of observations.

NEOs are small Solar System bodies, usually asteroids, whose orbits bring them within 1.3 astronomical units of the Sun. A NEO whose orbit crosses Earth's is considered a potentially hazardous object (PHO). NASA is currently cataloguing NEOs, and while it has made progress, there are many more left to find.

That figure comes from new research titled “Expected Impact of Rubin Observatory LSST on NEO Follow-up,” which is still in peer review but available on the preprint site arxiv.org. The lead author is Tom Wagg, a PhD student at the DiRAC Institute and the Department of Astronomy at the University of Washington in Seattle.

“We simulate and analyze the contribution of the Rubin Observatory Legacy Survey of Space and Time (LSST) to the rate of discovery of Near Earth Object (NEO) candidates,” the authors write. They also analyzed submission rates for the NEO Confirmation Page (NEOCP) and how that will affect the worldwide follow-up observation system for NEOs.

The problem with NEOs is that they don't necessarily stay harmless. A subset of them, about one-fifth, pass so close to Earth that even a small perturbation can send them onto a path intersecting Earth's orbit. These are sources of potentially catastrophic collisions. A further subset, called Potentially Hazardous Asteroids (PHAs), are massive enough to make it through Earth's atmosphere and strike the planet's surface. To be considered a PHA, an object has to be at least about 140 meters in diameter.

The Minor Planet Center maintains a database of NEOs, and more are being added constantly. New detections are recorded on the NEO confirmation page (NEOCP), but at first, they’re only candidates. Follow-up observations require resources to accurately determine a candidate’s orbit and size.

If the LSST contributes 130 more NEO detections each night, eight times the current detection rate, the survey will create an enormous amount of follow-up work. New detections are scored by a standard computer algorithm named digest2; objects that meet its criteria are listed as candidates, and confirming what a candidate actually is requires follow-up observations with other telescopes.

Illustration of a Near Earth Object. Credit: NASA/JPL-Caltech

But with so many more detections on the horizon, there could be problems.

“The aim of this paper is to quantify the impact of Rubin on the NEO follow-up community and consider possible strategies to mitigate this impact,” the authors write.

Most of the NEOs the LSST finds will be found using a method called “tracklet linking.” Tracklet linking is “a computational technique where at least three pairs of observations (“tracklets”) observed over a 15-night period are identified as belonging to the same object,” the authors explain. The problem is that the tracklet linking can take time and comes at a cost. “… the object is not identified as interesting until the third tracklet is imaged – at best, two nights after the first observation or, at worst, nearly two weeks later,” the authors write. This means that the system may miss interesting or hazardous objects until it’s too late to observe them for confirmation.
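As a toy illustration of that linking rule (not the actual Rubin pipeline, which first has to figure out which tracklets belong to the same object), here is the three-tracklets-within-15-nights test in a few lines of Python:

```python
from collections import defaultdict

# Toy tracklets as (object_id, night_observed) pairs; IDs are assumed known here.
tracklets = [("A", 1), ("A", 4), ("A", 12), ("B", 2), ("B", 20)]

nights_by_object = defaultdict(list)
for obj, night in tracklets:
    nights_by_object[obj].append(night)

for obj, nights in nights_by_object.items():
    nights.sort()
    # Linked if any three tracklets fall within a 15-night span.
    linked = any(nights[i + 2] - nights[i] <= 15 for i in range(len(nights) - 2))
    print(obj, "linked" if linked else "not linked")  # A: linked, B: not linked
```

None of this changes the underlying delay, though: the object only becomes interesting once the third tracklet arrives.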

With other telescopes, there’s a way around this. They can capture several back-to-back images of tracklets to create more robust detections that can be immediately followed up on. However, the VRO can’t do that because the LSST is an automated survey.

What it can do is serendipitously capture three or more tracklets in smaller sections of the sky where its observing fields will overlap. “Such tracklets could be immediately identified and, assuming they meet the digest2 score criteria, submitted to the Minor Planet Centre and included on the NEOCP,” the authors write. Because of the scale, the authors say this process could be automated and would require no human vetting.

The researchers simulated LSST detections to test their idea and see if it could reduce the follow-up observation workload. “We present an algorithm for predicting whether LSST will later re-detect an object given a single night of observations (therefore making community follow-up unnecessary),” they explain. They wanted to determine how effective it would be in reducing the number of objects that require follow-up observations.

They started by simulating almost 3600 days of the LSST, consisting of almost one billion observations.

This figure from the study shows the number of asteroids detected in one night of LSST simulations. About 350,000 asteroids were observed, including about 1,000 NEOs. The grey curved line represents the ecliptic. Image Credit: Wagg et al. 2024.

From their data, they selected observations that corresponded to tracklets. A single tracklet doesn't determine an orbit, but it can constrain the possible orbits when compared against known Solar System orbits. The digest2 algorithm works by comparing observed tracklets to a simulated catalogue of Solar System objects and estimating the probability that each object is a NEO.
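Here is a heavily simplified sketch of that idea: compare an observed tracklet to a synthetic population and report what fraction of the matching synthetic objects are NEOs. The real digest2 bins over many orbital and observational parameters; the single sky-motion parameter and the tiny population below are invented purely for illustration.

```python
# (sky motion in degrees/day, is_neo) -- an invented synthetic population.
synthetic_population = [
    (0.20, False), (0.25, False), (0.30, False), (1.10, False),
    (0.90, True), (1.50, True), (2.00, True),
]

def neo_score(observed_motion: float, tolerance: float = 0.5) -> float:
    """Fraction of matching synthetic objects that are NEOs (digest2-style idea)."""
    matches = [is_neo for motion, is_neo in synthetic_population
               if abs(motion - observed_motion) <= tolerance]
    return sum(matches) / len(matches) if matches else 0.0

# Faster-moving tracklets match more NEO-like synthetic objects and score higher.
print(f"score: {100 * neo_score(1.4):.0f} / 100")   # matches 0.90, 1.10, 1.50 -> 67
```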

This figure from the research shows the variant orbits computed for one simulated NEO tracklet. The white arrow indicates the initial sight line for the observation. The blue dotted line indicates the orbit of the Earth. The background stars are included for illustrative purposes only. Image Credit: Wagg et al. 2024.

At this point, the number of candidate NEOs is still overwhelming. The candidate population is not a high-purity sample and still contains non-NEOs like main-belt asteroids.

Most of the impurity comes from main-belt asteroids, and as those are recognized, the purity rises. The simulations show that purity climbs continually and levels off after about five months. Submission rates behave similarly: after about 150 nights, the submission rate reaches a steady state of about 95 per night.

The LSST repeatedly images the sky in overlapping fields. The researchers thought that if they could determine which tracklets were going to be re-observed by the LSST as it goes about its business, they could reduce the follow-up observation burden.

“If we could predict which objects will be followed up by LSST itself, this would reduce the load on the follow-up system and allow the community to focus on the ones that truly require external follow-up to be designated,” the authors explain. The researchers developed an algorithm for computing the ensemble of ranges and radial velocities of a single observed tracklet.
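Conceptually, that predictor can be sketched as a Monte Carlo experiment: sample orbits consistent with the single tracklet, propagate each through the survey's upcoming schedule, and count how often LSST would link the object on its own. Everything in the mock-up below, from the per-night hit probability to the decision threshold, is an illustrative stand-in rather than the authors' implementation.

```python
import random

random.seed(0)

def self_recovery_probability(n_orbit_samples: int = 2000) -> float:
    """Monte Carlo mock-up of 'will LSST re-detect this object unaided?'

    A real version would sample ranges/radial velocities consistent with the
    tracklet, propagate each orbit, and test it against the survey schedule;
    here a coin flip per remaining night stands in for that propagation.
    """
    recovered = 0
    for _ in range(n_orbit_samples):
        future_tracklets = sum(random.random() < 0.15   # mock per-night hit chance
                               for _ in range(14))      # nights left in the window
        if future_tracklets >= 2:   # two more tracklets complete the 3-tracklet link
            recovered += 1
    return recovered / n_orbit_samples

p = self_recovery_probability()
print(f"P(LSST recovers it unaided) ~ {p:.2f}")
print("needs external follow-up" if p < 0.5 else "leave it to LSST")
```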

“We now examine the effect of applying the LSST detection probability algorithm to reduce the load on the NEOCP,” the authors write. The following image illustrates this.

This figure from the research shows the estimated probability of detection by the algorithm and the number of objects, with the dotted black line being the threshold for confirmation. On the right is a contingency matrix with two Truth columns and two Prediction rows. All in all, it shows that the algorithm detected 180 NEOs, with 400 being sent for confirmation needlessly, as the LSST will have confirmed them. Lost objects are objects that have been de-prioritized for follow-up observations but won’t receive adequate follow-ups by the Rubin itself. Image Credit: Wagg et al. 2024.

Overall, the algorithm predicted the correct outcome 68% of the time. Also, about 64 of the objects submitted to the NEOCP per night would require external follow-up, but only around 8.3%, or about five, of those objects would be NEOs. The algorithm would only improve accuracy minimally, but it would reduce the follow-up workload by a factor of two.

The researchers say that other tweaks to the algorithm can improve it and make LSST NEO detections more robust without the need for so many demanding follow-up observations.

In their conclusion, the authors write, “LSST contributions will increase the nightly NEOCP submission rate by a factor of about 8 over the first year to an average of 129 objects per night.” The fraction of those that will be confirmed as NEOs is low, about 8.3%, but it will rise over time.

The LSST is expected to generate 200 petabytes of uncompressed data during its ten-year run, which is about 200 million gigabytes. This study shows that managing the amount of data that the LSST will generate requires new methods.

It may seem like a far-off concern, but understanding the threat NEOs pose to Earth is critical. While efforts are underway to figure out how to protect the planet from them, cataloguing them all remains an essential first step.

The post The Rubin Observatory Will Unleash a Flood of NEO Detections appeared first on Universe Today.



Wednesday, August 28, 2024

What if you Flew Your Warp Drive Spaceship into a Black Hole?

Warp drives have a long history of not existing, despite their ubiquitous presence in science fiction. Writer John W. Campbell first introduced the idea in a science fiction novel called Islands of Space. These days, thanks to Star Trek in particular, the term is very familiar, almost a generic reference for superluminal travel through hyperspace. Whether warp drive will ever exist is a physics problem that researchers are still trying to solve, but for now, it's theoretical.

Recently, two researchers looked at what would happen if a ship with warp drive tried to get into a black hole. The result is an interesting thought experiment. It might not lead to starship-sized warp drives but might allow scientists to create smaller versions someday.

NASA’s Eagleworks attempted to test the Alcubierre warp drive concept. Credit: 2012

Remo Garattini and Kirill Zatrimaylov theorized that such a drive could survive inside a so-called Schwarzschild black hole, provided the ship crosses the event horizon at a speed lower than that of light. Theoretically, the black hole's gravitational field would decrease the amount of negative energy required to keep the drive going. If so, the ship could pass through and somehow use the black hole to get somewhere else without being crushed. Furthermore, the mathematics behind this idea points the way toward the possible creation of mini warp drives in lab settings.

What’s a Warp Drive?

Could scientists build a micro- or mini-warp drive in the lab? Good question. To understand the team's work, let's look at the major players in this research: warp drives and black holes.

The idea is inspired by the fact that nothing can travel faster than light. Given the distances in space, traveling to the nearest star would take years even at light speed. Crossing a galaxy, or reaching more distant galaxies, would take many lifetimes. So, if you want to be a space-faring species, you must travel faster than light (FTL).

How would you do that? This is where warp drives come in. Theoretically, they allow you to put your spaceship inside a bubble that could slip through space at FTL speeds. That’s how the starships in Star Trek (and other SF stories) get across huge distances so quickly. The Star Trek ships use an energy source in a “warp core” to power warp field generators. They create the warp bubble in subspace. The ship uses that to go wherever the crew needs to be.

Do Physicists Like Warp Drive?

Such a warp drive is a tantalizing idea with many caveats. For example, generating a warp field requires an insane amount of energy; some physicists suggest it would take more than we're capable of generating. It would also require huge amounts of exotic matter, something like “unobtanium”. So, that's a problem right there.

Others say that creating such a drive goes against our current understanding of spacetime physics. However, that hasn't stopped anybody from speculating on ways to make it happen. For example, Mexican physicist Miguel Alcubierre proposed such a drive in 1994, suggesting a bubble that would shift space around an object. He has continued researching how a ship could get somewhere faster than light. However, he and others still point out various problems with both creating and sustaining a warp drive. One is that such a drive effectively isolates itself from the rest of the Universe; among other things, that means the ship can't control the drive that's making it go. So, there are still a few bugs to work out.
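For the mathematically inclined, Alcubierre's 1994 proposal can be written down compactly. In his metric, $v_s$ is the bubble's speed and $f(r_s)$ is a smooth “top-hat” function of the distance $r_s$ from the bubble's center, equal to 1 inside the bubble and falling to 0 far away:

$$ds^2 = -c^2\,dt^2 + \big(dx - v_s f(r_s)\,dt\big)^2 + dy^2 + dz^2$$

Inside the bubble, space is flat and the ship feels no acceleration; it is the contraction of space ahead and the expansion behind that carries the bubble along, which is why the scheme doesn't violate relativity locally.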

This artist’s illustration shows a spacecraft using an Alcubierre Warp Drive to warp space and ‘travel’ faster than light. Image Credit: NASA

About Black Holes

We are most familiar with stellar-mass and supermassive black holes. These often sport accretion disks that funnel material into the black hole. For example, Sagittarius A*, the central supermassive black hole in our own Milky Way Galaxy, periodically gobbles down material and then emits a belch of radiation. Other, more active galaxies send out jets of material as their central supermassive black holes feed continuously.

Simulation of a black hole. (Credit: NASA/ESA/Gaia/DPAC)

A black hole is a concentration of mass with gravity so strong that nothing, not even light, can escape. In their study about black holes and warp drives, the authors used Schwarzschild black holes. These simple, “static” black holes curve spacetime, have no electric charge, and are non-rotating. Essentially, they are good approximations for mathematical explorations of the characteristics of slowly rotating objects in space.
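For reference, the Schwarzschild solution can be written in its standard form, where $M$ is the black hole's mass and the event horizon sits at $r = 2GM/c^2$:

$$ds^2 = -\left(1 - \frac{2GM}{rc^2}\right)c^2\,dt^2 + \left(1 - \frac{2GM}{rc^2}\right)^{-1}dr^2 + r^2\,d\Omega^2$$

Combining this line element with the Alcubierre metric above is, in essence, what the authors do in their paper.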

When A Ship with Warp Drive Crosses into a Black Hole

The Schwarzschild black hole is the “perfect” black hole to use in this theoretical exploration of a warp drive crossing the event horizon. To work out the scenario, Garattini and Zatrimaylov mathematically combined the equations describing the black hole with those describing the warp drive. Among other things, they found that it's possible to “embed” the warp drive in the outer region of the black hole. The warp bubble itself is much smaller than the black hole and needs to be moving toward it. The black hole's gravity affects the energy conditions needed to create and sustain the warp drive, meaning you can theoretically decrease the amount of negative energy required to sustain the warp bubble. In addition, the researchers suggest that if the warp bubble is moving at less than the speed of light, it effectively erases the black hole horizon.

The research team also suggested that such an occurrence could induce the conversion of virtual particles into real ones, much as a strong electric field can. If so, it could lead to the creation of mini warp drives in the lab.

Changing the Black Hole a Bit

Interestingly, the team also suggests that, if the warp bubble is moving slowly and is much smaller than the black hole horizon, it could increase the entropy of the black hole. However, as they state in their closing arguments, “there are potential problematic issues in other physical situations: namely, when the warp drive is completely absorbed by the black hole, it may decrease its mass, and, therefore, its entropy.

Likewise, when there is a larger warp bubble passing through a black hole, it would produce a “screening” effect and de facto eliminate the horizon, making it impossible to define the black hole entropy in the Hawking sense. If warp drives are possible in nature, these issues indicate that we still do not understand them from the thermodynamic point of view.”

Warp Drive Technology Remains to be Seen

So, while this research may prove valuable theoretically and could lead to lab production of mini warp drives, many questions remain. Perhaps in the future, when we understand the quantum mechanics behind both of these objects, warp technology might seem like a slam-dunk. If so, then as ships travel through black holes, we could face some weird possibilities. For example, signals from inside a black hole could be carried out by a warp bubble emerging from the singularity. That would allow us to send images or recordings of what it's like inside the event horizon, something nobody knows about today. There's also a chance that those fearsome black holes could make a warp drive less difficult to achieve, since it wouldn't need so much exotic “negative energy” source material.

For More Information

Black Holes, Warp Drives, and Energy Conditions
The Warp Drive: Hyper-fast Travel Within General Relativity
Schwarzschild Black Hole Simulations

The post What if you Flew Your Warp Drive Spaceship into a Black Hole? appeared first on Universe Today.