Within nearly every galaxy is a supermassive black hole. The beast at the heart of our galaxy contains the mass of millions of suns, while some of the largest supermassive black holes can be more than a billion solar masses. For years, it was thought that these black holes grew in mass over time, only reaching their current size after a billion years or more. But observations from the Webb telescope show that even the youngest galaxies contain massive black holes. So how could supermassive black holes grow so large so quickly? The key to the answer could be the powerful jets black holes can produce.
Although it seems counterintuitive, it is difficult for a black hole to consume matter and grow. The gravitational pull of a black hole is immensely strong, but the surrounding matter is much more likely to be trapped in orbit around the gravitational well than to fall directly in. To enter a black hole, material needs to shed its orbital motion enough to fall inward. When a black hole has a jet of material speeding away from its polar region, this high-velocity plasma can carry away rotational motion from the surrounding material, allowing it to fall into the black hole. For this reason, black holes with powerful jets also undergo the fastest growth.
We can see many fast-growing black holes in the distant Universe as quasars, or active galactic nuclei. We know, then, that in the middle age of the cosmos, many supermassive black holes were gaining mass rapidly. One idea is that the youngest supermassive black holes also had active jets, which would allow them to gain a million solar masses or more quite quickly. But proving this is difficult.
The problem is that it’s extremely difficult to observe jets from the earliest period of the cosmos. Light from that distant time is so redshifted that these once brilliant beacons have become dim radio sources. Before this recent study, the most distant jet observed had a redshift of z = 6.1, meaning its light traveled for nearly 12.8 billion years to reach us. In this new study, the team discovered a blazar with a redshift of z = 7.0, meaning it comes from a time when the Universe was just 750 million years old.
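For readers who want to reproduce the quoted ages and lookback times, here is a minimal sketch of the standard flat Lambda-CDM conversion from redshift to cosmic time. The cosmological parameters are assumed Planck-like values, not numbers taken from the study.

```python
# Sketch: converting redshift to cosmic age in a flat Lambda-CDM model.
# Parameter values (H0, Omega_m) are assumed Planck-like, not from the study.
from math import sqrt, inf
from scipy.integrate import quad

H0 = 67.7 / 3.086e19 * 3.156e16   # Hubble constant converted to 1/Gyr
OM, OL = 0.31, 0.69               # matter and dark-energy density parameters

def E(z):
    """Dimensionless expansion rate H(z)/H0."""
    return sqrt(OM * (1 + z) ** 3 + OL)

def age_at(z):
    """Age of the universe at redshift z, in Gyr."""
    integrand = lambda zp: 1.0 / ((1 + zp) * E(zp))
    val, _ = quad(integrand, z, inf)
    return val / H0

age_now = age_at(0)                   # ~13.8 Gyr
age_z7 = age_at(7.0)                  # ~0.77 Gyr: the new blazar's epoch
lookback_z61 = age_now - age_at(6.1)  # ~12.9 Gyr: the previous record jet
print(f"age at z=7: {age_z7:.2f} Gyr, lookback to z=6.1: {lookback_z61:.1f} Gyr")
```

With these parameters the integral gives an age of roughly three-quarters of a billion years at z = 7 and a lookback time of nearly 12.9 billion years to z = 6.1, consistent with the figures quoted in the article.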
A blazar occurs when the jet of a supermassive black hole is lined up to be pointed directly at us. Since we’re looking directly into the beam, we see the jet at its most powerful. Blazars normally allow us to calculate the true intensity of a jet, but in this case, the redshift is so strong that our conclusions must be a bit more subtle.
One possibility is that the jet of this particular supermassive black hole really is pointed directly our way. Based on this, the black hole is growing so quickly that it would easily gain more than a million solar masses within the Universe’s first billion years. But it would be extremely rare for a black hole jet to point directly at us from that distance. So statistically, that would mean there are many more early black holes that are just as active and growing just as quickly. They just aren’t aligned for us to observe.
Another possibility is that the blazar isn’t quite aligned in our direction, but the cosmic expansion of space and time has focused its energy toward us over 12.9 billion years. In other words, the blazar may appear more energetic than it actually is, thanks to relativistic cosmology. But if that is the case, then the jet of this black hole is less energetic but still powerful. And statistically, that would mean most early black holes are equally powerful.
So this latest work tells us that either there was a fraction of early black holes that grew to beasts incredibly fast, or that most black holes grew quickly, beginning at a time even earlier than we can observe. In either case, it is clear that early black holes created jets, and these jets allowed the first supermassive black holes to appear early in cosmic time.
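The growth rates discussed above can be sanity-checked with a back-of-the-envelope Eddington-limited accretion model. This is a generic estimate, not a calculation from the study; the seed mass and the Salpeter e-folding time below are illustrative textbook assumptions.

```python
# Sketch: Eddington-limited black-hole growth, to check whether a small seed
# can plausibly reach a million solar masses by z ~ 7. Seed mass and
# radiative efficiency (via the Salpeter time) are assumed, not measured.
from math import exp

SALPETER_GYR = 0.045   # e-folding time for ~10% radiative efficiency, in Gyr
seed_mass = 100.0      # assumed stellar-mass seed, in solar masses
t = 0.75               # time available before z = 7, in Gyr

final_mass = seed_mass * exp(t / SALPETER_GYR)
print(f"mass after {t} Gyr of Eddington accretion: {final_mass:.2e} Msun")
```

At sustained Eddington accretion the mass grows exponentially with an e-folding time of roughly 45 million years, so even a modest 100-solar-mass seed crosses the million-solar-mass mark well within the first billion years, provided the jet keeps feeding it.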
The black hole information paradox has puzzled physicists for decades. New research shows how quantum connections in spacetime itself may resolve the paradox, and in the process leave behind a subtle signature in gravitational waves.
For a long time we thought black holes, as mysterious as they were, didn’t cause any trouble. Information can’t be created or destroyed, but when objects fall past the event horizon, the information they carry with them is forever locked from view. Crucially, it’s not destroyed, just hidden.
But then Stephen Hawking discovered that black holes aren’t entirely black. They emit a small amount of radiation and eventually evaporate, disappearing from the cosmic scene entirely. But that radiation doesn’t carry any information with it, which created the famous paradox: when the black hole dies, where does all its information go?
One solution to this paradox is known as non-violent nonlocality. This takes advantage of a broader version of quantum entanglement, the “spooky action at a distance” that can tie together particles. But in the broader picture, aspects of spacetime itself become entangled with each other. This means that whatever happens inside the black hole is tied to the structure of spacetime outside of it.
Usually spacetime is only altered during violent processes, like black hole mergers or stellar explosions. But this effect is much quieter, just a subtle fingerprint on the spacetime surrounding an event horizon.
If this hypothesis is true, the spacetime around black holes carries tiny perturbations that aren’t entirely random; instead, the variations would be correlated with the information inside the black hole. Then when the black hole disappears, the information is preserved outside of it, resolving the paradox.
In a recent paper posted to the preprint server arXiv, and not yet peer-reviewed, a pair of researchers at Caltech investigated this intriguing hypothesis and explored how we might be able to test it.
The researchers found that these signatures in spacetime also leave an imprint in the gravitational waves produced when black holes merge. These imprints are incredibly tiny, so small that existing gravitational wave experiments cannot yet detect them. But they have a distinctive structure that sits on top of the usual wave pattern, making them potentially observable.
The next generation of gravitational wave detectors, which aim to come online in the next decade, might have enough sensitivity to tease out this signal. If they see it, it would be tremendous, as it would finally point to a clear solution of the troubling paradox, and open up a new understanding of both the structure of spacetime and the nature of quantum nonlocality.
In April 2019, the Event Horizon Telescope (EHT) collaboration made history when it released the first-ever image of a black hole. The image captured the glow of the accretion disk surrounding the supermassive black hole (SMBH) at the center of the M87 galaxy, located 54 million light-years away. Because of its appearance, the disk that encircles this SMBH beyond its event horizon (composed of gas, dust, and photons) was likened to a “ring of fire.” Since then, the EHT has been actively imaging several other SMBHs, including Sagittarius A* at the center of the Milky Way!
In addition, the EHT has revealed additional details about M87, like the first-ever image of a photon ring and a picture that combines the SMBH and its relativistic jet emanating from its center. Most recently, the EHT released the results of its latest observation campaign. These observations revealed a spectacular flare emerging from M87’s powerful relativistic jet. This flare released a tremendous amount of energy in multiple wavelengths, including the first high-energy gamma-ray outburst observed in over a decade.
The study presents data from the second EHT observational campaign, conducted in April 2018, which obtained nearly simultaneous spectra of the galaxy with the broadest wavelength coverage ever collected. Giacomo Principe, the paper coordinator, is a researcher at the University of Trieste associated with the Istituto Nazionale di Astrofisica (INAF) and the Istituto Nazionale di Fisica Nucleare (INFN). As he explained in a recent EHT press release:
“We were lucky to detect a gamma-ray flare from M87 during this EHT multi-wavelength campaign. This marks the first gamma-ray flaring event observed in this source in over a decade, allowing us to precisely constrain the size of the region responsible for the observed gamma-ray emission. Observations—both recent ones with a more sensitive EHT array and those planned for the coming years—will provide invaluable insights and an extraordinary opportunity to study the physics surrounding M87’s supermassive black hole. These efforts promise to shed light on the disk-jet connection and uncover the origins and mechanisms behind the gamma-ray photon emission.”
During the campaign, the Fermi space telescope gathered data indicating an increase in high-energy gamma rays using its LAT instrument. Chandra and NuSTAR followed by collecting high-quality data in the X-ray band, while the Very Long Baseline Array (VLBA) and the East Asia VLBI Network (EAVN) obtained data at radio frequencies. The flare these observations revealed lasted approximately three days and occupied a region roughly three light-days across, about 520 times the distance between the Sun and the Earth (~520 AU).
The flare itself was well above the energies typically detected around black holes, and the ring of emission showed a significant shift in the position angle of its brightness asymmetry relative to earlier observations. As Daryl Haggard, a professor at McGill University and the co-coordinator of the EHT multi-wavelength working group, explained, this suggests a physical relation between structures on very different scales:
“In the first image obtained during the 2018 observational campaign, we saw that the emission along the ring was not homogeneous, instead it showed asymmetries (i.e., brighter areas). Subsequent observations conducted in 2018 and related to this paper confirmed that finding, highlighting that the asymmetry’s position angle had changed.”
“How and where particles are accelerated in supermassive black hole jets is a long-standing mystery,” added University of Amsterdam professor Sera Markoff, another EHT multi-wavelength working group co-coordinator. “For the first time, we can combine direct imaging of the near event horizon regions during gamma-ray flares caused by particle acceleration events and thus test theories about the flare origins.”
This discovery could create opportunities for future research and lead to breakthroughs in our understanding of the Universe.
Almost every large galaxy has a supermassive black hole churning away at its core. In most cases, these black holes spin in concert with their galaxy, like the central hub of a cosmic wagon wheel. But on December 18, 2024, NASA researchers announced they had discovered a galaxy whose black hole appears to have been turned on its side, spinning out of alignment with its host galaxy.
The galaxy, NGC 5084, was discovered centuries ago by German-born British astronomer William Herschel, but it took new techniques, recently developed at NASA’s Ames Research Center, to reveal the unusual properties of its black hole.
The new method is called SAUNAS (Selective Amplification of Ultra Noisy Astronomical Signal). It enables astronomers to tease out low-brightness X-ray emissions that were previously drowned out by other radiation sources.
When the team put their new technique to the test by combing through old archival data from the Chandra X-ray Observatory, a space telescope that acts as the X-ray counterpart to Hubble’s visible-light observations, they found their first clue that something unusual was going on in NGC 5084.
Four large X-ray plumes, made visible by the new technique, appeared in the data. These streams of plasma extend out from the centre of the galaxy, two in line with the galactic plane, and two extending above and below.
While plumes of hot, charged gas are not unusual above or below the plane of large galaxies, it is unusual to find four of them, rather than just one or two, and even more unusual to find them in line with the galactic plane.
To make sure they weren’t just seeing an error or artifact in the Chandra data, the team looked more closely at other images of the galaxy, including observations from both the Hubble Space Telescope and the Atacama Large Millimeter/submillimeter Array (ALMA).
These observations revealed a dusty inner disk spinning in the centre of the galaxy at a 90-degree angle to the rest of NGC 5084.
The team also looked at the galaxy in radio wavelengths using the NRAO’s Expanded Very Large Array. Altogether, these observations painted a picture of a very strange galactic core.
“It was like seeing a crime scene with multiple types of light,” said Ames research scientist Alejandro Serrano Borlaff, lead author of the paper published this week in The Astrophysical Journal. “Putting all the pictures together revealed that NGC 5084 has changed a lot in its recent past.”
Borlaff’s coauthor and astrophysicist at Ames, Pamela Marcum, added that “detecting two pairs of X-ray plumes in one galaxy is exceptional. The combination of their unusual, cross-shaped structure and the ‘tipped-over,’ dusty disk gives us unique insights into this galaxy’s history.”
The plumes of plasma suggest that the galaxy was disturbed at some point during its lifetime. A collision with another galaxy, for example, could have tipped the black hole onto its side.
With this discovery, SAUNAS has demonstrated that it can bring new life to old data, uncovering new surprises in familiar galaxies. This surprise twist on a galaxy we’ve known about since 1785 offers tantalizing hope that there might be other weird and wonderful discoveries to come, even in places we thought we’d seen everything.
New research suggests that our best hope for finding existing life on Mars isn’t on the surface, but buried deep within the crust.
Several years ago NASA’s Curiosity rover measured traces of methane in the Martian atmosphere at levels several times the background. But a few months later, the methane disappeared, only to reappear later in the year. This discovery opened up the intriguing possibility of life still clinging to existence on Mars, as biology could explain the seasonal variability in the presence of methane.
But while Mars was once home to liquid water oceans and an abundant atmosphere, it’s now a desolate wasteland. What kind of life could possibly call the red planet home? Most life on Earth wouldn’t survive long in those conditions, but there is a subgroup of Earthly life that might possibly find Mars a good place to live.
These are the methanogens, single-celled organisms that consume hydrogen for energy and excrete methane as a waste product. Methanogens can be found in all sorts of otherwise-inhospitable places on Earth, and something like them might be responsible for the seasonal variations in methane levels on Mars.
In a recent paper submitted for publication in the journal Astrobiology, a team of scientists scoured the Earth for potential analogs to Martian environments, searching for methanogens thriving in conditions similar to what might be found on Mars.
The researchers found three potential Mars-like conditions on Earth where methanogens make a home. The first is deep in the crust, sometimes to a depth of several kilometers, where tiny cracks in rocks allow for liquid water to seep in. The second is lakes buried under the Antarctic polar ice cap, which maintain their liquid state thanks to the immense pressures of the ice above them. And the last is super-saline, oxygen-deprived basins in the deep ocean.
All three of these environments have analogs on Mars. Like the Earth, Mars likely retains some liquid water buried in its crust. And its polar caps might have liquid water lakes buried underneath them. Lastly, there has been tantalizing – and heavily disputed – evidence of briny water appearing on crater walls.
In the new paper, the researchers mapped out the temperature ranges, salinity levels, and pH values across sites scattered around the Earth. They then measured the abundance of molecular hydrogen in those sites, and determined where methanogens were thriving the most.
For the last step, the researchers combed through the available data about Mars itself, finding where conditions best matched the most favorable sites on Earth. They found that the most likely location for possible life was in Acidalia Planitia, a vast plain in the northern hemisphere.
Or rather, underneath it. Several kilometers below the plain, the temperatures are warm enough to support liquid water. That water might have just the right pH and salinity levels, along with enough dissolved molecular hydrogen, to support a population of methanogen-like creatures.
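The matching step described above can be illustrated with a toy screening sketch: compare each candidate site's modeled conditions against the ranges where Earth methanogens thrive. Every number below is a hypothetical placeholder for illustration, not a value from the paper.

```python
# Toy sketch of the site-matching idea: screen candidate subsurface sites
# against parameter ranges where Earth methanogens do well.
# All values are hypothetical placeholders, not the paper's measurements.

# Assumed (min, max) ranges where methanogens thrive.
HABITABLE = {"temp_C": (0, 40), "pH": (5.5, 8.5), "salinity_pct": (0, 12)}

# Hypothetical candidate sites with modeled subsurface conditions.
sites = {
    "Acidalia Planitia (deep)": {"temp_C": 10, "pH": 7.2, "salinity_pct": 6},
    "Polar cap base":           {"temp_C": -20, "pH": 6.8, "salinity_pct": 18},
}

def habitable(conditions):
    """True if every parameter falls inside its methanogen-friendly range."""
    return all(lo <= conditions[k] <= hi for k, (lo, hi) in HABITABLE.items())

for name, cond in sites.items():
    print(name, "->", "candidate" if habitable(cond) else "ruled out")
```

The real analysis also folds in dissolved molecular hydrogen abundance; this sketch only shows the range-matching logic.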
Entanglement is perhaps one of the most confusing aspects of quantum mechanics. On its surface, entanglement appears to let particles communicate over vast distances instantly, violating the light-speed limit. But while entangled particles are connected, they cannot be used to transmit information between them.
In quantum mechanics, a particle isn’t really a particle. Instead of being a hard, solid, precise point, a particle is really a cloud of fuzzy probabilities, with those probabilities describing where we might find the particle when we go to actually look for it. But until we actually perform a measurement, we can’t exactly know everything we’d like to know about the particle.
These fuzzy probabilities are known as quantum states. In certain circumstances, we can connect two particles in a quantum way, so that a single mathematical equation describes both sets of probabilities simultaneously. When this happens, we say that the particles are entangled.
When particles share a quantum state, then measuring the properties of one can grant us automatic knowledge of the state of the other. For example, let’s look at the case of quantum spin, a property of subatomic particles. For particles like electrons, the spin can be in one of two states, either up or down. Once we entangle two electrons, their spins are correlated. We can prepare the entanglement in a certain way so that the spins are always opposite of each other.
If we measure the first particle, we might randomly find the spin pointing up. What does this tell us about the second particle? Since we carefully arranged our entangled quantum state, we now know with 100% absolute certainty that the second particle must be pointing down. Its quantum state was entangled with the first particle, and as soon as one revelation is made, both revelations are made.
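The correlation described here can be mimicked with a tiny classical sketch: each pair yields one random outcome, and the partner's result is always opposite. This models only the same-axis anticorrelation discussed above, not full quantum (Bell-type) statistics.

```python
# Minimal sketch of the anticorrelation described above: each entangled pair
# yields one random outcome, and the partner is always opposite. This models
# only same-axis spin measurements; it is not a full quantum simulation.
import random

def measure_pair(rng):
    """Measure both particles of a spin-anticorrelated pair on the same axis."""
    spin_a = rng.choice(["up", "down"])            # my result is random...
    spin_b = "down" if spin_a == "up" else "up"    # ...yours is then fixed
    return spin_a, spin_b

rng = random.Random(42)
results = [measure_pair(rng) for _ in range(10_000)]

# Every single pair is perfectly anticorrelated...
assert all(a != b for a, b in results)
# ...yet each observer alone sees a 50/50 coin flip carrying no message.
ups = sum(a == "up" for a, _ in results)
print(f"particle A measured 'up' in {ups / len(results):.1%} of trials")
```

Note that observer A's string of results is pure noise on its own; the correlation only appears once the two records are compared, which requires ordinary slower-than-light communication.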
But what if the second particle was on the other side of the room? Or across the galaxy? According to quantum theory, as soon as one “choice” is made, the partner particle instantly “knows” what spin to be. It appears that communication can be achieved faster than light.
The resolution to this apparent paradox comes from scrutinizing what is happening when – and more importantly, who knows what when.
Let’s say I’m the one making the measurement of particle A, while you are the one responsible for particle B. Once I make my measurement, I know for sure what spin your particle should have. But you don’t! You only get to know once you make your own measurement, or after I tell you. But in either case nothing is transmitted faster than light. Either you make your own local measurement, or you wait for my signal.
While the two particles are connected, nobody gets to know anything in advance. I know what your particle is doing, but I can only inform you at slower-than-light speeds, or you can figure it out for yourself.
So while the process of entanglement happens instantaneously, the revelation of it does not. We have to use good old-fashioned no-faster-than-light communication methods to piece together the correlations that quantum entanglement demands.
Neutrinos are tricky little blighters that are hard to observe. The IceCube Neutrino Observatory in Antarctica was built to detect neutrinos from space, and it is one of the most sensitive instruments of its kind, built partly in the hope that it might help uncover evidence for dark matter. Any dark matter trapped inside Earth would release neutrinos that IceCube could detect. To date, after 10 years of searching, no excess of neutrinos coming from Earth has been found.
Neutrinos are subatomic particles that are extremely light and carry no electrical charge. Certain events, such as supernovae and solar outbursts, generate vast quantities of them, and the universe is teeming with neutrinos, with trillions passing through every person every second. The challenge is that neutrinos rarely interact with matter, so observing and detecting them is difficult. As with other subatomic particles, there are different types of neutrino (electron, muon and tau), each associated with a corresponding lepton, an elementary particle with half-integer spin. Studying neutrinos of all types is key to understanding fundamental physical processes across the cosmos.
The IceCube Neutrino Observatory began capturing data in 2005 but it wasn’t until 2011 that it began full operations. It consists of over 5,000 football-sized detectors arranged within a cubic kilometre of ice deep underground. Arranged in this fashion, the detectors are designed to capture the faint flashes of Cherenkov radiation released when neutrinos interact with the ice. The location near the South Pole was chosen because the ice acts as a natural barrier against background radiation from Earth.
Using data from the IceCube Observatory, a team of researchers led by R. Abbasi from Loyola University Chicago has been probing the nature of dark matter. This strange and invisible component is thought to make up 27% of the mass-energy content of the universe. Unfortunately, dark matter doesn’t emit, absorb or reflect light, making it undetectable by conventional means. One school of thought holds that dark matter is made up of Weakly Interacting Massive Particles (WIMPs), which can be gravitationally captured by massive bodies like the Sun or the Earth, where they annihilate and produce neutrinos. It’s these neutrinos that the team has been hunting for.
The paper published by the team describes their search for muon neutrinos from the centre of the Earth within 10 years of data captured by IceCube. The team searched chiefly for WIMPs in the mass range of 10 GeV to 10 TeV, but because of the complexity and position of the source (the centre of the Earth), they relied on Monte Carlo simulations. The name is taken from the famous casino in Monaco, and the method involves running many randomized simulations. The technique is used where exact calculations are intractable, resting on the idea that randomness can be harnessed to solve problems.
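To give a flavor of the Monte Carlo idea, here is the classic textbook example of estimating pi from random samples. It is a stand-in for illustration only; the team's actual detector simulations are vastly more complex.

```python
# Sketch of the Monte Carlo idea: estimate a quantity (here, pi) by drawing
# random samples. A toy stand-in for the far more complex simulations
# the IceCube team actually ran.
import random

def estimate_pi(n_samples, seed=0):
    """The fraction of random points in the unit square that land inside
    the quarter circle of radius 1 approaches pi/4."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * hits / n_samples

print(estimate_pi(100_000))   # close to 3.14159, improving with more samples
```

The key design point is that the estimate converges as the number of random samples grows, even though no single sample tells you anything exact, which is what makes the approach useful when direct calculation is impossible.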
After running many simulations of this sort, the team found no excess neutrino flux over background levels from Earth. They conclude, however, that while no evidence has been found yet, an upgrade to the IceCube Observatory may yield more promising results, as it will be able to probe lower-mass events and, hopefully, one day solve the mystery of the nature of dark matter.
A team of astronomers has detected a surprisingly fast and bright burst of energy from a galaxy 500 million light-years away. The burst of radiation peaked in brightness after just four days and then faded quickly. Using the Catalina Real-Time Transient Survey, with supporting observations from the Gran Telescopio Canarias, the team identified the burst as the result of a small black hole consuming a star. The discovery provides an exciting insight into stellar evolution and a rare cosmic phenomenon.
Black holes are stellar corpses where gravity is so intense that nothing, not even light, can escape. They form when massive stars collapse under their own gravity at the end of their lives, forming an infinitely small point known as a singularity. The region of space around the singularity is bounded by the event horizon, the point beyond which nothing can escape. Despite the challenges of observing them, black holes can be detected through the effects of their gravity on nearby objects like gas clouds. Many mysteries still surround them, so they remain an intense area of study.
A team of astronomers led by Claudia Gutiérrez from the Institute of Space Sciences and the Institute of Space Studies of Catalonia used data from the Catalina Real-Time Transient Survey (CRTS) to explore transient events. The CRTS, launched in 2004, is a wide-field survey that looks for variable objects like supernovae and asteroids. It uses a network of telescopes based in Arizona to scan large areas of sky for short-lived events, and it has been of great use in providing insights into the life cycles of stars and the behaviour of distant galaxies.
The team detected the bright outburst in a galaxy located 500 million light-years away and published their results in The Astrophysical Journal. The event took place in a tiny galaxy about 400 times less massive than the Milky Way. The burst, identified as CSS161010, reached maximum brightness in only 4 days, and 2.5 days later its brightness had halved. Subsequent work revealed that an earlier detection had been made by the All-Sky Automated Survey for SuperNovae. Thankfully the detection was early enough to allow follow-up observations by other ground-based telescopes. Typically these types of events are difficult to study due to their rapid evolution.
Only a handful of events like CSS161010 have been detected in recent years, and until now their nature was a mystery. The team led by Gutiérrez analysed the spectral properties and found hydrogen lines revealing material travelling at speeds up to 10% of the speed of light. The changes observed in the hydrogen emission lines are similar to those seen in active galactic nuclei, where supermassive black holes reside. The observations suggest a black hole is involved, although not a massive one.
The brightness of the object faded by a factor of 900 over the following two months. Further spectral analysis at that stage still revealed blueshifted hydrogen lines, indicating high-speed gas outflows. This is not something usually seen in supernova events, suggesting a different origin. The team believes the event is the result of a small black hole swallowing a star.
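The quoted brightness drops translate into astronomical magnitudes, the logarithmic scale astronomers actually work in, via the standard relation. A small worked example (the flux ratios are the ones quoted above, not new data):

```python
# Sketch: translating the quoted brightness drops of CSS161010 into
# magnitude changes via the standard logarithmic relation.
from math import log10

def delta_mag(flux_ratio):
    """Magnitude change for a given factor drop in flux (larger = fainter)."""
    return 2.5 * log10(flux_ratio)

fade_900 = delta_mag(900)   # the two-month decline: ~7.4 magnitudes
halving = delta_mag(2)      # brightness halved: ~0.75 magnitudes
print(f"factor-900 fade: {fade_900:.1f} mag; "
      f"halving in 2.5 days: {halving / 2.5:.2f} mag/day")
```

A decline rate of roughly 0.3 magnitudes per day is far steeper than a typical supernova light curve, which is part of what flags events like this as unusual.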
Meet the brown dwarf: bigger than a planet, and smaller than a star. A category of its own, it’s one of the strangest objects in the universe.
Brown dwarfs typically are defined to have masses anywhere from 12 times the mass of Jupiter right up to the lower limit for a star. And despite their names, they are not actually brown. The largest and youngest ones are quite hot, giving off a steady glow of radiation. In fact, the largest brown dwarfs are almost indistinguishable from red dwarfs, the smallest of the stars. But the smallest, oldest, and coldest ones are so dim they can only be detected with our most sensitive infrared telescopes.
Unlike stars, brown dwarfs don’t generate their own energy through nuclear fusion, at least not for very long. Instead they emit radiation from the leftover heat of their own formation. As that heat escapes, the brown dwarf continues to dim, sliding from fiery red to mottled magenta to invisible infrared. The greater the mass at its birth, the more heat it can trap and the longer it can mimic a proper star, but the ultimate end fate is the same for every single brown dwarf, regardless of its pedigree.
At first it may seem like brown dwarfs are just extra-large planets, but they get to do something that planets don’t. While brown dwarfs can’t fuse hydrogen in their cores – that takes something like 80 Jupiter masses to accomplish – they can briefly partake in another kind of fusion reaction.
In the cooler heart of a brown dwarf, deuterium (a nucleus of one proton and one neutron) can fuse with a proton to form helium-3, releasing energy in the process. This phase doesn’t last long; in only a few million years even the largest brown dwarfs use up all their available deuterium, and from there they simply cool off.
As for their size, brown dwarfs tend not to be much larger in diameter than a typical gas giant like Jupiter. That’s because, unlike in a star, there is no ongoing source of energy, and thus pressure, to prop them up. Instead, all that’s left is the exotic quantum effect known as degeneracy pressure, which means you can only squeeze so many particles into so small a volume. Brown dwarfs sit very close to the limit at which degeneracy pressure can maintain their size given their mass.
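The mass-radius behavior described above follows from a rough textbook scaling; this is a generic estimate sketched under standard assumptions, not a calculation from any particular paper:

```latex
% For a fully degenerate, non-relativistic body (an n = 3/2 polytrope),
% balancing degeneracy pressure P \propto \rho^{5/3} against self-gravity gives
R \propto M^{-1/3},
% so adding mass actually shrinks the object slightly. In real brown dwarfs,
% Coulomb (electrostatic) pressure also contributes, flattening the relation
% so that radii stay near Jupiter's across the whole brown-dwarf mass range.
```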
This means that despite outweighing Jupiter, they won’t appear much larger. And unlike Jupiter, they are briefly capable of nuclear fusion. After that, however, they spend the rest of their lives wandering the galaxy, slowly chilling out.
In 1971, the Soviet Mars 3 lander became the first spacecraft to soft-land on Mars, though it only lasted a couple of minutes before failing. More than 50 years later, it’s still there at Terra Sirenum. The HiRISE camera on NASA’s Mars Reconnaissance Orbiter may have imaged some of its hardware, inadvertently taking part in what could become an effort to document our Martian artifacts.
Is it time to start cataloguing and even preserving these artifacts so we can preserve our history?
Some anthropologists think so.
Justin Holcomb is an assistant research professor of anthropology at the University of Kansas. He and his colleagues argue that it’s time to take Martian archaeology seriously, and the sooner we do, the better and more thorough the results will be. Their research commentary, “The emerging archaeological record of Mars,” was recently published in Nature Astronomy.
Artifacts of the human effort to explore the planet are littered on its surface. According to Holcomb, these artifacts and our effort to reach Mars are connected to the original human dispersal from Africa.
“Our main argument is that Homo sapiens are currently undergoing a dispersal, which first started out of Africa, reached other continents and has now begun in off-world environments,” said lead author Holcomb. “We’ve started peopling the solar system. And just like we use artifacts and features to track our movement, evolution and history on Earth, we can do that in outer space by following probes, satellites, landers and various materials left behind. There’s a material footprint to this dispersal.”
It’s tempting to call debris from failed missions wreckage or even space junk like we do the debris that orbits Earth. But things like spent parachutes and heat shields are more than just wreckage. They’re artifacts the same way other cast-offs are artifacts. In fact, what archaeologists often do in the field is sift through trash. “Trash is a proxy for human behaviour,” said one anthropologist.
In any case, one person’s trash can be another person’s historical artifact.
“These are the first material records of our presence, and that’s important to us,” Holcomb said. “I’ve seen a lot of scientists referring to this material as space trash, galactic litter. Our argument is that it’s not trash; it’s actually really important. It’s critical to shift that narrative towards heritage because the solution to trash is removal, but the solution to heritage is preservation. There’s a big difference.”
Fourteen missions to Mars have left their mark on the red planet in the form of artifacts. According to the authors, this is the beginning of the planet’s archaeological record. “Archaeological sites on the Red Planet include landing and crash sites, which are associated with artifacts including probes, landers, rovers and a variety of debris discarded during landing, such as netting, parachutes, pieces of the aluminum wheels (for example, from the Curiosity rover), and thermal protection blankets and shielding,” they write.
Other features include rover tracks and rover drilling and sampling sites.
We’re already partway to taking our abandoned artifacts seriously. The United Nations keeps a list of objects launched into space called the Register of Objects Launched into Outer Space. It’s a way of identifying which countries are liable and responsible for objects in space (but not which private billionaires). The Register was first implemented in 1976, and about 88% of crewed spacecraft, elements of the ISS, satellites, probes, and landers launched into space are registered.
UNESCO also keeps a register of heritage sites, including archaeological and natural sites. The same could be done for Mars.
There’s already one attempt to start documenting and mapping sites on Mars. The Perseverance Rover team is documenting all of the debris they encounter to make sure it can’t contaminate sampling sites. There are also concerns that debris could pose a hazard to future missions.
According to one researcher, there is over 7,100 kg (nearly 16,000 pounds) of debris on Mars, not including working spacecraft. While much of it is just scraps being blown around by the wind and broken into smaller pieces, there are also larger pieces of debris and nine intact yet inoperative spacecraft.
So far, there have been only piecemeal attempts to document these Martian artifacts.
“Despite efforts from the USA’s Perseverance team, there exists no systematic strategy for documenting, mapping and keeping track of all heritage on Mars,” the authors write. “We anticipate that cultural
resource management will become a key objective during planetary exploration, including systematic surveying, mapping, documentation, and, if necessary, excavation and curation, especially as we expand
our material footprint across the Solar System.”
Holcomb and his co-authors say we must understand that our spacecraft debris is the archaeological record of our attempt to explore not just Mars but the entire Solar System. Our effort to understand Mars is also part of our effort to understand our own planet and how humanity arose. “Any future accidental destruction of this record would be permanent,” they point out.
The authors say there’s a crucial need to preserve things like Neil Armstrong’s first footsteps on the Moon, the first impact on the lunar surface by the USSR’s Luna 2, and even the USSR’s Venera 7 mission, the first spacecraft to land on another planet. This is our shared heritage as human beings.
“These examples are extraordinary firsts for humankind,” Holcomb and his co-authors write. “As we move forward during the next era of human exploration, we hope that planetary scientists, archaeologists and geologists can work together to ensure sustainable and ethical human colonization that protects
cultural resources in tandem with future space exploration.”
There are many historical examples of humans getting this type of thing wrong, particularly during European colonization of other parts of the world. Since we’re still at (we hope) the beginning of our exploration of the Solar System, we have an opportunity to get it right from the start. It will take a lot of work and many discussions to determine what this preservation and future exploration can look like.
“Those discussions could begin by considering and acknowledging the emerging archaeological record on Mars,” the authors conclude.
Binary stars are common throughout the galaxy. Roughly half the stars in the Milky Way are part of a binary or multiple system, so we would expect to find them almost everywhere. However, one place we wouldn’t expect to find a binary is at the center of the galaxy, close to the supermassive black hole Sagittarius A*. And yet, that is precisely where astronomers have recently found one.
There are several stars near Sagittarius A*. For decades, we have watched as they orbit the great gravitational well. The motion of those stars was the first strong evidence that Sag A* was indeed a black hole. At least one star orbits so closely that we can see it redshift as it reaches peribothron.
But we also know that stars should be ever wary of straying too close to the black hole. The closer a star gets to the event horizon of a black hole, the stronger the tidal forces on the star become. There is a point where the tidal forces are so strong a star is ripped apart. We have observed several of these tidal disruption events (TDEs), so we know the threat is very real.
Tidal forces also pose a threat to binary stars. It wouldn’t take much for the tidal pull of a black hole to disrupt binary orbits, causing the stars to separate forever. Tidal forces would also tend to disrupt the formation of binary stars in favor of larger single stars. Therefore astronomers assumed the formation of binary stars near Sagittarius A* wasn’t likely, and even if a binary formed, it wouldn’t last long on cosmic timescales. So astronomers were surprised when they found the binary system known as D9.
The D9 system is young, only about 3 million years old. It consists of one star of about 3 solar masses and another with a mass about 75% that of the Sun. The system’s orbit brings it within 6,000 AU of Sag A* at closest approach, which is surprisingly close. Simulations of the D9 system estimate that in about a million years, the black hole’s gravitational influence will cause the two stars to merge into a single star. But even surviving that long is unexpected, and it shows that the region near a supermassive black hole is much less destructive than we thought.
It’s also pretty amazing that the system was discovered at all. The center of our galaxy is shrouded in gas and dust, meaning that we can’t observe the area in the visible spectrum. We can only see stars in the region with radio and infrared light. The binary stars are too close together for us to identify them individually, so the team used data from the Enhanced Resolution Imager and Spectrograph (ERIS) on the ESO’s Very Large Telescope, as well as archive data from the Spectrograph for INtegral Field Observations in the Near Infrared (SINFONI). This gave the team data covering a 15-year timespan, which was enough to watch the light of D9 redshift and blueshift as the stars orbit each other every 372 days.
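The numbers quoted above are enough for a rough consistency check with Kepler’s third law. Here is a minimal sketch using the approximate masses and the 372-day period from this article (the real orbit is eccentric, so this is only an order-of-magnitude estimate, not a result from the study):

```python
# Rough estimate of the D9 binary's separation from Kepler's third law,
# using the approximate values quoted in this article (illustrative only).

period_days = 372.0            # orbital period seen in the Doppler signal
total_mass_suns = 3.0 + 0.75   # ~3 solar masses plus ~0.75 solar masses

# In units of years, AU, and solar masses, Kepler's third law reads
#   a^3 = M_total * P^2
period_years = period_days / 365.25
semi_major_axis_au = (total_mass_suns * period_years**2) ** (1.0 / 3.0)

print(f"Estimated separation: {semi_major_axis_au:.1f} AU")
```

The answer comes out to roughly 1.6 AU, a comfortably tight orbit: tiny compared with the system’s 6,000 AU closest approach to the black hole, which is part of why the pair can hold together at all.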
Now that we know the binary system D9 exists, astronomers can look for other binary stars. This could help us solve the mystery of how such systems can form so close to the gravitational beast at the heart of our galaxy.
Jupiter’s moon Io is the most volcanically active body in the Solar System, with roughly 400 active volcanoes regularly ejecting magma into space. This activity arises from Io’s eccentric orbit around Jupiter, which produces incredibly powerful tidal interactions in the interior. In addition to powering Io’s volcanism, this tidal energy is believed to support a global subsurface magma ocean. However, the extent and depth of this ocean remains the subject of debate, with some supporting the idea of a shallow magma ocean while others believe Io has a more rigid, mostly solid interior.
In a recent NASA-supported study, an international team of researchers combined data from multiple missions to measure Io’s tidal deformation. According to their findings, Io does not possess a magma ocean and likely has a mostly solid mantle. Their findings further suggest that tidal forces do not necessarily lead to global magma oceans on moons or planetary bodies. This could have implications for the study of exoplanets that experience tidal heating, including Super-Earths and exomoons similar to Io that orbit massive gas giants.
As they explain in their paper, two types of analysis have predicted the existence of a global magma ocean. On the one hand, magnetic induction measurements conducted by the Galileo mission suggested the existence of a magma ocean within Io, approximately 50 km (~30 mi) thick and located near the surface. These results also implied that about 20% of the material in Io’s mantle is melted. However, they were the subject of debate for many years. More recently, NASA’s Juno mission conducted multiple flybys of Io and the other Jovian moons and obtained data that appeared to support this conclusion.
In particular, the Juno probe conducted a global mapping campaign of Io’s volcanoes, which suggested that the distribution of volcanic heat flow is consistent with the presence of a global magma ocean. However, these discoveries led to considerable debate over whether such techniques can actually determine if a shallow global magma ocean drives Io’s volcanic activity. This is the question Park and his colleagues sought to address in their study:
“In our study, Io’s tidal deformation is modeled using the gravitational tidal Love number k2, which is defined as the ratio of the imposed gravitational potential from Jupiter to the induced potential from the deformation of Io. In short, if k2 is large, there is a global magma ocean, and if k2 is small, there is no global magma ocean. Our result shows that the recovered value of k2 is small, consistent with Io not having a global magma ocean.”
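For reference, the standard definition (our notation, not taken from the paper) puts the induced potential in the numerator, which is what makes a large k2 correspond to a strongly deforming, fluid interior:

```latex
% Degree-2 tidal Love number: the gravitational potential induced by the
% body's own tidal deformation, evaluated at its surface radius R, per
% unit of tide-raising potential imposed by Jupiter.
\[
  k_2 = \frac{\Phi_{\mathrm{induced}}(R)}{\Phi_{\mathrm{tide}}(R)}
\]
% A deformable, fluid interior (a global magma ocean) yields a large k_2;
% a rigid, mostly solid mantle yields a small k_2.
```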
The significance of these findings goes far beyond the study of Io and other potentially volcanic moons. Beyond the Solar System, astronomers have discovered countless bodies that (according to current planetary models) experience intense tidal heating. These include rocky exoplanets several times the size and mass of Earth (super-Earths) and tidally locked planets like those in the TRAPPIST-1 system. The findings are also relevant to exomoons that experience intense tidal heating, similar to the Jovian moons. As Park explained:
“Although it is commonly assumed among the exoplanet community that intense tidal heating may lead to magma oceans, the example of Io shows that this need not be the case. Our results indicate that tidal forces do not universally create global magma oceans, which may be prevented from forming due to rapid melt ascent, intrusion, and eruption, so even strong tidal heating – like that expected on several known exoplanets and super-Earths – may not guarantee the formation of magma oceans on moons or planetary bodies.”
The star HD 65907 is not what it appears to be. It’s a star that looks young, but on closer inspection is actually much, much older. What’s going on? Research suggests that it is a resurrected star.
Astronomers employ different methods to measure a star’s age. One is based on its brightness and temperature. All stars follow a particular path in life, known as the main sequence. The moment they begin fusing hydrogen in their cores, they maintain a strict relationship between their brightness and temperature. By measuring these two properties, astronomers can roughly pin down the age of a star. But there are other techniques, like measuring the amount of heavy elements in a stellar atmosphere. Older stars tend to have fewer of these elements, because they were born at a time before the galaxy had become enriched with them.
Going by its temperature and brightness, HD 65907 is relatively young, with an age right around 5 billion years. And yet it contains very few heavy elements. Plus, its path through the galaxy isn’t in line with those of other young stars, which tend to serenely orbit around the center. HD 65907’s motion is much more erratic, suggesting that it only recently moved here from somewhere else.
In a recent paper, an international team of astronomers dug into archival data to see if they could resolve the mystery. They believe that HD 65907 is a kind of star known as a blue straggler, and that its strange combination of properties stems from a violent event in its past: the star was, in effect, resurrected.
If two low-mass stars collide, the merged remnant can sometimes survive as a single star. At first, that newly merged star will be both massive and large, with its outer layers flung far from the core by the enormous rotation imparted in the collision. But eventually some astrophysical process (strong magnetic fields are one suspect) drags down the star’s rotation rate, allowing it to settle into equilibrium. In this new state the star appears massive and incredibly hot: a blue straggler.
However the details play out, blue straggler stars get a second chance at life. Mergers transform small stars into big ones, and the remnants are just now enjoying their hydrogen-burning main sequence lives.
The astronomers believe this is the case for HD 65907. What makes this star especially unusual is that it’s not a member of a cluster, where frequent encounters can readily produce blue stragglers. Instead, it’s a field star, wandering the galaxy on its own. It must have cannibalized a companion five billion years ago, which explains its apparently youthful age.
Work like this is essential to untangling the complicated lives of stars in the Milky Way, and it shows how the strangest stars hold the keys to unlocking the evolution of elements that lead to systems like our own.
It’s axiomatic that the Universe is expanding. However, the rate of expansion hasn’t remained the same. It appears that the Universe is expanding more quickly now than it did in the past.
Astronomers have struggled to understand this and have wondered if the apparent acceleration is due to instrument errors. The JWST has put that question to rest.
American astronomer Edwin Hubble is widely credited with discovering the expansion of the Universe. But the idea actually stemmed from the equations of relativity and was pioneered by Russian scientist Alexander Friedmann. Hubble’s Law bears Edwin’s name, though, because he was the one who confirmed the expansion and put a more precise value on its rate, called the Hubble constant. It measures how rapidly galaxies that aren’t gravitationally bound are moving away from one another. The movement of objects due solely to the expansion is called the Hubble flow.
Measuring the Hubble constant means measuring distances to far-flung objects. Astronomers use the cosmic distance ladder (CDL) to do that. However, the ladder has a problem.
The first rungs on the CDL are fundamental measurements that can be observed directly. Parallax measurement is the most important fundamental measurement. But the method breaks down at great distances.
Beyond that, astronomers use standard candles, things with known intrinsic brightness, like supernovae and Cepheid variables. Those objects and their relationships help astronomers measure distances to other galaxies. This has been tricky to measure, though advancing technology has made progress.
Another pair of problems plagues the effort, though. The first is that different telescopes and methods produce different distance measurements. The second is that our measurements of distances and expansion don’t match up with the Standard Model of Cosmology, also known as the Lambda Cold Dark Matter (LCDM) model. That discrepancy is called the Hubble tension.
The question is, can the mismatch between the measurements and the LCDM be explained by instrument differences? That possibility has to be eliminated, and the trick is to take one large set of distance measurements from one telescope and compare them to another.
New research in The Astrophysical Journal tackles the problem by comparing Hubble Space Telescope measurements with JWST measurements. It’s titled “JWST Validates HST Distance Measurements: Selection of Supernova Subsample Explains Differences in JWST Estimates of Local H0.” The lead author is Adam Riess, a Bloomberg Distinguished Professor and Thomas J. Barber Professor of Physics and Astronomy at Johns Hopkins University. Riess is also a Nobel laureate, winning the 2011 Nobel Prize in Physics “for the discovery of the accelerating expansion of the Universe through observations of distant supernovae,” according to the Nobel Institute.
As of 2022, the Hubble Space Telescope gathered the most numerous sample of homogeneously measured standard candles. It measured a large number of standard candles out to about 40 Mpc or about 130 million light-years. “As of 2022, the largest collection of homogeneously measured SNe Ia is complete to D less than or equal to 40 Mpc or redshift z less than or equal to 0.01,” the authors of the research write. “It consists of 42 SNe Ia in 37 host galaxies calibrated with observations of Cepheids with the Hubble Space Telescope (HST), the heritage of more than 1000 orbits (a comparable number of hours) invested over the last ~20 yrs.”
In this research, the astronomers used the powerful JWST to cross-check the Hubble’s work. “We cross-check the Hubble Space Telescope (HST) Cepheid/Type Ia supernova (SN Ia) distance ladder, which yields the most precise local H0 (Hubble flow), against early James Webb Space Telescope (JWST) subsamples (~1/4 of the HST sample) from SH0ES and CCHP, calibrated only with NGC 4258,” the authors write. SH0ES and CCHP are different observing efforts aimed at measuring the Hubble constant. SH0ES stands for Supernova H0 for the Equation of State of Dark Energy, and CCHP stands for Chicago-Carnegie Hubble Program, which uses the JWST to measure the Hubble constant.
“JWST has certain distinct advantages (and some disadvantages) compared to HST for measuring distances to nearby galaxies,” Riess and his co-authors write. It offers a 2.5 times higher near-infrared resolution than the HST. Despite some of its disadvantages, the JWST “is able to provide a strong cross-check of distances in the first two rungs,” the authors explain.
Observations from both telescopes are closely aligned, which effectively rules out instrument error as the cause of the discrepancy between observations and the Lambda CDM model.
“While it will still take multiple years for the JWST sample of SN hosts to be as large as the HST sample, we show that the current JWST measurements have already ruled out systematic biases from the first rungs of the distance ladder at a much smaller level than the Hubble tension,” the authors write.
This research covered about one-third of the Hubble data set, with the known distance to the galaxy NGC 4258 serving as a reference point. Even though the data set was small, Riess and his co-researchers achieved impressively precise results, showing that the measurement differences were less than 2%. That’s much less than the 8% to 9% discrepancy of the Hubble tension.
That means that our Lambda CDM model is missing something. The standard model yields an expansion rate of about 67 to 68 kilometres per second per megaparsec. Telescope observations yield a slightly higher rate: between 70 and 76 kilometres per second per megaparsec. This work shows that the discrepancy can’t be due to the different telescopes and methods.
“The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete. With two NASA flagship telescopes now confirming each other’s findings, we must take this [Hubble tension] problem very seriously—it’s a challenge but also an incredible opportunity to learn more about our universe,” said lead author Riess.
What could be missing from the Lambda CDM model?
Marc Kamionkowski is a Johns Hopkins cosmologist who helped calculate the Hubble constant and recently developed a possible new explanation for the tension. Though not part of this research, he commented on it in a press release.
“One possible explanation for the Hubble tension would be if there was something missing in our understanding of the early universe, such as a new component of matter—early dark energy—that gave the universe an unexpected kick after the big bang,” said Kamionkowski. “And there are other ideas, like funny dark matter properties, exotic particles, changing electron mass, or primordial magnetic fields that may do the trick. Theorists have license to get pretty creative.”
We’ve already seen the success of the Ingenuity helicopter on Mars. The first aircraft to fly on another world made its maiden flight in April 2021 and has now completed 72 flights. Now a team of engineers is taking the idea one step further, investigating ways that drones could be released from satellites in orbit and explore an atmosphere without ever landing. The results are positive and suggest this could be a cost-effective way to explore alien atmospheres.
The idea of using drones on alien worlds has enticed engineers and planetary explorers for a few years now. They’re lightweight, versatile and a cost-effective way to study planetary atmospheres. Orbiters and rovers have been visiting the planets for decades, but drones can explore in ways rovers and orbiters cannot. Not only are they useful for studying atmospheric effects, they can also reach inaccessible surface areas, providing imagery to inform potential future land-based studies.
Perhaps the most famous, and indeed the only successful, planetary drone to date is Ingenuity, which was part of the Perseverance rover mission. It demonstrated that controlled flight in the Martian atmosphere was possible, hunted out possible landing sites for future missions and helped direct ground-based exploration. Its iconic large rotor span was needed because the rarefied atmosphere on Mars requires larger blades to generate the necessary lift. Ingenuity was originally planned as a technology demonstration, but it soon became a useful tool in the Perseverance mission’s arsenal.
NASA engineers are very aware of the benefits of drone technology, and so a team of engineers and researchers from the Armstrong Flight Research Center in California has been taking the idea of small drones one step further. The research was part of a 2023 Center Innovation Fund award and began with the team developing three atmospheric probe models. The models were identical, each measuring 71 cm from top to bottom: one for visual demonstration and the other two for research and technology readiness testing.
Their first launch, on 1 August, didn’t go to plan, with a failure in the release mechanism. The team reviewed everything from the lifting aircraft and the release mechanism to the probe design itself to identify improvements. They were finally able to conduct flights with their new atmospheric probe after it was released from a quad-rotor remotely piloted aircraft on 22 October 2024.
The flights were conducted above Rogers Dry Lake in California, with designs informed by previous NASA instrumentation developed for lifting and transportation. The test flights aimed to prove that the shape of the probe worked. The team now want to release the probe from a higher altitude, ultimately hoping to release it from a satellite in orbit around a planet.
The next steps are to review photos and videos from the flight to identify further improvements before another probe is built. Once they have proved the flight technology, instrumentation will be added for data gathering and recording. If all goes to plan, the team hope the probe will be chosen for a mission to one of the planets, released in orbit, and then sent diving into the atmosphere under controlled flight to learn more about the environment.
Since the discovery of the first exoplanet in 1992, thousands more have been found. One such system of exoplanets orbits a star known as Trappist-1, 40 light years away. Studies using the James Webb Space Telescope have revealed that one of its planets, Trappist-1b, has a crust that seems to be changing. Geological activity or weathering is the likely cause; if it’s weathering, it suggests the exoplanet has an atmosphere too.
Exoplanets are planets that orbit other stars. They vary in size, composition and distance from their star. Finding them is a tricky undertaking, and a number of different approaches are used. Since the first discovery, over 5,000 exoplanets have been found, and now, of course, the hunt is on for planets that could sustain life. Likely candidates orbit their host star in a region known as the habitable zone, where the temperature is just right for a life-sustaining world to evolve.
Three exoplanets in the Trappist-1 system orbit the star within the habitable zone: Trappist-1e, f and g. The star is a cool dwarf in the constellation of Aquarius and was identified as a host of exoplanets in 2017, using data from NASA’s Kepler Space Telescope and the Spitzer Space Telescope. The system is named after the Transiting Planets and PlanetesImals Small Telescope (TRAPPIST).
A team of researchers from the Max Planck Institute for Astronomy and the Commissariat aux Énergies Atomiques (CEA) in Paris have been studying Trappist-1b. They used the Mid-Infrared Instrument (MIRI) of the James Webb Space Telescope to measure thermal radiation from the exoplanet, and their findings have been published in the journal Nature Astronomy. Previous studies concluded that Trappist-1b was a dark, rocky planet with no atmosphere. The new study has turned this conclusion on its head.
The team’s measurements revealed something else: a world with a surface composed of largely unchanged material. Typically, the surface of a world with no atmosphere is weathered by radiation and peppered with meteorite impacts. The study found that the surface material is around 1,000 years old, far younger than the planet itself, which is thought to be several billion years old.
The team postulate that this could indicate volcanic activity or plate tectonics, since the planet is large enough to still retain internal heat from its formation. It’s also possible that the observations reveal a thick atmosphere rich in carbon dioxide. At first the observations suggested there was no carbon dioxide layer, since the team found no evidence of thermal radiation absorption. However, their models showed that atmospheric haze can reverse the temperature profile of a carbon dioxide-rich atmosphere. Typically the ground is the warmest region, but in the case of Trappist-1b, the atmosphere may absorb stellar radiation, heating the upper layers, which then radiate infrared energy themselves. A similar process is seen on Saturn’s moon Titan.
Fortunately, the alignment of the planetary system means that Trappist-1b passes directly in front of its star, so spectroscopic observations of the dimming starlight during these transits can reveal the profile of the atmosphere. Further studies are now underway to gather more observations and determine the nature of the atmosphere around Trappist-1b.
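The transit technique turns on a simple ratio: the fraction of starlight blocked equals the square of the planet-to-star radius ratio. A rough sketch with approximate published sizes for Trappist-1b and its star (round figures for illustration, not values from this study):

```python
# Transit depth = (R_planet / R_star)^2 : the fraction of starlight blocked.
R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

r_planet = 1.1 * R_EARTH_KM  # Trappist-1b is roughly 1.1 Earth radii
r_star = 0.12 * R_SUN_KM     # Trappist-1 is roughly 0.12 solar radii

depth = (r_planet / r_star) ** 2
print(f"Transit depth: {depth * 100:.2f}% of the star's light")
```

Because the star is so small, an Earth-sized planet blocks roughly 0.7% of its light, an unusually deep transit, which is part of what makes the Trappist-1 system such a rewarding target for atmospheric follow-up.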