The Standard Model of cosmology describes how the Universe has evolved at large scales. Six numbers define the model, and a team of researchers has used them to build simulations of the Universe. The results of these simulations were then used to train a machine learning algorithm, which was set the task of estimating five of those cosmological parameters, a task it completed with remarkable precision.
The Standard Model incorporates a number of elements: the Big Bang, dark energy, cold dark matter, ordinary matter and the cosmic microwave background radiation. It works well to describe the large-scale structure of the Universe, but there are gaps in our understanding. Quantum physics can describe the Universe at the smallest scales but struggles with gravity, and there are open questions around dark matter and dark energy too. Answering these would sharpen our picture of the evolution and structure of the Universe.
A team of researchers from the Flatiron Institute has managed to extract hidden information from the distribution of galaxies to estimate the values of five of these parameters. The precision is a marked improvement on previous attempts: using AI, the team's result for the parameter that describes the clumpiness of the Universe carries less than half the uncertainty of earlier estimates. Their estimates of the other parameters also closely match values derived from other observations. The paper was published in Nature Astronomy on 21 August.
The team generated 2,000 simulated universes, each with carefully specified cosmological parameters, including the expansion rate and the distribution and clumpiness of ordinary matter, dark matter and dark energy. The output of each simulation was then compressed into manageable data sets, which were compared against data from more than one hundred thousand real galaxies. From this comparison, the researchers could estimate the parameters of the real Universe.
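Broadly speaking, the workflow is: simulate universes with known parameters, compress each simulated galaxy catalogue into a compact summary, learn the mapping from summaries back to parameters, and then apply that mapping to the real data. The toy sketch below illustrates the idea only; the draw_parameters and toy_simulate functions, the choice of summary statistics, and the scikit-learn regressor are stand-in assumptions, not the team's actual pipeline.

```python
# Rough, illustrative sketch of simulation-based parameter inference:
# draw cosmological parameters, "simulate" a universe, compress it to a
# summary vector, then learn the inverse mapping. toy_simulate() stands
# in for a real N-body / galaxy-formation simulation and is purely an
# assumption for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N_SIMS = 2000  # the team ran roughly 2,000 simulated universes

def draw_parameters(n):
    """Draw toy values for five parameters (e.g. matter density,
    clumpiness, expansion rate) from broad, made-up priors."""
    return rng.uniform(low=[0.1, 0.6, 0.02, 0.5, 0.8],
                       high=[0.5, 1.0, 0.06, 0.9, 1.2],
                       size=(n, 5))

def toy_simulate(theta):
    """Stand-in for a full simulation plus compression: map parameters
    to a noisy nonlinear summary vector."""
    summary = np.concatenate([theta ** 2, np.sin(3 * theta)], axis=1)
    return summary + rng.normal(scale=0.01, size=summary.shape)

# 1. Simulate universes with known parameters and compress the output.
theta_train = draw_parameters(N_SIMS)
summaries = toy_simulate(theta_train)

# 2. Learn the mapping from compressed data back to the parameters.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0)
model.fit(summaries, theta_train)

# 3. Apply the trained model to the "observed" summary statistics.
theta_true = draw_parameters(1)
observed_summary = toy_simulate(theta_true)
print("estimated parameters:", model.predict(observed_summary)[0])
print("true parameters:     ", theta_true[0])
```

In the real analysis the compression and the inference model are far more sophisticated and also quantify the uncertainty on each parameter, which is where the factor-of-two improvement in the clumpiness parameter shows up.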
The parameters the team managed to fine-tune are those that describe how the Universe behaves at the largest scales. They are, essentially, the settings of the Universe, and include the amounts of ordinary matter, dark matter and dark energy, the conditions following the Big Bang, and just how clumpy matter is. Previously these settings were calculated using observations of the large-scale structure traced by galaxy clusters. Arriving at a more precise set of values would require observations at smaller scales, but this has not been possible.
Instead of new observations, the team used their AI approach to extract the small-scale information hidden in the existing observational data. At the heart of the approach is an AI system that learned how to correlate the parameters with the observed structure of the Universe at those small scales.
In the future the team hopes to use the new approach on other problems. The lingering uncertainty in the value of the Hubble Constant is one example where they hope AI can help pin down the number. Over the next few years, as observational data becomes more detailed, both the Hubble Constant and the other settings of the Universe should become far better constrained, and with them our understanding of the Universe.
Source: Astrophysicists Use AI to Precisely Calculate Universe's 'Settings'