Brief history of “dark energy”

The revival of a proposal challenging the existence of dark energy, a component thought to make up 70% of the energy content of the universe, has reignited a long-running debate.

Dark energy and dark matter are concepts proposed to account for observations that our current understanding of physics cannot explain.

On the scale of galaxies, the force of gravity appears stronger than the visible matter, the particles that emit light, can account for. So we posit dark matter particles, making up roughly 25% of the total mass-energy of the Universe, even though they have never been directly detected.
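
The classic example is a galaxy rotation curve. For a star on a circular orbit of radius r, Newtonian gravity predicts an orbital speed (a textbook relation, stated here purely for illustration):

    v(r) = \sqrt{\frac{G\,M(<r)}{r}}

where M(<r) is the mass enclosed within the orbit. Measured rotation speeds stay roughly flat far beyond the visible disc, so M(<r) must keep growing in proportion to r: far more mass than the starlight reveals.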

On the vast scales over which the Universe is expanding, gravity instead appears weaker than expected in a universe containing only particles, whether ordinary or dark. So we invoke “dark energy”: a weak anti-gravitating agent that operates independently of matter.
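
In round numbers, using the figures quoted above, the standard budget reads:

    \Omega_{\text{ordinary matter}} \approx 0.05, \qquad \Omega_{\text{dark matter}} \approx 0.25, \qquad \Omega_{\text{dark energy}} \approx 0.70

so the two dark components together account for about 95% of everything, a figure that will recur below.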

The concept of dark energy is as old as general relativity itself. A century ago, Albert Einstein built it into his first application of the theory to cosmology.

Einstein believed, incorrectly, that he could strike a perfect balance between matter’s gravitational self-attraction and this cosmic anti-gravity. He could not conceive of a Universe with a beginning, and wanted one that stayed unchanged for all time.

In 1917, our knowledge of the Universe was extremely limited. Whether the galaxies were really objects at immense distances was still hotly debated.

Einstein faced a dilemma. The essence of his theory, summarised decades later in the introduction of a famous textbook, is this:

Matter tells space how to curve, and space tells matter how to move.

The implication is that space itself is dynamical: it wants to expand or contract, curving in step with matter, and it never simply stands still.

In 1922, Alexander Friedmann saw this clearly. Rather than trying to balance matter against dark energy as Einstein had, he found solutions of Einstein’s equations describing universes that expand or contract.

With matter alone, the expansion can only decelerate. Adding anti-gravitating dark energy, however, allows the expansion to accelerate.
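
Friedmann’s result can be stated compactly. For a homogeneous universe with scale factor a(t), matter density ρ, pressure p, spatial curvature k and cosmological constant Λ, Einstein’s equations reduce to (in one standard convention):

    \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3}, \qquad \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}

With matter alone (Λ = 0, p = 0) the second equation forces ä < 0, deceleration; a sufficiently large Λ makes ä > 0, acceleration.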

Since the late 1990s, several independent observations have appeared to demand an accelerating expansion, with dark energy supplying about 70% of the Universe’s energy density. This conclusion, however, rests on the model of expansion laid down in the 1920s, which has remained unchanged ever since.
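
The arithmetic behind that conclusion is simple, assuming the Friedmann expansion law. For a universe of matter and dark energy, the present deceleration parameter is

    q_0 = \tfrac{1}{2}\Omega_m - \Omega_\Lambda \approx \tfrac{1}{2}(0.3) - 0.7 = -0.55

and the negative sign is precisely what “accelerating expansion” means.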

Einstein’s equations are vastly harder to work with than those of Isaac Newton’s theory of gravity, and not merely because there are more of them.

Unfortunately, there are basic questions Einstein never answered: on what scales does matter tell space how to curve? What is the largest object that moves as a single particle in response? And how do we describe the Universe across different scales?

The century-old approximation adopted by Einstein and Friedmann conveniently sidesteps these questions by assuming that, on average, the universe expands uniformly, as if all cosmic structures could be blended into a smooth, featureless soup.
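
In the standard framework this smoothing is encoded in the Friedmann-Lemaître-Robertson-Walker metric, in which a single scale factor a(t) carries the entire expansion and a single constant k the spatial curvature:

    ds^2 = -c^2\,dt^2 + a^2(t)\left[\frac{dr^2}{1-kr^2} + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right]

Every structure in the Universe is assumed to ride along with this one uniform expansion.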

This homogenising approximation was justified early in cosmic history. From the cosmic microwave background, the relic radiation of the Big Bang, we know that variations in the density of matter were tiny when the Universe was less than a million years old.

The Universe today, however, is anything but uniform. Gravitational instability produced stars, galaxies, clusters of galaxies, and ultimately the large-scale structure known as the “cosmic web”: a pattern dominated by voids, surrounded by galaxy-filled sheets and threaded by wispy filaments.

In standard cosmology we assume a background that expands uniformly, ignoring all cosmic structures, and then run computer simulations on top of it using Sir Isaac Newton’s three-century-old theory of gravity. These simulations produce a structure convincingly similar to the observed cosmic web, but only with dark energy and dark matter included as crucial ingredients.
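
The Newtonian core of such simulations is easy to sketch. What follows is a toy illustration in arbitrary units, not anyone’s production code: real codes add comoving coordinates, periodic boundaries and fast tree or mesh force solvers, and every name and parameter value here is illustrative.

    import numpy as np

    G = 1.0     # gravitational constant (arbitrary units)
    EPS = 0.05  # softening length, avoids divergent two-body forces

    def accelerations(pos, mass):
        """Softened pairwise Newtonian accelerations, direct sum, O(N^2)."""
        diff = pos[None, :, :] - pos[:, None, :]   # vector from particle i to j
        dist2 = (diff ** 2).sum(-1) + EPS ** 2
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)              # no self-force
        return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

    def leapfrog(pos, vel, mass, dt, steps):
        """Kick-drift-kick leapfrog integration."""
        acc = accelerations(pos, mass)
        for _ in range(steps):
            vel += 0.5 * dt * acc   # half kick
            pos += dt * vel         # drift
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc   # half kick
        return pos, vel

    # Cold, slightly clumpy initial conditions: gravity amplifies the
    # density contrasts, the same instability that builds the cosmic web.
    rng = np.random.default_rng(42)
    n = 200
    pos = rng.normal(size=(n, 3))
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=500)

The leapfrog scheme is the standard choice in this setting because it is symplectic, so energy errors stay bounded over long integrations.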

Yet despite invoking ingredients that amount to 95% of the Universe’s energy density to explain the observations, the model still faces problems ranging from tensions to anomalies.

Furthermore, standard cosmology fixes the curvature of space to be uniform everywhere, decoupled from matter. This violates the spirit of Einstein’s central insight, that matter tells space how to curve.

We are not yet using general relativity to its full capacity. The standard model can instead be summarised as: Friedmann tells space how to curve, and Newton tells matter how to move.

Since the early 2000s, cosmologists have been investigating whether Einstein’s equations, by coupling matter and curvature on small scales, could produce backreaction on larger scales: an average expansion that departs from the uniform expansion of a homogeneous universe.

Early on, both the matter and the curvature are distributed almost uniformly. But as the cosmic web grows in complexity, small-scale variations in curvature grow large, and the average expansion can drift away from the prediction of standard cosmology.
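
This idea can be made precise. Averaging Einstein’s equations over a spatial domain D of an irrotational dust universe, as Thomas Buchert showed, yields an acceleration equation for the domain’s effective scale factor a_D:

    3\,\frac{\ddot a_{\mathcal D}}{a_{\mathcal D}} = -4\pi G\,\langle\rho\rangle_{\mathcal D} + \mathcal{Q}_{\mathcal D}, \qquad \mathcal{Q}_{\mathcal D} = \frac{2}{3}\left(\langle\theta^2\rangle_{\mathcal D} - \langle\theta\rangle_{\mathcal D}^{2}\right) - 2\,\langle\sigma^2\rangle_{\mathcal D}

The kinematical backreaction Q_D is built from the variance of the local expansion rate θ and from the shear σ. If the variance grows large enough that Q_D > 4πG⟨ρ⟩_D, the average expansion accelerates with no dark energy at all.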

A team in Budapest and Hawaii recently ran numerical experiments challenging dark energy using otherwise standard Newtonian simulations, but with a non-standard way of evolving their code forward in time, intended to capture the backreaction effect.

Intriguingly, the expansion law fitted to data from the Planck satellite closely matches that of a backreaction model developed about a decade ago: the timescape cosmology, which is based on general relativity. In the timescape model, measurements of time and distance must be recalibrated to account for the differences in curvature between galaxies and the empty voids between them. One consequence is that the Universe no longer has a single age.
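
Schematically, and greatly simplifying the actual model, the timescape picture relates the proper time τ of an observer in a galaxy to the volume-average time t through a lapse function γ̄ that grows as voids come to dominate:

    d\tau = \frac{dt}{\bar\gamma(t)}, \qquad \bar\gamma > 1 \ \text{today}

so observers in galaxies and hypothetical observers in empty voids assign different ages to the same Universe.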

Over the coming decade, experiments such as the Euclid satellite and the CODEX experiment will be able to test whether cosmic expansion follows Friedmann’s homogeneous law or that of a backreaction model.
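
One such probe is the redshift drift (the Sandage-Loeb test), which CODEX is designed to measure: watch the redshift z of a distant source change in real time over decades. In a homogeneous Friedmann universe the predicted drift is

    \frac{dz}{dt_0} = (1+z)\,H_0 - H(z)

which is negative at every redshift for matter alone but positive at low redshift once dark energy dominates; a backreaction cosmology predicts its own distinctive drift.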

To be ready, we must resist the tendency to bet everything on one cosmological model, a point stressed by Avi Loeb, chair of astronomy at Harvard University, who argues for a diversified approach to understanding the universe.

To avoid stagnation and to keep its scientific culture vibrant, Loeb argues, a research field should maintain multiple interpretations of the data, so that new experiments aim to select the correct one. Conferences currently tend to concentrate on experimental results and observations; giving conceptual issues equal attention would foster a healthier dialogue between different points of view.

Most researchers acknowledge that backreaction effects exist. The contentious question is whether they can shift the mass-energy budget of standard cosmology by more than 1% or 2%.

Any backreaction proposal that does away with dark energy must also explain why the average expansion appears so uniform despite the inhomogeneity of the cosmic web, something standard cosmology simply assumes without giving a reason.

Since Einstein’s equations allow space to expand in extremely complicated ways, some simplifying principle is needed for their average behaviour on large scales. That is the approach the timescape cosmology takes.

The roots of any such simplifying principle probably lie in the early Universe, which was far simpler than the Universe today. For nearly four decades, scientists have invoked inflationary universe models to account for that early simplicity.

Some inflationary models have had notable successes, but data from the Planck satellite have ruled many of them out. Those that remain offer tantalising hints of deeper fundamental principles.

Many physicists still regard space as a fixed, unbroken continuum that exists independently of the matter fields inhabiting it. But if we take seriously the lesson of relativity, that space and time have meaning only in relation to each other, we may need to rethink such basic concepts.

Perhaps the spacetime we know emerged together with the condensation of the first massive particles: time, which is measured by particles with non-zero rest mass, came into existence along with them.

Whatever the final theory turns out to be, it will probably retain the essential advance of general relativity, the dynamical coupling between matter and geometry, but at the quantum level.
