Here is an incomplete list of additional possibilities; see also [48, 58]. Observations of numbers of objects vs. redshift are another possibility. Clusters of galaxies would be possible candidates for such a measurement, but they are insufficiently isotropic; alternatives, however, have been proposed, using for example the quasar correlation function as determined from redshift surveys [ ], or the Lyman-alpha forest [ ].
In a related effect, the dynamics of large-scale structure can be affected by a nonzero cosmological constant; if a protocluster, for example, is anisotropic, it can begin to contract along a minor axis while the universe is matter-dominated and along its major axis while the universe is vacuum-dominated. Although small, such effects may be observable in individual clusters [ ] or in redshift surveys [ 19 ].
A different version of the distance-redshift test uses extended lobes of radio galaxies as modified standard yardsticks. Inspiralling compact binaries at cosmological distances are potential sources of gravitational waves. Finally, consistency of the age of the universe with the ages of its oldest constituents is a classic test of the expansion history. Measurements of geometric parallax to nearby stars from the Hipparcos satellite have, at the least, called into question previous determinations of the ages of the oldest globular clusters, which are now thought to be perhaps 12 billion rather than 15 billion years old (see the discussion in [88]).
It is therefore unclear whether the age issue forces a cosmological constant upon us, but by now it seems forced upon us for other reasons.
As discussed in Section 1, it is somewhat unfair to characterize this discrepancy as a factor of 10^120, since energy density can be expressed as a mass scale to the fourth power; in terms of mass scales, the mismatch is "only" a factor of ~10^30. Although the mechanism which suppresses the naive value of the vacuum energy is unknown, it seems easier to imagine a hypothetical scenario which makes it exactly zero than one which sets it to just the right value to be observable today.
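The fourth-power point can be checked with back-of-envelope arithmetic (a toy sketch; the Planck and observed scales below are round-number assumptions, not values quoted in the text):

```python
import math

# Toy order-of-magnitude check (assumed round-number scales, in eV):
m_planck_ev = 1.2e28   # Planck mass scale, ~10^28 eV
m_obs_ev = 1e-3        # scale of the observed vacuum energy, ~10^-3 eV

# Energy densities scale as (mass scale)^4, so the same mismatch looks
# very different depending on which quantity is quoted.
ratio_energy_density = (m_planck_ev / m_obs_ev) ** 4
ratio_mass_scale = m_planck_ev / m_obs_ev

print(round(math.log10(ratio_energy_density)))  # ~124 decades in energy density
print(round(math.log10(ratio_mass_scale)))      # ~31 decades in mass scale
```

The same physical mismatch reads as ~10^120 or ~10^30 depending on whether one quotes energy densities or mass scales, which is the sense in which the factor-of-10^120 framing is "somewhat unfair".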
Keep in mind that it is the zero-temperature, late-time vacuum energy which we want to be small; it is expected to change at phase transitions, and a large value in the early universe is a necessary component of inflationary universe scenarios [6]. If the recent observations pointing toward a cosmological constant of astrophysically relevant magnitude are confirmed, we will be faced with the challenge of explaining not only why the vacuum energy is smaller than expected, but also why it has the specific nonzero value it does.
Although initially investigated for other reasons, supersymmetry (SUSY) turns out to have a significant impact on the cosmological constant problem, and may even be said to solve it halfway. SUSY is a spacetime symmetry relating fermions and bosons to each other. Unlike most non-gravitational field theories, in supersymmetry the total energy of a state has an absolute meaning; the Hamiltonian is related to the supercharges in a straightforward way, schematically H = (1/4) Σ_α (Q_α Q_α† + Q_α† Q_α), so the energy of any state is manifestly non-negative.
More concretely, in a given supersymmetric theory we can explicitly calculate the contributions to the energy from vacuum fluctuations and from the scalar potential V. In the case of vacuum fluctuations, contributions from bosons are exactly canceled by equal and opposite contributions from fermions when supersymmetry is unbroken. So the vacuum energy of a supersymmetric state in a globally supersymmetric theory will vanish. This represents rather less progress than it might appear at first sight, since supersymmetric states manifest a degeneracy in the mass spectrum of bosons and fermions, a feature not apparent in the observed world; supersymmetry, if it is realized in nature, must therefore be broken.
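The boson-fermion cancellation can be illustrated with a deliberately crude mode sum (a toy model, not a field-theory computation; the mode list and masses are arbitrary assumptions):

```python
import math

def vacuum_energy(m_boson, m_fermion, k_modes):
    """Toy zero-point sum: +omega/2 per bosonic mode, -omega/2 per fermionic mode."""
    total = 0.0
    for k in k_modes:
        total += 0.5 * math.sqrt(k**2 + m_boson**2)    # bosonic zero-point energy
        total -= 0.5 * math.sqrt(k**2 + m_fermion**2)  # fermionic partner, opposite sign
    return total

modes = [0.1 * n for n in range(1, 1001)]   # arbitrary set of momentum modes

print(vacuum_energy(1.0, 1.0, modes))       # 0.0: equal masses (unbroken SUSY) cancel exactly
print(vacuum_energy(1.0, 0.5, modes) > 0)   # True: a mass splitting (broken SUSY) leaves a residue
```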
The above results imply that non-supersymmetric states have a positive-definite vacuum energy. In the real world, the fact that accelerator experiments have not discovered superpartners for the known particles of the Standard Model implies that M_SUSY is of order 10^3 GeV or higher. Thus, we are left with a discrepancy. Comparison of this discrepancy with the naive discrepancy (54) is the source of the claim that SUSY can solve the cosmological constant problem halfway (at least on a log scale).
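On a log scale, the "halfway" claim is simple bookkeeping (round-number scales assumed for illustration):

```python
import math

# Decades of discrepancy in energy density (which scales as mass^4),
# using assumed round-number scales in eV.
m_pl = 1e28     # Planck scale
m_susy = 1e12   # SUSY breaking scale, >= 10^3 GeV
m_vac = 1e-3    # observed vacuum energy scale

naive_gap = 4 * (math.log10(m_pl) - math.log10(m_vac))   # without SUSY
susy_gap = 4 * (math.log10(m_susy) - math.log10(m_vac))  # with TeV-scale SUSY

print(naive_gap, susy_gap)  # ~124 vs ~60 decades: roughly half the problem, on a log scale
```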
As mentioned, however, this analysis is strictly valid only in flat space. In curved spacetime, the global transformations of ordinary supersymmetry are promoted to the position-dependent gauge transformations of supergravity. The scalar potential then takes the schematic form V = e^{K/M_Pl^2} ( K^{ij*} D_i W (D_j W)* − 3 |W|^2 / M_Pl^2 ), where K is the Kähler potential, W the superpotential, and D_i W the Kähler-covariant derivative.
But with gravity, in addition to the non-negative first term we find a second term providing a non-positive contribution. We are therefore free to imagine a scenario in which supersymmetry is broken in exactly the right way, such that the two terms in parentheses cancel to fantastic accuracy, but only at the cost of an unexplained fine-tuning (see for example [63]). At the same time, supergravity is not by itself a renormalizable quantum theory, and therefore it may not be reasonable to hope that a solution can be found purely within this context. Unlike supergravity, string theory appears to be a consistent and well-defined theory of quantum gravity, and therefore calculating the value of the cosmological constant should, at least in principle, be possible.
On the other hand, the number of vacuum states seems to be quite large, and none of them, to the best of our current knowledge, features three large spatial dimensions, broken supersymmetry, and a small cosmological constant. At the same time, there are reasons to believe that any realistic vacuum of string theory must be strongly coupled [70]; therefore, our inability to find an appropriate solution may simply be due to the technical difficulty of the problem.
For general introductions to string theory, see [ ]; for cosmological issues, see [21]. String theory is naturally formulated in more than four spacetime dimensions; it comes in five consistent ten-dimensional versions, plus eleven-dimensional supergravity. In each of these six cases, the solution with the maximum number of uncompactified, flat spacetime dimensions is a stable vacuum preserving all of the supersymmetry. To bring the theory closer to the world we observe, the extra dimensions can be compactified on a manifold whose Ricci tensor vanishes.
There are a large number of possible compactifications, many of which preserve some but not all of the original supersymmetry. If enough SUSY is preserved, the vacuum energy will remain zero; generically there will be a manifold of such states, known as the moduli space.
Of course, to describe our world we want to break all of the supersymmetry. Investigations in contexts where this can be done in a controlled way have found that the induced cosmological constant vanishes at the classical level, but a substantial vacuum energy is typically induced by quantum corrections [ ]. Moore [ ] has suggested that Atkin-Lehner symmetry, which relates strong and weak coupling on the string worldsheet, can enforce the vanishing of the one-loop quantum contribution in certain models (see also [67, 68]); generically, however, there would still be an appreciable contribution at two loops.
Thus, the search is still on for a four-dimensional string theory vacuum with broken supersymmetry and vanishing or very small cosmological constant. See [ 69 ] for a general discussion of the vacuum problem in string theory. The difficulty of achieving this in conventional models has inspired a number of more speculative proposals, which I briefly list here.
In three spacetime dimensions supersymmetry can remain unbroken, maintaining a zero cosmological constant, in such a way as to break the mass degeneracy between bosons and fermions [ ]. In a holographic theory, the number of degrees of freedom in a region grows as the area of its boundary, rather than as its volume. Therefore, the conventional computation of the cosmological constant due to vacuum fluctuations conceivably involves a vast overcounting of degrees of freedom. We might imagine that a more correct counting would yield a much smaller estimate of the vacuum energy [ 20 , 57 , , ], although no reliable calculation has been done as yet.
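One way to see why an area-based counting is appealing: a frequently quoted holographic estimate bounds the vacuum energy in a Hubble volume by ρ ~ M_Pl^2 H_0^2 rather than the naive M_Pl^4. The numbers below are assumed round values, but the resulting scale lands surprisingly close to the observed dark-energy density:

```python
import math

m_pl = 2.4e27   # reduced Planck mass in eV (assumed round value)
h0 = 1.5e-33    # present Hubble parameter in eV (assumed round value)

rho_naive = m_pl ** 4        # naive vacuum-fluctuation estimate, eV^4
rho_holo = (m_pl * h0) ** 2  # holographic (area-limited) estimate, eV^4
rho_obs = (2.3e-3) ** 4      # observed dark-energy scale, ~(10^-3 eV)^4

print(math.log10(rho_naive / rho_obs))  # ~121 decades too large
print(math.log10(rho_holo / rho_obs))   # within an order of magnitude of the observed value
```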
The absence of manifest SUSY in our world leads us to ask whether the beneficial aspect of canceling contributions to the vacuum energy could be achieved even without a truly supersymmetric theory. Kachru, Kumar and Silverstein [ ] have constructed such a string theory, and argue that the perturbative contributions to the cosmological constant should vanish (although the actual calculations are somewhat delicate, and not everyone agrees [ ]). If such a model could be made to work, it is possible that small non-perturbative effects could generate a cosmological constant of an astrophysically plausible magnitude [ ].
While gravity is harder to confine to a brane, phenomenologically acceptable scenarios can be constructed if either the extra dimensions are of any size less than a millimeter [10, 13], or if there is significant spacetime curvature in a non-compact extra dimension [ ]. Although these scenarios do not offer a simple solution to the cosmological constant problem, the relationship between the vacuum energy and the expansion rate can differ from our conventional expectation (see for example [32]), and one is free to imagine that further study may lead to a solution in this context (see for example [40]).
Of course, string theory might not be the correct description of nature, or its current formulation might not be directly relevant to the cosmological constant problem. For example, a solution may be provided by loop quantum gravity [98], or by a composite graviton [ ]. It is probably safe to believe that a significant advance in our understanding of fundamental physics will be required before we can demonstrate the existence of a vacuum state with the desired properties, not to mention answer the equally important question of why our world is based on such a state rather than on one of the highly supersymmetric states that appear to be perfectly good vacua of string theory.
The anthropic principle [25] is essentially the idea that some of the parameters characterizing the universe we observe may be determined not directly by the fundamental laws of physics, but in part by the truism that intelligent observers will only ever experience conditions which allow for the existence of intelligent observers. Many professional cosmologists view this principle in much the same way as many traditional literary critics view deconstruction: as somehow simultaneously empty of content and capable of working great evil.
Anthropic arguments are easy to misuse, and can be invoked as a way out of doing the hard work of understanding the real reasons behind why we observe the universe we do. Furthermore, a sense of disappointment would inevitably accompany the realization that there were limits to our ability to unambiguously and directly explain the observed universe from first principles. It is nevertheless possible that some features of our world have at best an anthropic explanation, and the value of the cosmological constant is perhaps the most likely candidate.
In such a case, our local conditions arise as some combination of the relative abundance of different environments and the likelihood that such environments would give rise to intelligence. We are therefore faced with the task of estimating quantitatively the likelihood of observing any specific value of Λ within such a scenario. The most straightforward anthropic constraint on the vacuum energy is that it must not be so high that galaxies never form [ ]. From the discussion in Section 2, however, the cosmological constant could be somewhat larger than observation allows and still be consistent with the existence of galaxies.
See for example [ ]. Garriga and Vilenkin [ ] argue on the basis of quantum cosmology that there can be a significant departure from a constant a priori distribution. Perhaps the most significant weakness of this point of view is the assumption that there is a continuum of possibilities for the vacuum energy density. Such possibilities correspond to choices of vacuum states with arbitrarily similar energies. If these states were connected to each other, there would be local fluctuations which would appear to us as massless fields, which are not observed (see Section 4).
If on the other hand the vacua are disconnected, it is hard to understand why all possible values of the vacuum energy are represented, rather than the differences in energies between different vacua being given by some characteristic particle-physics scale such as M_Pl or M_SUSY.
For one scenario featuring discrete vacua with densely spaced energies, see [23]. It will therefore again require advances in our understanding of fundamental physics before an anthropic explanation for the current value of the cosmological constant can be accepted. The importance of the cosmological constant problem has engendered a wide variety of proposed solutions. This section will present only a brief outline of some of the possibilities, along with references to recent work; further discussion and references can be found in [48]. One approach which has received a great deal of attention is the famous suggestion by Coleman [59], that effects of virtual wormholes could set the cosmological constant to zero at low energies.
The essential idea is that wormholes (thin tubes of spacetime connecting macroscopically large regions) can act to change the effective value of all the observed constants of nature. If we calculate the wave function of the universe by performing a Feynman path integral over all possible spacetime metrics with wormholes, the dominant contribution will be from those configurations whose effective values for the physical constants extremize the action. These turn out to be, under a certain set of assumed properties of Euclidean quantum gravity, configurations with zero cosmological constant at late times.
Thus, quantum cosmology predicts that the constants we observe are overwhelmingly likely to take on values which imply a vanishing total vacuum energy. However, subsequent investigations have failed to inspire confidence that the desired properties of Euclidean quantum cosmology are likely to hold, although it is still something of an open question; see discussions in [ , 48 ].
Another route one can take is to consider alterations of the classical theory of gravity. The simplest possibility is to consider adding a scalar field to the theory, with dynamics which cause the scalar to evolve to a value for which the net cosmological constant vanishes (see for example [74]). While this approach has not led to a believable solution to the cosmological constant problem, it does change the context in which it appears, and may induce different values for the effective vacuum energy in different branches of the wavefunction of the universe.
Like supersymmetry, conformal invariance is not manifest in the Standard Model of particle physics. However, it has been proposed that quantum effects could restore conformal invariance on length scales comparable to the cosmological horizon size, working to cancel the cosmological constant (for some examples see [12, 11]).
At this point it remains unclear whether this suggestion is compatible with a more complete understanding of quantum gravity, or with standard cosmological observations. A final mechanism to suppress the cosmological constant, related to the previous one, relies on quantum particle production in de Sitter space, analogous to Hawking radiation around black holes. The idea is that the effective energy-momentum tensor of such particles may act to cancel out the bare cosmological constant (for recent attempts see [1]).
There is currently no consensus on whether such an effect is physically observable (see for example [ ]). If inventing a theory in which the vacuum energy vanishes is difficult, finding a model that predicts a vacuum energy which is small but not quite zero is harder still. Along these lines, there are various numerological games one can play. The challenging part of this program, of course, is to devise such a theory. Scenarios along these lines have been explored [ ]; the major hurdle to be overcome is explaining why the energy difference between the true and false vacua is so much smaller than one would expect.
This possibility has been extensively explored of late, and a number of candidates have been put forward. A large number of phenomenological models of this type have been investigated, starting with the early work in [89]; see [ ] for many more references. Current observations of supernovae, large-scale structure, gravitational lensing, and the CMB already provide interesting limits on w_X [56, 93, 54, 77], and future data will be able to do much better [77, 60].
It is clear that the favored value for the equation-of-state parameter is near −1, that of a true cosmological constant, although other values are not completely ruled out. (Figure: thin contours represent limits from CMB and large-scale structure measurements, while thick contours are those from SNe observations; solid lines apply to models with constant w_X, while dashed lines apply to models of dynamical scalar fields. The constraints are portrayed separately on the left, and combined on the right.) A scalar field rolling in its potential V(φ) obeys the equation of motion φ̈ + 3Hφ̇ + dV/dφ = 0. This equation is similar to (45), with analogous solutions.
The Hubble parameter acts as a friction term; for generic potentials, the field will be overdamped (and thus approximately constant) when its effective mass is small compared to the expansion rate, √|V''(φ)| < 3H, and underdamped (and thus free to roll) when √|V''(φ)| > 3H. The energy density is ρ_φ = φ̇²/2 + V(φ), and the pressure is p_φ = φ̇²/2 − V(φ), implying an equation-of-state parameter w = p/ρ = (φ̇²/2 − V(φ)) / (φ̇²/2 + V(φ)). There are many reasons to consider dynamical dark energy as an alternative to a cosmological constant. First and foremost, it is a logical possibility which might be correct, and can be constrained by observation.
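The overdamped and underdamped regimes can be checked with a minimal numerical integration (a sketch in arbitrary units, assuming a quadratic potential V = m²φ²/2 and, artificially, a constant Hubble rate):

```python
def mean_w(m, H, phi=1.0, dphi=0.0, dt=1e-3, steps=200_000, tail=10_000):
    """Integrate phi'' + 3*H*phi' + m^2*phi = 0 (quadratic potential, fixed H)
    and return the equation-of-state parameter w averaged over the final steps."""
    ws = []
    for i in range(steps):
        ddphi = -3.0 * H * dphi - m**2 * phi   # Hubble friction + potential slope
        dphi += ddphi * dt
        phi += dphi * dt
        if i >= steps - tail:
            kinetic = 0.5 * dphi**2
            potential = 0.5 * m**2 * phi**2
            ws.append((kinetic - potential) / (kinetic + potential))
    return sum(ws) / len(ws)

print(mean_w(m=0.1, H=10.0))  # close to -1: overdamped field barely rolls, acts like Lambda
print(mean_w(m=10.0, H=0.1))  # near 0: underdamped field oscillates, matter-like on average
```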
But most interestingly, one might wonder whether replacing a constant parameter Λ with a dynamical field could allow us to relieve some of the burden of fine-tuning that inevitably accompanies the cosmological constant. To date, investigations have focused on scaling or tracker models of quintessence, in which the scalar field energy density can parallel that of matter or radiation, at least for part of its history [86, 62]. Of course, we do not want the dark energy density to redshift away as rapidly as that in matter during the current epoch, or the universe would not be accelerating.
Tracker models can be constructed in which the vacuum energy density at late times is robust, in the sense that it does not depend sensitively on the initial conditions for the field. Indeed, it is hard to imagine how this could help but be the case; unlike the case of the axion solution to the strong-CP problem, we have no symmetry to appeal to that would enforce a small vacuum energy, much less a particular small nonzero number.
Quintessence models also introduce new naturalness problems in addition to those of a cosmological constant. For the field to be slowly rolling today, its effective mass must be of order the present Hubble scale, m_φ ~ H_0 ~ 10^−33 eV. By particle-physics standards, this is an incredibly small number; masses of scalar fields tend to be large in the absence of a symmetry to protect them. In addition, such a light field will generically couple to ordinary matter. Such interactions are potentially observable, both via fifth-force experiments and searches for time-dependence of the constants of nature, and current limits imply that there must be suppression of the quintessence couplings by several orders of magnitude over what would be expected [47, 53].
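The mass scale in question follows from requiring the field to evolve on roughly the present Hubble time; converting H_0 to an energy makes the point (H_0 ≈ 70 km/s/Mpc is an assumed value; the constants are standard):

```python
HBAR_EV_S = 6.582e-16   # hbar in eV*s
KM_PER_MPC = 3.086e19   # kilometres in one megaparsec

h0_km_s_mpc = 70.0                    # assumed Hubble constant
h0_per_s = h0_km_s_mpc / KM_PER_MPC   # H0 in s^-1
m_phi_ev = HBAR_EV_S * h0_per_s       # hbar*H0 as an energy scale

print(m_phi_ev)  # ~1.5e-33 eV: the required quintessence mass scale
```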
The only known way to obtain such a suppression is through the imposition of an approximate global symmetry (which would also help explain the low mass of the field), of the type characteristic of pseudo-Goldstone boson models of quintessence, which have been actively explored [92, 91, 55]. Cosmological pseudo-Goldstone bosons are potentially detectable through their tendency to rotate polarized radiation from galaxies and the CMB [47].
See [ ] for a discussion of further fine-tuning problems in the context of supersymmetric models. Nevertheless, these naturalness arguments are by no means airtight, and it is worth considering specific particle-physics models for the quintessence field.
In addition to the pseudo-Goldstone boson models just mentioned, these include models based on supersymmetric gauge theories [ 31 , ], supergravity [ 37 , 5 ], small extra dimensions [ 29 , 24 ], large extra dimensions [ 28 , 22 ], quantum field theory effects in curved spacetime [ , ], and non-minimal couplings to the curvature scalar [ , , 8 , , , 64 , 30 ].
Finally, the possibility has been raised that the scalar field responsible for driving inflation may also serve as quintessence [90], although this proposal has been criticized for producing unwanted relics and isocurvature fluctuations [84]. There are other models of dark energy besides those based on nearly-massless scalar fields.
In July of 2015, the New Horizons mission made history when it conducted the first-ever flyby of Pluto. The images it returned revealed strange surface features that showed people for the first time how radically different the surface of Pluto is from Earth and the other planets of the inner Solar System. But strangely, they also showcased how this distant world is quite similar to Earth. On Earth, dunes are formed by wind-blown sand that creates repeated ridges in the desert or along beaches.
Similar patterns have been observed along river beds and alluvial plains, where water deposits sediment over time.
In all cases, dune-like formations are the result of solid particles being transported by a moving medium. However, when consulting images from the New Horizons probe, Telfer and his colleagues noted similar formations in the Sputnik Planitia region on Pluto. This region, which constitutes the western lobe of the heart-shaped Tombaugh Regio, is essentially a massive ice-covered basin. Already, researchers have noted that the surface appears to consist of irregular polygons bordered by troughs, which appear to be indications of convection cells.
But one area became more and more convincing with every pass. Another interesting feature is the dark streams that are a few kilometers long and are all aligned in the same direction. But equally interesting were the features that Telfer and his team noticed, which looked like dunes that ran perpendicular to the wind streaks. This indicated that they were transverse dunes, the kinds that pile up due to prolonged wind activity in the desert.
To determine if this was a plausible hypothesis, the researchers constructed models that took into account what kind of particles would make up these dunes. They concluded that either methane or nitrogen ice would be able to form sand-sized grains that could be transported by typical winds. Sublimation played a key role here: surface ice goes from a solid phase directly to a gas when warmed by sunlight. As Dr. Telfer explained, this conclusion was made possible thanks to the immense amount of support his team got, much of which came from the New Horizons Geology, Geophysics and Imaging Science Theme Team:
(Image: comparison of dune features on Pluto with those on Earth and Mars.) Many more missions are being sent to explore the Red Planet before a crewed mission takes place in the coming decades. Knowing how such formations were created is key to understanding the dynamics of the planet, which will help answer some of the deeper questions about what is taking place on the surface. Further Reading: ArsTechnica, Science. Pluto has been the focus of a lot of attention for more than a decade now.
For instance, a new study produced by researchers from the Southwest Research Institute and supported by NASA Rosetta funding indicates that Pluto may have formed from a billion comets crashing together. The study was authored by Dr. Christopher R. Glein. The Kuiper Belt is home to a vast population of asteroids and comets over 62 miles (100 km) across. The origin of Pluto is something that astronomers have puzzled over for some time.
An early theory held that Pluto began as an escaped moon of Neptune; however, this was disproven after dynamical studies showed that Pluto never approaches Neptune in its orbit. With the discovery of the Kuiper Belt in 1992, the true origin of Pluto began to become clear. Essentially, while Pluto is the largest object in the Kuiper Belt, it is similar in orbit and composition to the icy objects that surround it. On occasion, some of these objects are kicked out of the Kuiper Belt and become long-period comets in the Inner Solar System. As Dr. Glein, who co-authored the study with Dr. Waite Jr., explained: "We found an intriguing consistency between the estimated amount of nitrogen inside the glacier and the amount that would be expected if Pluto was formed by the agglomeration of roughly a billion comets or other Kuiper Belt objects similar in chemical composition to 67P, the comet explored by Rosetta."
In this scenario, Pluto formed from the very cold ices that were part of the protoplanetary disk, and would therefore have a chemical composition that more closely matches that of the Sun. In order to determine which was more likely, scientists needed to understand not only how much nitrogen is present at Pluto now (in its atmosphere and glaciers), but how much could have leaked out into space over the course of eons. They then needed to come up with an explanation for the current proportion of carbon monoxide to nitrogen.
Ultimately, the low abundance of carbon monoxide at Pluto could only be explained by burial in surface ices or destruction from liquid water. In the end, Dr. While the research certainly offers an interesting explanation for how Pluto formed, the solar model still satisfies some criteria. In the end, more research will be needed before scientists can conclude how Pluto formed.
And if data from the New Horizons or Rosetta missions should prove insufficient, perhaps another New Frontiers mission to Pluto will solve the mystery! In another historic first, the spacecraft will study these ancient objects in the hopes of learning more about the formation and evolution of the Solar System: on Jan. 1st, 2019, New Horizons is scheduled to fly past the Kuiper Belt object 2014 MU69, which orbits our Sun far beyond Pluto.
It will also be the farthest encounter ever achieved in the history of space exploration. MU69 was identified as one of two potential destinations for the New Horizons mission and was recommended to NASA by the mission science team. It was selected because of the immense opportunities for research it presented. Decades ago, Soviet cosmologist Rashid Sunyaev, now at the Max Planck Institute for Astrophysics in Garching, Germany, was among the first researchers to realize that the 21-centimetre line of hydrogen could also be used to study the primordial cosmos.
Sunyaev and his mentor, the late Yakov Zeldovich, thought of using the primordial hydrogen signal to test some early theories for how galaxies formed [2]. (Image: a simulation of the epoch of reionization in the early Universe. Ionized material around new galaxies (bright blue) would no longer emit 21-centimetre radiation; neutral hydrogen, still glowing at 21 cm, appears dark. Credit: M. Alvarez, R. Kaehler and T. Abel.) The idea of mapping the early Universe with 21-cm photons received only sporadic attention for three decades, but technological advancements in the past few years have made the technique look more tractable. The basics of radio detection remain the same; many radio telescopes are constructed from simple materials, such as plastic pipes and wire mesh. But the signal-processing capabilities of the telescopes have become much more advanced.
Consumer-electronics components that were originally developed for gaming and mobile phones now allow observatories to crunch enormous amounts of data with relatively little investment. Meanwhile, theoretical cosmologists have been making a more detailed and compelling case for the promise of 21-cm cosmology. Right after atomic hydrogen formed in the aftermath of the Big Bang, the only light in the cosmos was that which reaches Earth today as faint, long-wavelength radiation coming from all directions, a signal known as the cosmic microwave background (CMB). Some 14 billion years ago, this afterglow of the Big Bang would have looked uniformly orange to human eyes.
Then the sky would have reddened, before slowly dimming into pitch darkness; there was simply nothing else there to produce visible light, as the wavelengths of the background radiation continued to stretch through the infrared spectrum and beyond. (Source: Loeb, Phys. Rev. D 82.) Over time, theorists reckon, the evolving Universe would have left three distinct imprints on the hydrogen that filled space. The first event would have begun some 5 million years after the Big Bang, when the hydrogen became cool enough to absorb more of the background radiation than it emitted.
Evidence of this period should be detectable today in the CMB spectrum as a dip in intensity at a certain wavelength, a feature that has been dubbed the dark-ages trough. A second change arose over a hundred million years later, after matter had clumped together enough to create the first stars and galaxies. As a result, astronomers expect to see a second dip, or trough, in the CMB spectrum at a different, shorter wavelength; this is the signature that EDGES seems to have detected [1].
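A quick conversion shows where this redshifted signature lands today (a sketch: 78 MHz is the dip frequency reported for the EDGES measurement, and the rest-frame line values are standard):

```python
REST_FREQ_MHZ = 1420.4      # rest frequency of the hydrogen spin-flip line
REST_WAVELENGTH_CM = 21.1   # corresponding rest wavelength

def redshift_of(observed_mhz):
    """Redshift at which the 21-cm line would be observed at the given frequency."""
    return REST_FREQ_MHZ / observed_mhz - 1.0

z = redshift_of(78.0)                 # EDGES-like absorption dip
print(round(z, 1))                    # 17.2
print(REST_WAVELENGTH_CM * (1 + z))   # ~384 cm: the line is stretched to nearly four metres
```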
But the hydrogen closest to those early galaxies absorbed so much energy that it lost its electrons and went dark. Those dark, ionized bubbles grew bigger over roughly half a billion years, as galaxies grew and merged, leaving less and less luminous hydrogen between them. Cosmologists call this transition the epoch of reionization, or EOR. The EOR is the period that many 21-cm radioastronomy experiments, either ongoing or in preparation, are aiming to detect.
The hope is to map it in 3D as it evolved over time, by taking snapshots of the sky at different wavelengths, or redshifts. Details of when the bubbles formed, their shapes and how fast they grew will reveal how galaxies formed and what kind of light they produced. If stars did most of the reionization, the bubbles will have neat, regular shapes, Chapman says. The EOR will also provide an unprecedented test for the current best model of cosmic evolution.
Although there is plenty of evidence for dark matter, nobody has identified exactly what it is. Although astronomers are desperate to learn more about the EOR, they are only now starting to close in on the ability to detect it. Leading the way are radio telescope arrays, which compare signals from multiple antennas to detect variations in the intensity of waves arriving from different directions in the sky.
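Baselines matter because an interferometer's angular resolution is roughly the observing wavelength divided by its longest baseline. A hedged sketch (142 MHz corresponds to 21-cm emission from z ≈ 9; the 2 km baseline is an assumed, illustrative value):

```python
import math

C_M_PER_S = 3.0e8   # speed of light

def resolution_deg(freq_hz, baseline_m):
    """Approximate angular resolution theta ~ lambda / baseline, in degrees."""
    wavelength_m = C_M_PER_S / freq_hz
    return math.degrees(wavelength_m / baseline_m)

# Redshifted 21-cm signal from z ~ 9 arrives near 1420/(1+9) = 142 MHz.
print(resolution_deg(142e6, 2000.0))  # ~0.06 degrees for an assumed 2 km baseline
```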
Currently the largest low-frequency radio observatory in the world, LOFAR has so far only been able to put limits on the size distribution of the bubbles, thereby excluding some extreme scenarios, such as those in which the intergalactic medium was particularly cold, says Leon Koopmans, an astronomer at the University of Groningen in the Netherlands who leads the EOR studies for LOFAR.