Scientists at TU Wien (Vienna) have calculated that the meson f0(1710) could be a very special particle – the long-sought-after glueball, a particle composed of pure force.
For decades, scientists have been looking for so-called “glueballs”. Now it seems they have been found at last. A glueball is an exotic particle, made up entirely of gluons – the “sticky” particles that keep nuclear particles together. Glueballs are unstable and can only be detected indirectly, by analysing their decay. This decay process, however, is not yet fully understood.
Professor Anton Rebhan and Frederic Brünner from TU Wien (Vienna) have now employed a new theoretical approach to calculate glueball decay. Their results agree extremely well with data from particle accelerator experiments. This is strong evidence that a resonance called “f0(1710)”, which has been found in various experiments, is in fact the long-sought glueball. Further experimental results are to be expected in the next few months.
Forces are Particles too
Protons and neutrons consist of even smaller elementary particles called quarks. These quarks are bound together by strong nuclear force. “In particle physics, every force is mediated by a special kind of force particle, and the force particle of the strong nuclear force is the gluon”, says Anton Rebhan (TU Wien).
Gluons can be seen as more complicated versions of the photon. The massless photons are responsible for the forces of electromagnetism, while eight different kinds of gluons play a similar role for the strong nuclear force. However, there is one important difference: gluons themselves are subject to their own force, photons are not. This is why there are no bound states of photons, but a particle that consists only of bound gluons, of pure nuclear force, is in fact possible.
In 1972, shortly after the theory of quarks and gluons was formulated, the physicists Murray Gell-Mann and Harald Fritzsch speculated about possible bound states of pure gluons (originally called “gluonium”; today the term “glueball” is used). Several particles have been found in particle accelerator experiments which are considered to be viable candidates for glueballs, but there has never been a scientific consensus on whether or not one of these signals could in fact be the mysterious particle made of pure force. Instead of a glueball, the signals found in the experiments could also be a combination of quarks and antiquarks. Glueballs are too short-lived to be detected directly. If they exist, they have to be identified by studying their decay.
Candidate f0(1710) decays strangely
“Unfortunately, the decay pattern of glueballs cannot be calculated rigorously”, says Anton Rebhan. Simplified model calculations have shown that there are two realistic candidates for glueballs: the mesons called f0(1500) and f0(1710). For a long time, the former was considered to be the most promising candidate. The latter has a higher mass, which agrees better with computer simulations, but when it decays, it produces many heavy quarks (the so-called “strange quarks”). To many particle scientists, this seemed implausible, because gluon interactions do not usually differentiate between heavier and lighter quarks.
Anton Rebhan and his PhD student Frederic Brünner have now made a major step forward in solving this puzzle by trying a different approach. There are fundamental connections between quantum theories describing the behaviour of particles in our three-dimensional world and certain kinds of gravitation theories in higher-dimensional spaces. This means that certain quantum physical questions can be answered using tools from gravitational physics.
“Our calculations show that it is indeed possible for glueballs to decay predominantly into strange quarks”, says Anton Rebhan. Surprisingly, the calculated decay pattern into two lighter particles agrees extremely well with the decay pattern measured for f0(1710). In addition to that, other decays into more than two particles are possible. Their decay rates have been calculated too.
Further Data is Expected Soon
Up until now, these alternative glueball decays have not been measured, but within the next few months, two experiments at the Large Hadron Collider at CERN (TOTEM and LHCb) and one accelerator experiment in Beijing (BESIII) are expected to yield new data. “These results will be crucial for our theory”, says Anton Rebhan. “For these multi-particle processes, our theory predicts decay rates which are quite different from the predictions of other, simpler models. If the measurements agree with our calculations, this will be a remarkable success for our approach.” It would be overwhelming evidence for f0(1710) being a glueball. And in addition to that, it would once again show that higher-dimensional gravity can be used to answer questions from particle physics – in a way it would be one more big success of Einstein’s theory of general relativity, which turns 100 years old next month.
Our unfolding understanding of the universe is marked by epic searches and we are now on the brink of discovering something that has escaped detection for many years.
The search for gravity waves has been a century-long epic. They are a prediction of Einstein’s General Theory of Relativity, but for years physicists argued about their theoretical existence.
By 1957 physicists had proved that they must carry energy and cause vibrations. But it was also apparent that waves carrying a million times more energy than sunlight would make vibrations smaller than an atomic nucleus.
Building detectors seemed a daunting task, but in the 1960s a maverick physicist, Joseph Weber at the University of Maryland, began to design the first detectors. By 1969 he claimed success!
There was excitement and consternation. How could such vast amounts of energy be reconciled with our understanding of stars and galaxies? A scientific gold rush began.
Within two years, ten new detectors had been built in major labs across the planet. But nothing was detected.
Going to need a better detector
Some physicists gave up on the field but for the next 40 years a growing group of physicists set about trying to build vastly better detectors.
By the 1980s a worldwide collaboration to build five detectors, called cryogenic resonant bars, was underway, with one detector called NIOBE located at the University of Western Australia.
These were huge metal bars cooled to near absolute zero. They used superconducting sensors that could detect vibration energies a million times smaller than Weber’s detectors could.
Gravity waves caused by two rotating black holes.
They operated throughout much of the 1990s. If a pair of black holes had collided in our galaxy, or a new black hole had formed, it would have been heard as a gentle ping in the cold bars… but all remained quiet.
What the cryogenic detectors did achieve was an understanding of how quantum physics affects measurement, even of tonne-scale objects. The detectors forced us to come to grips with a new approach to measurement. Today this has grown into a major research field called macroscopic quantum mechanics.
But the null results did not mean the end. It meant that we had to look further into the universe. A black hole collision may be rare in one galaxy but it could be a frequent occurrence if you could listen in to a million galaxies.
Laser beams will help
A new technology was needed to stretch the sensitivity enormously, and by the year 2000 this was available: a method called laser interferometry.
The idea was to use laser beams to measure tiny vibrations in the distance between widely spaced mirrors. The bigger the distance the bigger the vibration! And an L-shape could double the signal and cancel out the noise from the laser.
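To put rough numbers on this (an illustrative estimate using typical textbook figures, not values quoted in this article): a gravitational wave of strain h changes the spacing L between the mirrors by

\[
\Delta L = h\,L \approx 10^{-21} \times 4\,\mathrm{km} \approx 4 \times 10^{-18}\,\mathrm{m},
\]

roughly a thousandth of the diameter of a proton, which is why the arms need to be kilometres long and why only laser interferometry can read the difference out.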
Several teams of physicists including a team at the Australian National University had spent many years researching the technology. Laser beam measurements allowed very large spacing and so new detectors up to 4km in size were designed and constructed in the US, Europe and Japan.
The Australian Consortium for Gravitational Astronomy built a research centre on a huge site at Gingin, just north of Perth, in Western Australia, that was reserved for the future southern hemisphere gravitational wave detector.
The world would need this so that triangulation could be used to locate signals.
Latest detectors
The new detectors were proposed in two stages. Because they involved formidable technological challenges, the first detectors would have the modest aim of proving that the laser technology could be implemented on a 4km scale, using relatively low-intensity laser light that gave only a few per cent chance of detecting any signals.
The detectors were housed inside the world’s largest vacuum system, the mirrors had to be 100 times more perfect than a telescope mirror, seismic vibrations had to be largely eliminated, and the laser light had to be the purest light ever created.
A second stage would be a complete rebuild with bigger mirrors, much more laser power and even better vibration control. It would have a sensitivity at which coalescing pairs of neutron stars, merging to form black holes, would be detectable about 20 to 40 times per year.
Australia has been closely involved with both stages of the US project. CSIRO was commissioned to polish the enormously precise mirrors that were the heart of the first stage detectors.
A gathering of minds
The Australian Consortium gathered at Gingin earlier this year to plan a new national project.
Students at work in the labs at Gingin. University of WA
Part of that project focuses on an 80-meter-scale laser research facility – a sort of mini gravity wave detector – that the consortium has developed at the site. Experiments are looking at the physics of the new detectors, especially the forces exerted by laser light.
The team has discovered several new phenomena including one that involves laser photons bouncing off particles of sound called phonons. This phenomenon turns out to be very useful as it allows new diagnostic tools to prevent instabilities in the new detectors.
The light forces can also be used to make “optical rods” – think of a Star Wars light sabre! These devices can capture more gravitational wave energy – opening up a whole range of future possibilities from useful gadgets to new gravitational wave detectors.
Final stages of discovery
The first stage detectors achieved their target sensitivity in 2006 and, as expected, they detected no signals. You would know if they had!
The second stage detectors are expected to begin operating next year. The Australian team is readying itself because the new detectors change the whole game.
For the first time we have firm predictions: both the strength and the number of signals. No longer are we hoping for rare and unknown events.
We will be monitoring a significant volume of the universe and for the first time we can be confident that we will “listen” to the coalescence of binary neutron star systems and the formation of black holes.
Once these detectors reach full sensitivity we should hear signals almost once a week. Exactly when we will reach this point, no one knows. We have to learn how to operate the vast and complex machines.
If you want to place bets on the date of the first gravity wave detection, some physicists would bet on 2016, and probably the majority would bet on 2017. A few pessimists would say that we will discover unexpected problems that might take a few years to solve.
Researchers have made the first experimental determination of the weak charge of the proton in research carried out at the Department of Energy's Thomas Jefferson National Accelerator Facility (Jefferson Lab).
The results, accepted for publication in Physical Review Letters, also include the determinations of the weak charge of the neutron, and of the up quark and down quark. These determinations were made by combining the new data with published data from other experiments. Although these preliminary figures are the most precise determinations to date, they were obtained from an analysis of just 4 percent of the total data taken by the experiment, with the full data analysis expected to take another year to complete.
The weak force is one of the four fundamental forces in our universe, along with gravity, electromagnetism and the strong force. Although the weak force acts only on the sub-atomic level, we can see its effects in our everyday world. The weak force plays a key role in the nuclear reaction processes that take place in stars and is responsible for much of the natural radiation present in our universe.
The Q-weak experiment was designed by an international group of nuclear physicists who came together more than a decade ago to propose a new measurement at Jefferson Lab. They proposed the world’s first direct measurement of the proton's weak charge, denoted Q_W^p, which represents the strength of the weak force's pull on the proton, a measure of how strongly a proton interacts via the weak force. Since the weak charge of the proton is precisely predicted by the Standard Model, a well-tested theoretical framework that describes the elementary particles and the details of how they interact, it is an ideal parameter to measure experimentally as a test of the Standard Model.
To perform the experiment, the scientists directed a very intense beam of electrons into a container of liquid hydrogen. The electrons were longitudinally polarized (spinning along or opposite their direction of motion). Electrons that made only glancing collisions with the protons (elastic scattering, where the proton remained intact) emerged at small angles and were deflected by powerful electromagnets onto eight symmetrically placed detectors.
The weak force is far weaker than the electromagnetic force. In rough terms, for every one million electrons that interact with the protons via the electromagnetic force, only one interacts via the weak force. Physicists measured those few weak interactions by exploiting an important difference between the two forces - the weak force violates a symmetry known as parity, which reverses all spatial directions and turns our right-handed world into a left-handed one. Electrons spinning along or opposite their direction of motion interact with protons via the electromagnetic force with the same strength, but where the weak force is concerned, electrons with right-handed spin interact differently than left-handed ones. By keeping all other parameters of the experiment the same, and only reversing the polarization direction of the electron beam, scientists can use the difference or “asymmetry” between the measurements made with the two polarization directions to isolate the effect of the weak interaction. The goal is to measure this difference, only ~200 parts per billion, as precisely as possible. This precision is equivalent to measuring the thickness of a sheet of paper laid atop the Eiffel Tower.
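To make the scale of the measurement concrete, here is a minimal sketch (not the collaboration's analysis code; the yields are invented for illustration) of how such an asymmetry is formed from the normalized yields recorded in the two beam-polarization states:

```python
# Schematic sketch of forming a parity-violating asymmetry from detector
# yields taken with the beam polarization flipped between + and - helicity.
# Not the Q-weak collaboration's analysis code; the yields are invented.

def asymmetry(yield_plus, yield_minus):
    """Raw asymmetry A = (Y+ - Y-) / (Y+ + Y-)."""
    return (yield_plus - yield_minus) / (yield_plus + yield_minus)

# An asymmetry of ~200 parts per billion means the two normalized yields
# differ by only about 4 parts in 10 million.
y_plus = 1.0000004   # hypothetical normalized yield, + helicity
y_minus = 1.0000000  # hypothetical normalized yield, - helicity

print(f"A = {asymmetry(y_plus, y_minus):.1e}")  # about 2e-07, i.e. ~200 ppb
```

In the actual experiment the beam polarization is reversed many times per second and the asymmetry is averaged over enormous numbers of such measurements, which is what pushes the uncertainty down to the parts-per-billion level.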
The initial analysis of the Q-weak experimental data yielded a value for Q_W^p that is in good agreement with the Standard Model prediction. However, the collaboration has 25 times more data than was used in this initial determination. The final result should provide a rigorous experimental test of the Standard Model, providing constraints on new physics at the scale of energies being explored at the Large Hadron Collider at CERN in Europe.
"Readers should view this result primarily as a first determination of the weak charge of the proton. Our final publication will be focused on implications with respect to potential new physics," says Roger Carlini, a Jefferson Lab staff scientist and spokesperson for the Q-weak Collaboration.
The Q-weak experiment was originally approved in January 2002. A nearly year-long installation period for experimental equipment began in 2009, followed by a two-year period of data collection during 2010-2012.
Numerous technical achievements in the last decade made this experiment possible. These include the high-current, high-polarization, extremely stable electron beam provided by Jefferson Lab's Continuous Electron Beam Accelerator Facility; the world's highest-power cryogenic hydrogen target; extremely radiation-hard Cerenkov detectors; ultra-low noise electronics to read out the signals and precisely measure the beam current; and a system which measures the beam polarization to better than 1 percent using a back-scattered laser. These technical achievements have yielded an astonishingly small total uncertainty of 47 parts per billion for the data published so far.
The Q-weak collaboration consists of 97 researchers from 23 institutions in the U.S., Canada, and Europe. The experiment was funded by the U.S. Department of Energy Office of Science, the U.S. National Science Foundation and the Natural Sciences and Engineering Research Council of Canada. Matching university contributions were also provided by The College of William and Mary, Virginia Tech, George Washington University and Louisiana Tech University. Technical support was provided by TRIUMF, MIT/Bates and Jefferson Lab.
Jefferson Science Associates manages Jefferson Lab for the DOE Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov/.
Many researchers believe that physics will not be complete until it can explain not just the behaviour of space and time, but where these entities come from.
“Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.
This 'holographic principle' is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account of the existence of space and time. Neither does string theory, which describes elementary threads of energy.
Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense 'singularity' at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists' efforts for generations.
“All our experiences tell us we shouldn't have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.
Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts (see 'The fabric of reality').
Gravity as thermodynamics
One of the most obvious questions to ask is whether this endeavour is a fool's errand. Where is the evidence that there actually is anything more fundamental than space and time?
A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.
In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it were hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.
A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking's 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy. But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole's entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
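The entropy–area relationship referred to here is the Bekenstein–Hawking formula, quoted below for reference (it is not written out in the article itself):

\[
S_{\mathrm{BH}} = \frac{k_B c^{3} A}{4 G \hbar} = \frac{k_B A}{4\,\ell_P^{2}},
\]

where A is the area of the event horizon and \(\ell_P\) is the Planck length: the entropy grows with the horizon's surface area rather than with the volume it encloses.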
In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny 'black-hole horizon' that also obeys the entropy–area relationship. From that, he found, the mathematics yielded Einstein's equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time [1].
“This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.
In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed [2] that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton's law of gravitational attraction.
And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed [3] that Einstein's equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe's expansion.
Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10⁻³⁵ metres, or some 20 orders of magnitude smaller than a proton.
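For reference, the Planck scale quoted here comes from combining the fundamental constants of quantum mechanics, gravity and relativity:

\[
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\,\mathrm{m},
\]

about 20 orders of magnitude smaller than a proton, whose radius is of order 10⁻¹⁵ metres.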
But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly. Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found [4] hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”
Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed [5] a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature, http://doi.org/njf; 2012).
Loop quantum gravity
Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?
One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider's web of strands that carry information about the quantized areas and volumes of the regions they pass through [6]. The individual strands of the web must eventually join their ends to form loops — hence the theory's name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.
Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.
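In the standard presentation of loop quantum gravity, the quantum of area carried by a strand with spin label j is usually written as

\[
A_j = 8\pi\gamma\,\ell_P^{2}\,\sqrt{j(j+1)}, \qquad j = \tfrac{1}{2}, 1, \tfrac{3}{2}, \dots,
\]

where \(\gamma\) is the Immirzi parameter, a free constant of the theory; the smallest non-zero value, at j = 1/2, is the Planck-scale patch described above. (This formula is standard in the loop-quantum-gravity literature but is not spelled out in the article.)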
One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein's equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.
In 2006, Ashtekar and his colleagues reported [7] a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein's equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.
This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported [8] a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe's massive expansion in its infant moments.
Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.
Causal sets
Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn't make sense to ask, 'What's the temperature of a single atom?' You need a collection for the concept to have meaning.”
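As a toy illustration of the idea (a minimal sketch, not Sorkin's formalism; the points and numbers are invented), a causal set can be built by sprinkling points into a two-dimensional space-time diagram and recording a relation from each point to every point inside its future light cone:

```python
# Toy causal set: points sprinkled into a 1+1-dimensional spacetime,
# with a causal relation x -> y whenever y lies in x's future light cone.
# Purely illustrative; not taken from the causal-set literature.
import random

random.seed(0)
points = [(random.random(), random.random()) for _ in range(8)]  # (t, x) pairs

def precedes(p, q):
    """p causally precedes q if q is inside p's future light cone (c = 1)."""
    dt, dx = q[0] - p[0], q[1] - p[1]
    return dt > 0 and dt >= abs(dx)

# The causal set is just the points plus this partial order.
relations = [(i, j) for i, p in enumerate(points)
                    for j, q in enumerate(points) if precedes(p, q)]

print(f"{len(points)} points, {len(relations)} causal relations")
```

In causal set theory this partial order, together with the counting of points, is all there is; smooth space-time, like temperature, is meant to emerge only when enormous numbers of such elements are considered together.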
In the late 1980s, Sorkin used this framework to estimate [9] the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here's a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”
Causal dynamical triangulations
That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.
The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation's gluing rules allowed them to combine freely. The result was a series of bizarre 'universes' that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.
But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own [10].
Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.
Holography
Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published [11] his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.
Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.
In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are 'entangled' — meaning that measurements made on one inevitably affect the other [12]. He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.
Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”