Tuesday, July 19, 2016

MIT scientists find weird quantum effects, even over hundreds of miles

Neutrinos traveling more than 450 miles have no individual identities, according to an MIT analysis.

In the world of quantum, infinitesimally small particles, weird and often logic-defying behaviors abound. Perhaps the strangest of these is the idea of superposition, in which objects can exist simultaneously in two or more seemingly counterintuitive states. For example, according to the laws of quantum mechanics, electrons may spin both clockwise and counter-clockwise, or be both at rest and excited, at the same time.

The physicist Erwin Schrödinger highlighted some strange consequences of the idea of superposition more than 80 years ago, with a thought experiment that posed that a cat trapped in a box with a radioactive source could be in a superposition state, considered both alive and dead, according to the laws of quantum mechanics. Since then, scientists have proven that particles can indeed be in superposition, at quantum, subatomic scales. But whether such weird phenomena can be observed in our larger, everyday world is an open, actively pursued question.

Now, MIT physicists have found that subatomic particles called neutrinos can be in superposition, without individual identities, when traveling hundreds of miles. Their results, to be published later this month in Physical Review Letters, represent the longest distance over which quantum mechanics has been tested to date. 

A subatomic journey across state lines

The team analyzed data on the oscillations of neutrinos — subatomic particles that interact extremely weakly with matter, passing through our bodies by the billions per second without any effect. Neutrinos can oscillate, or change between several distinct “flavors,” as they travel through the universe at close to the speed of light.

The researchers obtained data from Fermilab’s Main Injector Neutrino Oscillation Search, or MINOS, an experiment in which neutrinos are produced from the scattering of other accelerated, high-energy particles in a facility near Chicago and beamed to a detector in Soudan, Minnesota, 735 kilometers (456 miles) away. Although the neutrinos leave Illinois as one flavor, they may oscillate along their journey, arriving in Minnesota as a completely different flavor.

The MIT team studied the distribution of neutrino flavors generated in Illinois versus those detected in Minnesota, and found that these distributions can be explained most readily by quantum phenomena: As neutrinos sped between the source and the detector, they were statistically most likely to be in a state of superposition, with no definite flavor or identity.

What’s more, the researchers found that the data were “in high tension” with more classical descriptions of how matter should behave. In particular, it was statistically unlikely that the data could be explained by any model of the sort that Einstein sought, in which objects would always embody definite properties rather than exist in superpositions.

“What’s fascinating is, many of us tend to think of quantum mechanics applying on small scales,” says David Kaiser, the Germeshausen Professor of the History of Science and professor of physics at MIT. “But it turns out that we can’t escape quantum mechanics, even when we describe processes that happen over large distances. We can’t stop our quantum mechanical description even when these things leave one state and enter another, traveling hundreds of miles. I think that’s breathtaking.”

Kaiser is a co-author on the paper, which includes MIT physics professor Joseph Formaggio, junior Talia Weiss, and former graduate student Mykola Murskyj.

A flipped inequality

The team analyzed the MINOS data by applying a slightly altered version of the Leggett-Garg inequality, a mathematical expression named after physicists Anthony Leggett and Anupam Garg, who derived the expression to test whether a system with two or more distinct states acts in a quantum or classical fashion.

Leggett and Garg realized that the measurements of such a system, and the statistical correlations between those measurements, should be different if the system behaves according to classical versus quantum mechanical laws.

“They realized you get different predictions for correlations of measurements of a single system over time, if you assume superposition versus realism,” Kaiser explains, where “realism” refers to models of the Einstein type, in which particles should always exist in some definite state.
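To see the logic of the original, time-based test, consider a toy calculation (an illustration only, not the MIT team’s analysis): for a two-level system whose +1/-1 “flavor” observable oscillates coherently, the quantum correlation between measurements separated by a phase is the cosine of that phase, and the three-measurement Leggett-Garg combination K3 = C12 + C23 - C13 is bounded by 1 for any realist model but can reach 1.5 quantum mechanically.

    import numpy as np

    # Toy illustration of the standard (time-based) Leggett-Garg inequality for a
    # two-level system; this is not the modified, energy-based test used on MINOS data.
    omega_tau = np.linspace(0.0, np.pi, 1000)   # phase accumulated between measurements

    # Quantum prediction for the correlation of a +1/-1 flavor observable measured
    # at two times separated by phase omega*tau: C = cos(omega*tau).
    C12 = np.cos(omega_tau)          # equals C23 for equally spaced measurements
    C13 = np.cos(2.0 * omega_tau)    # correlation across the full interval

    K3 = 2.0 * C12 - C13             # Leggett-Garg combination C12 + C23 - C13

    print(f"maximum K3 predicted by quantum mechanics: {K3.max():.2f}")  # about 1.50
    print("bound for any realist (classical) description: K3 <= 1")

The quantum value exceeds the realist bound over a wide range of phases, and it is this kind of statistical discrepancy that the inequality is built to expose.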

Formaggio had the idea to flip the expression slightly, to apply not to repeated measurements over time but to measurements at a range of neutrino energies. In the MINOS experiment, huge numbers of neutrinos are created at various energies, where Kaiser says they then “careen through the Earth, through solid rock, and a tiny drizzle of them will be detected” 735 kilometers away.

According to Formaggio’s reworking of the Leggett-Garg inequality, the distribution of neutrino flavors — the type of neutrino that finally arrives at the detector — should depend on the energies at which the neutrinos were created. Furthermore, those flavor distributions should look very different if the neutrinos assumed a definite identity throughout their journey, versus if they were in superposition, with no distinct flavor.
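As a rough numerical illustration of that energy dependence, the sketch below evaluates the textbook two-flavor muon-neutrino survival probability at the 735-kilometer MINOS baseline for a range of creation energies; the oscillation parameters are approximate, commonly quoted values, not the ones fitted in the paper.

    import numpy as np

    # Two-flavor approximation of muon-neutrino survival at the MINOS baseline.
    # Parameter values are illustrative assumptions, not the paper's fit.
    L_km = 735.0          # Fermilab-to-Soudan baseline in kilometers
    dm2_eV2 = 2.4e-3      # approximate atmospheric mass-squared splitting (eV^2)
    sin2_2theta = 0.95    # approximate mixing amplitude

    E_GeV = np.linspace(1.0, 10.0, 10)   # neutrino creation energies in GeV

    # Standard formula: P(nu_mu -> nu_mu) = 1 - sin^2(2 theta) * sin^2(1.27 dm^2 L / E)
    P_survive = 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    for E, P in zip(E_GeV, P_survive):
        print(f"E = {E:4.1f} GeV  ->  P(nu_mu arrives as nu_mu) = {P:.2f}")

Because the oscillation phase depends on the inverse of the energy, neutrinos created at different energies arrive with different flavor mixtures, which is the handle the reworked inequality exploits.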

“The big world we live in”

Applying their modified version of the Leggett-Garg expression to neutrino oscillations, the group predicted the distribution of neutrino flavors arriving at the detector, both if the neutrinos were behaving classically, according to an Einstein-like theory, and if they were acting in a quantum state, in superposition. When they compared both predicted distributions, they found there was virtually no overlap.

More importantly, when they compared these predictions with the actual distribution of neutrino flavors observed from the MINOS experiment, they found that the data fit squarely within the predicted distribution for a quantum system, meaning that the neutrinos very likely did not have individual identities while traveling over hundreds of miles between detectors.

But what if these particles truly embodied distinct flavors at each moment in time, rather than being some ghostly, neither-here-nor-there phantoms of quantum physics? What if these neutrinos behaved according to Einstein’s realism-based view of the world? After all, statistical flukes or defects in the instrumentation might still have produced the distribution of neutrinos that the researchers observed. Kaiser says if that were the case and “the world truly obeyed Einstein’s intuitions,” the chances of such a model accounting for the observed data would be “something like one in a billion.”

So how do neutrinos do it? How do they maintain a quantum, identityless state over such long distances? André de Gouvêa, professor of physics and astronomy at Northwestern University, says that because neutrinos move so fast and interact so weakly with matter, “relativistic effects — as in Einstein’s special theory of relativity — are huge, and conspire to make the very long distances appear [to the neutrinos] short.”

“The final result is that, like all other tests performed to date under very different circumstances, quantum mechanics appears to be the correct description of the world at all distance scales, weirdness notwithstanding,” says de Gouvêa, who was not involved in the research.

“What gives people pause is, quantum mechanics is quantitatively precise and yet it comes with all this conceptual baggage,” Kaiser says. “That’s why I like tests like this: Let’s let these things travel further than most people will drive on a family road trip, and watch them zoom through the big world we live in, not just the strange world of quantum mechanics, for hundreds of miles. And even then, we can’t stop using quantum mechanics. We really see quantum effects persist across macroscopic distances.”

Monday, July 11, 2016

Physicists discover family of tetraquarks

Syracuse University Professor Tomasz Skwarnicki and Ph.D. student Thomas Britton confirm existence of rare 'exotic' particle, find evidence of 3 others
Physicists in the Syracuse University College of Arts and Sciences have made science history by confirming the existence of a rare four-quark particle and discovering evidence of three other "exotic" siblings.
Their findings are based on data from the Large Hadron Collider (LHC), the world's biggest, most powerful particle accelerator, located at the CERN science laboratory in Geneva, Switzerland.
Professor Tomasz Skwarnicki and Ph.D. student Thomas Britton G'16, both members of the Experimental High-Energy Physics Group at Syracuse and the Large Hadron Collider beauty (LHCb) collaboration at CERN, have confirmed the existence of a tetraquark candidate known as X(4140). They also have detected three other exotic particles with higher masses, called X(4274), X(4500) and X(4700).
All four particles were the subject of Britton's Ph.D. dissertation, which he defended in May and then submitted, on behalf of the LHCb collaboration, as a journal article to Physical Review Letters (American Physical Society, 2016).
A tetraquark is a particle made of four quarks: two quarks and two antiquarks.
Tetraquarks--and, by extension, pentaquarks, containing five quarks--are considered exotic because they have more than the usual allotment of two or three quarks.
"Even though all four particles contain the same quark composition, each of them has a unique internal structure, mass and set of quantum numbers," says Skwarnicki, who, in April 2014, confirmed the existence of the world's first charged tetraquark candidate, called Z(4430)+. A year earlier, he and Ph.D. student Bin Gui G'14 determined the quantum numbers of the first neutral, heavy tetraquark candidate, X(3872).
Quantum numbers describe each particle's subatomic properties.
Skwarnicki says the measurement of all four particles is the largest single one of its kind to date. Unlike other exotic particle candidates, his and Britton's do not contain ordinary nuclear matter (i.e., quarks found in protons and neutrons).
"We've never seen this kind of thing before. It's helping us distinguish among various theoretical models of particles," Skwarnicki says.
A rendering of the enormous LHCb detector, which registers approximately 10 million proton collisions per second. Scientists study the debris from these collisions to better understand the building blocks of matter and the forces controlling them.

A fellow of the American Physical Society, Skwarnicki is a longtime member of the LHCb collaboration, which involves approximately 800 other scientists from 16 countries. Their goal is to discover new forms of matter, in hopes of explaining why the universe is made of matter rather than antimatter.
Skwarnicki's work focuses on quarks--fundamental constituents of matter that serve as a kind of scaffolding for protons and neutrons. While most particles have two or three quarks, Skwarnicki and others, in the past decade, have observed ones with four or five.
Last summer, he and doctoral student Nathan Jurik G'16 teamed up with Distinguished Professor Sheldon Stone and Liming Zhang, a professor at Tsinghua University in Beijing, to announce their discovery of two rare pentaquark states. The news made headlines, thrusting Syracuse and CERN into the international spotlight.
According to the Standard Model of particle physics, there are six kinds of quarks, whose intrinsic properties cause them to be grouped into pairs with unusual names: up/down, charm/strange and top/bottom.
The particles that Skwarnicki and Britton study have two charm quarks and two strange quarks. Charm and strange quarks are the third- and fourth-most massive of all quarks.
That all four quarks in the new family are "heavy" is noteworthy.
"The heavier the quark, the smaller the corresponding particle it creates," says Skwarnicki, adding that the names of the particles reflect their masses. "The names are denoted by mega-electron volts [MeV], referring to the amount of energy an electron gains after being accelerated by a volt of electricity. ... This information, along with each particle's quantum numbers, enhances our understanding of the formation of particles and the fundamental structures of matter."
Evidence of X(4140) first appeared in 2009 at the Fermi National Accelerator Laboratory, outside of Chicago, but the observation was not confirmed until three years later at CERN.
Extremely rare and four times heavier than a proton, X(4140) was initially detected only 20 times out of billions of collisions. LHCb is uniquely suited to studying such particles and has since detected X(4140) nearly 560 times.
Skwarnicki attributes the discovery of X(4140)'s three siblings, culled from LHCb data from 2011 to 2012, to increased instrumental sensitivity. It is the energy configuration of the quarks, he explains, that gives each particle its unique mass and identity.
"Quarks may be tightly bound, like three quarks packed inside a single proton, or loosely bound, like two atoms forming a molecule," Skwarnicki says. "By examining the particles' quantum numbers, we were able to narrow down the possibilities and rule out the molecular hypothesis."
A snapshot of LHCb detector data, singling out the collisions that have resulted in the four tetraquarks.

Not that the process has been easy. An "aporetic saga" is how Britton describes studying structures that seem to "jump out of the data."
"We looked at every known particle and process to make sure that these four structures couldn't be explained by any pre-existing physics," he says. "It was like baking a six-dimensional cake with 98 ingredients and no recipe--just a picture of a cake."
Meanwhile, Skwarnicki, Britton and others face the onerous task of combing through data and developing theoretical models, in an attempt to confirm what they have seen.
"It may be a quartet of entirely new particles or the complex interplay of known particles, simply flipping their identities," Skwarnicki concludes. "Either way, the outcome will shape our understanding of the subatomic universe."

Thursday, June 23, 2016

Making error-free DNA from RNA

For 3 billion years, one of the major carriers of information needed for life, RNA, has had a glitch that creates errors when making copies of genetic information. Researchers at The University of Texas at Austin have developed a fix that allows RNA to accurately proofread for the first time. The new discovery, published June 23 in the journal Science, will increase precision in genetic research and could dramatically improve medicine based on a person's genetic makeup.

Certain viruses called retroviruses can copy their RNA into DNA, a process known as reverse transcription. This process is notoriously prone to errors because the viral enzymes that carry it out never evolved the ability to proofread their work.

The new innovation engineered at UT Austin is an enzyme that performs reverse transcription but can also "proofread," or check its work while copying genetic code. The enzyme allows, for the first time, for large amounts of RNA information to be copied with near perfect accuracy.

"We created a new group of enzymes that can read the genetic information inside living cells with unprecedented accuracy," says Jared Ellefson, a postdoctoral fellow in UT Austin's Center for Systems and Synthetic Biology. "Overlooked by evolution, our enzyme can correct errors while copying RNA."

Reverse transcription is mainly associated with retroviruses such as HIV. In nature, these viruses' inability to copy DNA accurately may have helped create variety in species over time, contributing to the complexity of life as we know it.

Since discovering reverse transcription, scientists have used it to better understand genetic information related to inheritable diseases and other aspects of human health. Still, the error-prone nature of existing RNA sequencing is a problem for scientists.

"With proofreading, our new enzyme increases precision and fidelity of RNA sequencing," says Ellefson. "Without the ability to faithfully read RNA, we cannot accurately determine the inner workings of cells. These errors can lead to misleading data in the research lab and potential misdiagnosis in the clinical lab."

Ellefson and the team of researchers engineered the new enzyme using directed evolution to train a high-fidelity (proofreading) DNA polymerase to use RNA templates. The new enzyme, called RTX, retains the highly accurate and efficient proofreading function, while copying RNA. Accuracy is improved at least threefold, and it may be up to 10 times as accurate. This new enzyme could enhance the methods used to read RNA from cells.
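As a back-of-the-envelope illustration of what a threefold gain in fidelity means for long reads, the sketch below computes the chance that a transcript is copied without a single error; the baseline per-base error rate is an assumed round number, not a figure from the study.

    # Toy calculation: probability that a transcript of length L is copied with no
    # errors, assuming independent per-base errors. The baseline error rate is an
    # illustrative assumption, not a measured value for any particular enzyme.
    baseline_error_rate = 1e-4                       # assumed errors per base
    improved_error_rate = baseline_error_rate / 3.0  # "at least threefold" improvement

    for length in (1_000, 10_000):
        p_clean_baseline = (1.0 - baseline_error_rate) ** length
        p_clean_improved = (1.0 - improved_error_rate) ** length
        print(f"{length:>6}-base transcript: error-free copies rise from "
              f"{p_clean_baseline:.1%} to {p_clean_improved:.1%}")

Even a modest reduction in the per-base error rate translates into a large gain in the fraction of long transcripts that are read back perfectly.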

"As we move towards an age of personalized medicine where everyone's transcripts will be read out almost as easily as taking a pulse, the accuracy of the sequence information will become increasingly important," said Andy Ellington, a professor of molecular biosciences. "The significance of this is that we can now also copy large amounts of RNA information found in modern genomes, in the form of the RNA transcripts that encode almost every aspect of our physiology. This means that diagnoses made based on genomic information are far more likely to be accurate. "


Synthetic evolutionary origin of a proofreading reverse transcriptase
Science  24 Jun 2016:
Vol. 352, Issue 6293, pp. 1590-1593
DOI: 10.1126/science.aaf5409

Nanotechnology World Association

Tuesday, June 21, 2016

New electron microscope method detects atomic-scale magnetism

Scientists can now detect magnetic behavior at the atomic level with a new electron microscopy technique developed by a team from the Department of Energy’s Oak Ridge National Laboratory and Uppsala University, Sweden. The researchers took a counterintuitive approach by taking advantage of optical distortions that they typically try to eliminate.

“It’s a new approach to measure magnetism at the atomic scale,” ORNL’s Juan Carlos Idrobo said.

“We will be able to study materials in a new way. Hard drives, for instance, are made by magnetic domains, and those magnetic domains are about 10 nanometers apart.” One nanometer is a billionth of a meter, and the researchers plan to refine their technique to collect magnetic signals from individual atoms that are ten times smaller than a nanometer. 

“If we can understand the interaction of those domains with atomic resolution, perhaps in the future we will able to decrease the size of magnetic hard drives,” Idrobo said. “We won’t know without looking at it.”

Researchers have traditionally used scanning transmission electron microscopes to determine where atoms are located within materials. This new technique allows scientists to collect more information about how the atoms behave.

“Magnetism has its origins at the atomic scale, but the techniques that we use to measure it usually have spatial resolutions that are way larger than one atom,” Idrobo said. “With an electron microscope, you can make the electron probe as small as possible and if you know how to control the probe, you can pick up a magnetic signature.”

The ORNL-Uppsala team developed the technique by rethinking a cornerstone of electron microscopy known as aberration correction. Researchers have spent decades working to eliminate different kinds of aberrations, which are distortions that arise in the electron-optical lens and blur the resulting images.

Instead of fully eliminating the aberrations in the electron microscope, the researchers purposely added a type of aberration, called four-fold astigmatism, to collect atomic-level magnetic signals from a lanthanum manganese arsenic oxide material. The experimental study validates the team’s theoretical predictions presented in a 2014 Physical Review Letters study.
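For a sense of what deliberately adding four-fold astigmatism means in practice, the sketch below builds a probe-forming aperture with a standard aberration phase function that includes defocus, spherical aberration and a four-fold astigmatism term, then forms the probe by an inverse Fourier transform. All parameter values are illustrative assumptions, not the settings used in the ORNL experiment.

    import numpy as np

    # Illustrative STEM probe calculation at 200 kV; every coefficient below is an
    # assumed, round-number value chosen only to show the structure of the calculation.
    wavelength = 2.508e-12   # electron wavelength at 200 kV, in metres
    alpha_max  = 30e-3       # probe-forming aperture semi-angle, in radians
    defocus    = 2e-9        # C1 (defocus), metres
    Cs         = 1e-6        # C3 (spherical aberration), metres
    A3         = 5e-6        # four-fold astigmatism coefficient, metres (added on purpose)
    phi_A3     = 0.0         # orientation of the four-fold astigmatism, radians

    # Reciprocal-space grid of scattering angles theta and azimuths phi
    n = 256
    k_max = alpha_max / wavelength
    k = np.fft.fftfreq(n, d=1.0 / (2.0 * k_max))
    kx, ky = np.meshgrid(k, k)
    theta = wavelength * np.sqrt(kx**2 + ky**2)
    phi = np.arctan2(ky, kx)

    # Aberration phase: defocus, spherical aberration, and the deliberately
    # introduced four-fold astigmatism (cos(4*phi) azimuthal dependence).
    chi = (2.0 * np.pi / wavelength) * (
        0.5 * defocus * theta**2
        + 0.25 * Cs * theta**4
        + 0.25 * A3 * theta**4 * np.cos(4.0 * (phi - phi_A3))
    )

    aperture = (theta <= alpha_max).astype(float)
    probe = np.fft.fftshift(np.fft.ifft2(aperture * np.exp(-1j * chi)))
    intensity = np.abs(probe) ** 2
    print("probe grid:", intensity.shape, "  peak intensity (arbitrary units):", intensity.max())

The four-fold term reshapes the probe in a way that, according to the team’s earlier prediction, makes its scattering sensitive to the magnetic ordering of the atoms it passes through.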

“This is the first time someone has used aberrations to detect magnetic order in materials in electron microscopy,” Idrobo said. “Aberration correction allows you to make the electron probe small enough to do the measurement, but at the same time we needed to put in a specific aberration, which is opposite of what people usually do.”

Idrobo adds that new electron microscopy techniques can complement existing methods, such as x-ray spectroscopy and neutron scattering, that are the gold standard in studying magnetism but are limited in their spatial resolution.

The study is published as “Detecting magnetic ordering with atomic size electron probes,” in the journal Advanced Structural and Chemical Imaging. Coauthors are ORNL’s Juan Carlos Idrobo, Michael McGuire, Christopher Symons, Ranga Raju Vatsavai, Claudia Cantoni and Andrew Lupini; and Uppsala University’s Ján Rusz and Jakob Spiegelberg.

The electron microscopy experiments were conducted at the Center for Nanophase Materials Sciences, a DOE Office of Science User Facility at ORNL. The research was supported by DOE’s Office of Science.

ORNL is managed by UT-Battelle for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Nanotechnology World Association

Monday, June 20, 2016

Particle accelerator on a microchip

Three “accelerators on a chip” made of silicon are mounted on a clear base. Credit: SLAC National Accelerator Laboratory

The Gordon and Betty Moore Foundation has awarded 13.5 million US dollars (12.6 million euros) to promote the development of a particle accelerator on a microchip. DESY and the University of Hamburg are among the partners involved in this international project, headed by Robert Byer of Stanford University (USA) and Peter Hommelhoff of the University of Erlangen-Nürnberg. Within five years, they hope to produce a working prototype of an “accelerator-on-a-chip”.

For decades, particle accelerators have been an indispensable tool in countless areas of research – from fundamental research in physics to examining the structure of biomolecules in order to develop new drugs. Accelerator-based research has repeatedly been awarded Nobel prizes. Until now, the necessary facilities have been very large and costly. Scientists and engineers are trying out a range of different approaches to build more compact and less expensive particle accelerators. For the time being, the big facilities will remain indispensable for many purposes, however there are some applications in which efficient miniature electron accelerators can provide completely new insights.

“The impact of shrinking accelerators can be compared to the evolution of computers that once occupied entire rooms and can now be worn around your wrist,” says Hommelhoff. This advance could mean that particle accelerators will become available in areas that have previously had no access to such technologies.

The aim of the project is to develop a new type of small, inexpensive particle accelerator for a wide range of different users. Apart from using the fast electrons themselves, they could also be used to produce high-intensity X-rays. “This prototype could set the stage for a new generation of ‘tabletop’ accelerators, with unanticipated discoveries in biology and materials science and potential applications in security scanning, medical therapy and X-ray imaging,” explains Byer.

Some of the accelerator-on-a-chip designs being explored by the international collaboration. Credit: SLAC National Accelerator Laboratory

The project is based on advances in nano-photonics, the art of creating and using nanostructures to generate and manipulate different kinds of light. Instead of the radio-frequency (RF) waves used in conventional accelerators, a laser emitting visible or infrared light is used to accelerate the electrically charged elementary particles. The wavelength of this light is some ten thousand to one hundred thousand times shorter than that of the radio waves, meaning that steeper acceleration gradients can be achieved than with RF technology. “The advantage is that everything is up to fifty times smaller,” explains Franz Kärtner, who is a Leading Scientist at DESY, a professor at the University of Hamburg and at the Massachusetts Institute of Technology (MIT) in the US, and a member of Hamburg’s Centre for Ultrafast Imaging (CUI), and who heads a similar project in Hamburg funded by the European Research Council.

“The typical transverse dimensions of an accelerator cell shrink from ten centimetres to one micrometre,” adds Ingmar Hartl, head of the laser group in DESY’s Photon Science Division. At the moment, the material of choice for the miniature accelerator modules is silicon. “The advantage is that we can draw on the highly advanced production technologies that are already available for silicon microchips,” explains Hartl.
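A quick back-of-the-envelope comparison shows where those size and wavelength figures come from; the RF frequency and laser wavelength below are typical assumed values rather than project specifications.

    # Rough scaling comparison between a conventional RF-driven cavity and a
    # laser-driven chip structure. The drive values are illustrative assumptions.
    c = 299_792_458.0                 # speed of light, m/s

    rf_frequency = 3e9                # assumed S-band RF drive, about 3 GHz
    rf_wavelength = c / rf_frequency  # roughly 0.10 m

    laser_wavelength = 2e-6           # assumed near-infrared drive laser, 2 micrometres

    ratio = rf_wavelength / laser_wavelength
    print(f"RF wavelength:    {rf_wavelength * 100:.1f} cm")
    print(f"laser wavelength: {laser_wavelength * 1e6:.1f} micrometres")
    print(f"the laser wavelength is about {ratio:,.0f} times shorter")  # ~50,000x

Because the accelerating structure scales with the wavelength of the field that drives it, a micrometre-scale laser wavelength is what allows the accelerator cell itself to shrink from centimetres to micrometres.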

DESY will bring to the project its extensive know-how as an international leader in laser technology, which has already paid off in other collaborations, including with the University of Erlangen-Nürnberg. There, Hommelhoff’s group showed that, for slow electrons, a micro-structured accelerator module can achieve steeper acceleration gradients than RF technology.

Byer’s group had demonstrated independently the same effect for fast, so-called relativistic electrons.

However, it is still a long way from an experimental set-up in a laboratory to a working prototype. Individual components of the system will have to be developed from scratch.

Among other things, DESY is working on a high-precision electron source to feed the elementary particles into the accelerator modules, a powerful laser for accelerating them, and an electron undulator for creating X-rays. In addition, the interaction between the miniature components is not yet a routine matter, especially not when it comes to joining up several accelerator modules.

The SINBAD (“Short Innovative Bunches and Accelerators at DESY”) accelerator lab that is currently being set up at DESY will provide the ideal testing environment for the miniature accelerator modules. “SINBAD will allow us to feed high-quality electron beams into the modules, to test the quality of the radiation and work out an efficient way of coupling the laser. DESY offers unique opportunities in this respect,” explains Ralph Aßmann, Leading Scientist at DESY.

Apart from DESY, the project also involves Stanford University, the Universities of Erlangen-Nürnberg and Hamburg, the SLAC National Accelerator Laboratory in the US, the Paul Scherrer Institute (PSI) in Switzerland, the University of California, Los Angeles (UCLA), Purdue University, the Swiss Federal Institute of Technology in Lausanne (EPFL) and the Technical University of Darmstadt, as well as the US company Tech-X.

The Gordon and Betty Moore Foundation fosters path-breaking scientific discovery, environmental conservation, patient care improvements and the preservation of the special character of the San Francisco Bay Area. Gordon Moore is one of the founders of the chip manufacturer Intel and the author of “Moore's Law”, which predicts that the number of transistors in an integrated circuit doubles approximately every two years.


Read on NTWA