
Monday, July 11, 2016

Physicists discover family of tetraquarks


Syracuse University Professor Tomasz Skwarnicki and Ph.D. student Thomas Britton confirm existence of rare 'exotic' particle, find evidence of 3 others
Physicists in the Syracuse University College of Arts and Sciences have made science history by confirming the existence of a rare four-quark particle and discovering evidence of three other "exotic" siblings.
Their findings are based on data from the Large Hadron Collider (LHC), the world's biggest, most powerful particle accelerator, located at the CERN science laboratory in Geneva, Switzerland.
Professor Tomasz Skwarnicki and Ph.D. student Thomas Britton G'16, both members of the Experimental High-Energy Physics Group at Syracuse and the Large Hadron Collider beauty (LHCb) collaboration at CERN, have confirmed the existence of a tetraquark candidate known as X(4140). They also have detected three other exotic particles with higher masses, called X(4274), X(4500) and X(4700).
All four particles were the subject of Britton's Ph.D. dissertation, which he defended in May and then submitted, on behalf of the LHCb collaboration, as a journal article to Physical Review Letters (American Physical Society, 2016).
A tetraquark is a particle made of four quarks: two quarks and two antiquarks.
Tetraquarks--and, by extension, pentaquarks, containing five quarks--are considered exotic because they have more than the usual allotment of two or three quarks.
"Even though all four particles contain the same quark composition, each of them has a unique internal structure, mass and set of quantum numbers," says Skwarnicki, who, in April 2014, confirmed the existence of the world's first charged tetraquark candidate, called Z(4430)+. A year earlier, he and Ph.D. student Bin Gui G'14 determined the quantum numbers of the first neutral, heavy tetraquark candidate, X(3872).
Quantum numbers describe each particle's subatomic properties.
Skwarnicki says the measurement of all four particles is the largest single one of its kind to date. Unlike other exotic particle candidates, his and Britton's do not contain ordinary nuclear matter (i.e., quarks found in protons and neutrons).
"We've never seen this kind of thing before. It's helping us distinguish among various theoretical models of particles," Skwarnicki says.
A rendering of the enormous LHCb detector, which registers approximately 10 million proton collisions per second. Scientists study the debris from these collisions to better understand the building blocks of matter and the forces controlling them.

A fellow of the American Physical Society, Skwarnicki is a longtime member of the LHCb collaboration, involving approximately 800 other scientists from 16 countries. Their goal is to discover all forms of matter, in hopes of explaining why the universe is made of matter rather than antimatter.
Skwarnicki's work focuses on quarks--fundamental constituents of matter that serve as a kind of scaffolding for protons and neutrons. While most particles have two or three quarks, Skwarnicki and others, in the past decade, have observed ones with four or five.
Last summer, he and doctoral student Nathan Jurik G'16 teamed up with Distinguished Professor Sheldon Stone and Liming Zhang, a professor at Tsinghua University in Beijing, to announce their discovery of two rare pentaquark states. The news made headlines, thrusting Syracuse and CERN into the international spotlight.
According to the Standard Model of particle physics, there are six kinds of quarks, whose intrinsic properties cause them to be grouped into pairs with unusual names: up/down, charm/strange and top/bottom.
Tomasz Skwarnicki
The particles that Skwarnicki and Britton study have two charm quarks and two strange quarks. Charm and strange quarks are the third- and fourth-most massive of all quarks.
That all four quarks in the new family are "heavy" is noteworthy.
"The heavier the quark, the smaller the corresponding particle it creates," says Skwarnicki, adding that the names of the particles reflect their masses. "The names are denoted by mega-electron volts [MeV], referring to the amount of energy an electron gains after being accelerated by a volt of electricity. ... This information, along with each particle's quantum numbers, enhances our understanding of the formation of particles and the fundamental structures of matter."
Evidence of X(4140) first appeared in 2009 at the Fermi National Accelerator Laboratory, outside of Chicago, but the observation was not confirmed until three years later at CERN.
Extremely rare and roughly four times heavier than a proton, X(4140) was initially detected only about 20 times out of billions of man-made particle collisions. LHCb is uniquely suited to study such particles and has since gone on to detect X(4140) nearly 560 times.
Skwarnicki attributes the discovery of X(4140)'s three siblings, culled from LHCb data from 2011 to 2012, to increased instrumental sensitivity. It is the energy configuration of the quarks, he explains, that gives each particle its unique mass and identity.
Thomas Britton G'16
"Quarks may be tightly bound, like three quarks packed inside a single proton, or loosely bound, like two atoms forming a molecule," Skwarnicki says. "By examining the particles' quantum numbers, we were able to narrow down the possibilities and rule out the molecular hypothesis."
A snapshot of LHCb detector data, singling out the collisions that have resulted in the four tetraquarks.

Not that the process has been easy. An "aporetic saga" is how Britton describes studying molecular structures that seem to "jump out of the data."
"We looked at every known particle and process to make sure that these four structures couldn't be explained by any pre-existing physics," he says. "It was like baking a six-dimensional cake with 98 ingredients and no recipe--just a picture of a cake."
Meanwhile, Skwarnicki, Britton and others face the onerous task of combing through data and developing theoretical models, in an attempt to confirm what they have seen.
"It may be a quartet of entirely new particles or the complex interplay of known particles, simply flipping their identities," Skwarnicki concludes. "Either way, the outcome will shape our understanding of the subatomic universe."

Monday, June 20, 2016

Particle accelerator on a microchip

Three “accelerators on a chip” made of silicon are mounted on a clear base. Credit: SLAC National Accelerator Laboratory

The Gordon and Betty Moore Foundation has awarded 13.5 million US dollars (12.6 million euros) to promote the development of a particle accelerator on a microchip. DESY and the University of Hamburg are among the partners involved in this international project, headed by Robert Byer of Stanford University (USA) and Peter Hommelhoff of the University of Erlangen-Nürnberg. Within five years, they hope to produce a working prototype of an “accelerator-on-a-chip”.

For decades, particle accelerators have been an indispensable tool in countless areas of research – from fundamental research in physics to examining the structure of biomolecules in order to develop new drugs. Accelerator-based research has repeatedly been awarded Nobel prizes. Until now, the necessary facilities have been very large and costly. Scientists and engineers are trying out a range of different approaches to build more compact and less expensive particle accelerators. For the time being, the big facilities will remain indispensable for many purposes; however, there are some applications in which efficient miniature electron accelerators can provide completely new insights.

“The impact of shrinking accelerators can be compared to the evolution of computers that once occupied entire rooms and can now be worn around your wrist,” says Hommelhoff. This advance could mean that particle accelerators will become available in areas that have previously had no access to such technologies.

The aim of the project is to develop a new type of small, inexpensive particle accelerator for a wide range of different users. Apart from using the fast electrons themselves, they could also be used to produce high-intensity X-rays. “This prototype could set the stage for a new generation of ‘tabletop’ accelerators, with unanticipated discoveries in biology and materials science and potential applications in security scanning, medical therapy and X-ray imaging,” explains Byer.

Some of the accelerator-on-a-chip designs being explored by the international collaboration. Credit: SLAC National Accelerator Laboratory


The project is based on advances in nanophotonics, the art of creating and using nanostructures to generate and manipulate different kinds of light. A laser producing visible or infrared light is used to accelerate the electrically charged elementary particles, rather than the radio-frequency (RF) waves currently used. The wavelength of this light is some ten thousand to one hundred thousand times shorter than that of the radio waves, meaning that steeper acceleration gradients can be achieved than with RF technology. “The advantage is that everything is up to fifty times smaller,” explains Franz Kärtner, Leading Scientist at DESY, professor at the University of Hamburg and at the Massachusetts Institute of Technology (MIT) in the US, and a member of Hamburg’s Centre for Ultrafast Imaging (CUI), who also heads a similar project in Hamburg funded by the European Research Council.

“The typical transverse dimensions of an accelerator cell shrink from ten centimetres to one micrometre,” adds Ingmar Hartl, head of the laser group in DESY’s Photon Science Division. At the moment, the material of choice for the miniature accelerator modules is silicon. “The advantage is that we can draw on the highly advanced production technologies that are already available for silicon microchips,” explains Hartl.
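
As a rough back-of-the-envelope check on the wavelength argument (the 3 GHz cavity frequency and 2 micrometre laser wavelength below are illustrative assumptions, not figures from the project), the scale difference follows directly from wavelength = speed of light / frequency:

    # Back-of-the-envelope comparison of RF and laser-driven accelerator scales.
    # The specific numbers are illustrative assumptions, not project figures.
    C = 299_792_458.0                      # speed of light in m/s

    def wavelength(frequency_hz):
        """Free-space wavelength in metres for a given frequency."""
        return C / frequency_hz

    rf_wavelength = wavelength(3e9)        # ~3 GHz RF cavity -> roughly 10 cm
    laser_wavelength = 2e-6                # ~2 micrometre infrared laser light

    ratio = rf_wavelength / laser_wavelength
    print(f"RF wavelength:    {rf_wavelength * 100:.1f} cm")
    print(f"Laser wavelength: {laser_wavelength * 1e6:.1f} um")
    print(f"Ratio:            ~{ratio:,.0f} times shorter")   # roughly 50,000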

DESY will bring to the project its vast know-how as an international leader in laser technology, expertise that has already paid off in collaborations with the University of Erlangen-Nürnberg. There, Hommelhoff’s group showed that, for slow electrons, a micro-structured accelerator module can achieve steeper acceleration gradients than RF technology.

Byer’s group independently demonstrated the same effect for fast, so-called relativistic electrons.

However, it is still a long way from an experimental set-up in a laboratory to a working prototype. Individual components of the system will have to be developed from scratch.

Among other things, DESY is working on a high-precision electron source to feed the elementary particles into the accelerator modules, a powerful laser for accelerating them, and an electron undulator for creating X-rays. In addition, the interaction between the miniature components is not yet a routine matter, especially not when it comes to joining up several accelerator modules.

The SINBAD (“Short Innovative Bunches and Accelerators at DESY”) accelerator lab that is currently being set up at DESY will provide the ideal testing environment for the miniature accelerator modules. “SINBAD will allow us to feed high-quality electron beams into the modules, to test the quality of the radiation and work out an efficient way of coupling the laser. DESY offers unique opportunities in this respect,” explains Ralph Aßmann, Leading Scientist at DESY.

Apart from DESY, the project also involves Stanford University, the Universities of Erlangen-Nürnberg and Hamburg, the SLAC National Accelerator Laboratory in the US, the Swiss Paul Scherrer Institute (PSI), the University of California, Los Angeles (UCLA), Purdue University, the Swiss Federal Institute of Technology in Lausanne (EPFL) and the Technical University of Darmstadt, as well as the US company Tech-X.

The Gordon and Betty Moore Foundation fosters path-breaking scientific discovery, environmental conservation, patient care improvements and the preservation of the special character of the San Francisco Bay Area. Gordon Moore is one of the founders of the chip manufacturer Intel and the author of “Moore's Law”, which predicts that the number of transistors in an integrated circuit doubles approximately every two years.

DESY


Friday, June 17, 2016

Physicists measured something new in the radioactive decay of neutrons



The experiment inspired theorists; future ones could reveal new physics.

A physics experiment performed at the National Institute of Standards and Technology (NIST) has enhanced scientists’ understanding of how free neutrons decay into other particles. The work provides the first measurement of the energy spectrum of photons, or particles of light, that are released in the otherwise extensively measured process known as neutron beta decay. The details of this decay process are important because, for example, they help to explain the observed amounts of hydrogen and other light atoms created just after the Big Bang.

Published in Physical Review Letters, the findings confirm physicists’ big-picture understanding of the way particles and forces work together in the universe—an understanding known as the Standard Model. The work has stimulated new theoretical activity in quantum electrodynamics (QED), the modern theory of how matter interacts with light. The team’s approach could also help search for new physics that lies beyond the Standard Model.

Neutrons are well known as one of the three kinds of particles that form atoms. Present in all atoms except the most common form of hydrogen, neutrons together with protons form the atomic nucleus. However, “free” neutrons not bound within a nucleus decay in about 15 minutes on average. Most frequently, a neutron transforms through the beta decay process into a proton, an electron, a photon, and the antimatter version of the neutrino, an abundant but elusive particle that rarely interacts with matter.
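
In symbols, and written in the standard textbook form rather than as a formula taken from the team's paper, the radiative decay being studied is

    \[ n \;\to\; p + e^{-} + \bar{\nu}_{e} + \gamma \]

where the photon, gamma, is the rarely examined extra piece whose energy spectrum the experiment set out to measure.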

The photons from beta decay are what the research team wanted to explore. These photons have a range of possible energies predicted by QED, which has worked very well as a theory for decades. But no one had actually checked this aspect of QED with high precision.

“We weren’t expecting to see anything unusual,” said NIST physicist Jeff Nico, “but we wanted to test QED’s predictions very precisely in a way no one has done before.”

Nico and his colleagues, who represent nine research institutions, performed their measurements at the NIST Center for Neutron Research (NCNR). It produces an intense beam of slow-moving neutrons whose photon emissions can be detected with the same setup used for earlier precision measurements of the neutron’s lifetime.

The team measured two aspects of neutron decay: the energy spectrum of the emitted photons, and the branching ratio, which indicates how frequently decays are accompanied by photons above a specific energy. The results of this effort gave them a branching ratio measurement more than twice as accurate as the previous value, and the first measurement of the energy spectrum.

“Everything we found was consistent with the predominant QED calculations,” Nico said. “We got quite a good match with theory on the energy spectrum, and we reduced the uncertainty in the branching ratio.”

According to Nico, the results provided specific information that theoretical physicists are already using to further develop QED to provide more detailed descriptions of neutron beta decay.

The results serve as a needed check on the Standard Model, said Nico, and validate the team’s experimental approach as a way to go beyond it. With better detectors, the approach could be used to search for so-called “right-handed” neutrinos, which have not yet been detected in nature, and for potential time-reversal symmetry violations, which could help explain why there is much more matter than antimatter in the universe.

Paper: M.J. Bales, R. Alarcon, C.D. Bass, E.J. Beise, H. Breuer, J. Byrne, T.E. Chupp, K.J. Coakley, R.L. Cooper, M.S. Dewey, S. Gardner, T.R. Gentile, D. He, H.P. Mumm, J.S. Nico, B. O'Neill, A.K. Thompson and F.E. Wietfeldt (RDK II Collaboration). Precision measurement of the radiative beta decay of the free neutron. Physical Review Letters. June 14, 2016, DOI: 10.1103/PhysRevLett.116.242501

Tuesday, March 8, 2016

International research team achieves controlled movement of skyrmions



Basis for the utilization of skyrmions for application-related systems / Magnetic vortices as data storage media of the future


A joint research project being undertaken by Johannes Gutenberg University Mainz (JGU) and the Massachusetts Institute of Technology (MIT) has achieved a breakthrough in fundamental research in the field of potential future data storage technologies. The idea is that electronic storage units (bits) will not be stored on rotating hard disks as is currently standard practice but on a nanowire in the form of magnetic vortex structures, so-called skyrmions, using a process similar to that of a shift register. The magnetic skyrmion bits would be rapidly accessible, while storage density would be high and there would be improved energy efficiency. The project team managed, for the first time, to achieve targeted shifting of individual skyrmions at room temperature using electrical impulses. Their results have been published in the journal Nature Materials.
Magnetic skyrmions are special spin configurations that can occur in materials and particularly in thin layer structures when the inversion symmetry is broken. With regard to the systems that are of interest here, this means that a thin metal film with a non-symmetrical layer structure can be employed. In materials such as this, spin configurations that behave rather like a hair whorl can form. It can be just as difficult to eradicate a skyrmion as it can be to smooth out a hair whorl, a property that gives skyrmions enhanced stability.
An important characteristic of skyrmions is that they can exist in isolation in magnetic materials and generally do not tend to collide with the edge of a structure. This provides them with the unique ability to skirt any isolated defects or unevenness in the material with which other magnetic structures, such as domain walls, would collide. Skyrmions are therefore excellent candidates for use with magnetic shift registers, otherwise known as racetrack memory. Information could be encoded in the form of skyrmions and an electrical current could be employed to move them past fixed read/write heads. The process would be both rapid and completely independent of movable mechanical components and thus ideally suited for mobile applications.
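As a purely conceptual toy model (an illustration of the storage idea only, not the physics reported in Nature Materials), the racetrack can be pictured as a register of bits in which a skyrmion encodes a 1 and its absence a 0, and each current pulse shifts the whole pattern one position past a fixed read head:

    from collections import deque

    # Toy model of racetrack memory: bits encoded as skyrmion present (1) / absent (0).
    # Each current pulse shifts every bit one position along the wire past a fixed
    # read head at one end; a write head at the other end nucleates new skyrmions.
    class RacetrackToy:
        def __init__(self, bits):
            self.track = deque(bits)

        def current_pulse(self):
            """Shift the pattern one position and return the bit passing the read head."""
            return self.track.popleft()

        def write(self, bit):
            """Nucleate (1) or skip (0) a skyrmion at the write head."""
            self.track.append(bit)

    memory = RacetrackToy([1, 0, 1, 1, 0])
    memory.write(1)                                # append a new bit at the far end
    read_out = [memory.current_pulse() for _ in range(3)]
    print(read_out)                                # [1, 0, 1] -- read back in stored order
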
During the project, it was demonstrated that individual skyrmions can indeed be moved in a controlled manner along a magnetic wire, i.e., the so-called racetrack, by exposing them to brief electrical impulses at room temperature. In addition, new methods to describe their dynamics were developed and confirmed by experimentation. This work can thus be regarded as laying down the cornerstone for the future use of skyrmions in application-related systems.
"It is always great to see when a joint project quickly leads to exciting results. This is particularly true in this case, as we have been able to produce this journal article within just a year of entering into our cooperation agreement. It would never have come about had it not been for the close collaboration between Mainz University and MIT and the lively exchange of ideas," said Kai Litzius, co-author of the article. Litzius is on a stipendiary scholarship awarded by the Graduate School of Excellence "Materials Science in Mainz" (MAINZ) and is a member of the team headed by Professor Mathias Kläui.
"I'm delighted by the way we were able to work efficiently and continuously with groups at MIT. After receiving start-up funding through a joint project financed by the German Federal Ministry of Education and Research, we have been able to produce six joint publications since 2014, partly as the result of several student visits to MIT," emphasized Kläui, a professor at the Institute of Physics and the director of the MAINZ Graduate School of Excellence.
Establishment of the MAINZ Graduate School was approved through the Excellence Initiative by the German Federal and State Governments to Promote Science and Research at German Universities in 2007 and its funding was extended in the second round in 2012. It consists of work groups from Johannes Gutenberg University Mainz, TU Kaiserslautern, and the Max Planck Institute for Polymer Research in Mainz. One of its focal research areas is spintronics, where cooperation with leading international partners plays an important role.

Sunday, March 6, 2016

A Proposed Superconductivity Theory Receives Exclusive Experimental Confirmation


Superconductivity – a quantum phenomenon in which metals below a certain temperature carry electrical current with no loss or resistance – is one of the most exciting problems in physics, and it has drawn enormous brain power and resources worldwide since its discovery a little over a century ago. Many prominent theorists, Nobel laureates among them, have proposed theories for new classes of superconducting materials discovered several decades later, followed by teams of experimentalists working furiously to provide solid evidence for these theories. More than 100,000 research papers have been published on the new materials.
One such theory began with a proposal in 1989 by Chandra Varma while at Bell Laboratories, NJ, and now a distinguished professor of physics and astronomy at the University of California, Riverside. At UC Riverside, he further developed the theory and proposed experiments to confirm or refute it.  That theory has now been experimentally proven to be a consistent theory by physicists in China and Korea.
The experimental results, published in Science Advances today (March 4), now allow for a clear discrimination among theories of high-temperature superconductivity, favoring one and ruling others out. The research paper is titled “Quantitative determination of pairing interactions for high-temperature superconductivity in cuprates.”
“At the core of most models for the high-temperature superconductivity in cuprates lies the idea of the electron-electron pairing,” said Lev P. Gor’kov, a theoretical physicist at Florida State University who is renowned for making the most important formal advance in the superconductivity field in 1958, while at the Soviet Academy of Sciences. “The paper by Prof. Chandra Varma and his colleagues from China and Korea is the daring and successful attempt to extract the relevant electron-electron interactions directly from experiment. Their elegant approach opens new prospects also for studying the superconductivity mechanisms in other systems with strongly correlated electrons.”
A boon to technology
Superconductors are used in magnetic-imaging devices in hospitals. They are used, too, for special electrical switches.  The electromagnets used in the Large Hadron Collider at CERN use superconducting wire.  Large-scale use of superconductivity, however, is not feasible presently because of cost.  If superconductors could be made cheaply and at ordinary temperatures, they would find wide use in power transmission, energy storage and magnetic levitation.
First discovered in the element mercury in 1911, superconductivity is said to occur when electrical resistance in a solid vanishes when that solid is cooled below a characteristic temperature, called the transition temperature, which varies from material to material. Transition temperatures tend to be close to 0 K, or -273 C. At even slightly higher temperatures, the materials lose their superconducting properties; indeed, at room temperature most superconductors are very poor conductors. In 1986, physicists Georg Bednorz and Alexander Müller discovered a class of high-temperature superconductors called cuprates, so named because they all contain copper and oxygen. These new materials have properties that have raised profound new questions. Why these high-temperature superconductors behave as they do has remained unknown.
A brief history lesson
The superconductivity problem was considered solved by a theory proposed in 1957: the BCS theory of superconductivity. This comprehensive theory, developed by physicists John Bardeen, Leon Cooper and John Schrieffer (the first letter of their last names gave the theory its name), explained the behavior of superconducting materials as resulting from electrons forming pairs, with each pair being strongly correlated with other pairs, allowing them all to function coherently as a single entity.  Concepts in the BCS theory and its elaborations have influenced all branches of physics, ranging from elementary particle physics to cosmology.
“But in the cuprates, some of the founding concepts of the physics of interacting particles, such as the quasi-particle concept, were found to be invalid,” Varma said. “The physical properties of superconductors  above the superconducting transition temperature were more remarkable than the superconductivity itself. Subsequently, almost all the leading theoretical physicists in the world proposed different directions of ideas and calculations to explain these properties as well as superconductivity.  But very few predictions stemming from these ideas were verified, and specific experiments were not in accord with them.”
A quasi-particle is a packet of energy and momentum that can, in some respects, be regarded as a particle. It is a physical concept, which allows detailed calculation of properties of matter.
In 1989, while at Bell Laboratories, Varma and some collaborators proposed that the breakdown of the quasi-particle concept occurs due to a simple form of quantum-critical fluctuations – fluctuations which are quantum in nature and occur when symmetry of matter breaks down, such as at the phase transition critical point near absolute zero of temperature.
In physics, symmetry is said to occur when some change in orientation or movement by any amount leaves the physical situation unchanged (empty space, for example, has symmetry because it is everywhere the same). Relativity, quantum theory, crystallography and spectroscopy involve notions of symmetry.
“It was at this time that we introduced the concept of marginal Fermi-liquids or marginal quasi-particles through which various properties of superconductivity were explained,” Varma said. “We also provided some definitive predictions, which could only be tested in 2000 by a new technique called Angle Resolved Photoemissions or ARPES.”
Varma explained that in 1989 there was also no evidence that the same quantum-critical fluctuations promoted the superconductivity transition.
“There was no theory for the cause of such quantum-critical fluctuations or for the symmetry which must change near absolute zero to realize them,” he said.
In 1997, Varma proposed transitions to a new class of symmetries, in which the direction of time was picked by the direction of currents. These currents, he suggested, begin to spontaneously flow in each microscopic cell of the cuprates.  Since 2004, a group of French scientists at Saclay has been reporting evidence of such symmetries in every high-temperature superconducting compound it could investigate with neutron scattering.  Several other kinds of experiments by other research groups are in accord also.
Varma cautioned that some unresolved issues persist. His group is proposing experiments to address them.
In 2003, the year Varma moved to UC Riverside, he formulated a theory for how quantum fluctuations coupled to electrons give rise to the observed symmetry in superconductivity.
“This was a completely new kind of coupling,” he said. “It had very remarkable and unusual predictions for experiments designed to decipher such a coupling.”
ARPES to the rescue
In 2010, Varma became aware of high-quality laser-based ARPES in a laboratory at the Institute of Physics in the Chinese Academy of Sciences, Beijing, China. A collaboration with physicist Xingjiang Zhou at the institute ensued, with numerical analysis of the data being done by Han-Yong Choi, a physicist at SungKyunKwan University, Korea, who, in the past, worked with Varma at UCR.
Zhou’s team made several improvements in the ARPES technique, which ensured that the quality of data was high and reproducible enough to have full confidence.
“The data obtained and the analysis we describe in our paper are conclusive on the most important issues relevant to superconductivity,” Varma said. “Our conclusions – namely, that the quantum fluctuations promoting superconductivity are the same as those that lead to the marginal Fermi-liquid and they are consistently of the form predicted, being stretched exponentially in time in a scale-invariant way relative to stretching in space – also have no theoretical approximations.  They are as precise as the quality of the data allows.  They also unambiguously address the question of symmetry of superconductivity.  Further, they rule out many of the alternative ideas that have been proposed on this problem in the last thirty years since the original discovery.  Our observations of the breakdown of time-reversal symmetry and of the fluctuations that follow complete major aspects of our understanding of these problems.”
Varma, Zhou and Choi were joined in the research by Jin Mo Bok (first author of the paper) and Jong Ju Bae at SungKyunKwan University, Korea; and Wentai Zhang, Junfeng He, Yuxiao Zhang and Li Yu at the Institute of Physics, Chinese Academy of Sciences, Beijing, China. Varma was partially supported by a grant from the National Science Foundation.
About Varma
After he received his doctoral degree in physics from the University of Minnesota, Varma joined Bell Labs in 1969, one of the most coveted positions at the time for young physicists anywhere in the world. The following year, he became a permanent member of the laboratory.  He was the head of the theoretical physics department at Bell Labs from 1983 to 1987, and was awarded the Distinguished Member of Research in 1988.  He has served as a visiting professor at the University of Chicago, Stanford University, MIT, the College de France in Paris, France, and at CNRS, France; and a senior visiting fellow at Cavendish Lab at Cambridge University.  In 2000, he was selected to the Lorentz Visiting Chair at Leiden University, the Netherlands.  In 2009, he held a Miller Professorship at UC Berkeley.
He is a fellow of the American Physical Society and of the American Association for the Advancement of Science. A member of the World Academy of Sciences, he is the recipient of the Alexander Humboldt Prize and the Bardeen Prize for theoretical advances in superconductivity.
Varma has published nearly 200 scientific papers, which have in all about 18,000 citations. He has made seminal contributions to the theory of glasses, to Kondo and mixed valence and heavy-fermion phenomena, novel forms of superconductivity, charge density waves, co-existing magnetic and superconducting states, the Higgs boson in superconductors, quantum criticality, singular Fermi-liquids and associated superconductivity.

Wednesday, March 2, 2016

Discovery of new particle: 'four-flavored' tetraquark

The new particle is the first tetraquark to contain four quarks of different "flavors." | Photo by Fermilab


Research led by Indiana University physicist Daria Zieminska has resulted in the first detection of a new form of elementary particle: the "four-flavored" tetraquark.

Zieminska, a senior scientist in the IU Bloomington College of Arts and Sciences' Department of Physics, is a lead member of the team responsible for the particle's detection by the DZero Collaboration at the U.S. Department of Energy's Fermi National Accelerator Laboratory (Fermilab), which announced the discovery Feb. 25.

She also delivered the first scientific seminar on the particle and is an author on a paper submitted to Physical Review Letters, the premier journal in physics, describing the tetraquark's observation.

"For most of the history of quarks, it's seemed that all particles were made of either a quark and an antiquark, or three quarks; this new particle is unique -- a strange, charged beauty," said Zieminska, who has been a member of the DZero experiment since the project's establishment in 1985. "It's the birth of a new paradigm. Particles made of four quarks -- specifically, two quarks and two antiquarks -- is a big change in our view of elementary particles."

The results could also affect scientists' understanding of "quark matter," the hot, dense material that existed moments after the Big Bang, and which may still exist in the super-dense interior of neutron stars.

Quarks are the building blocks that form subatomic particles, the most familiar of which are protons and neutrons, each composed of three quarks. There are six types, or "flavors," of quarks: up, down, strange, charm, bottom and top. Each of these also has an antimatter counterpart.

A tetraquark is a group of four quarks, the first evidence for which was recorded by scientists on the Belle experiment in Japan in 2008. But the new tetraquark is the first quark quartet to contain four different quark flavors: up, down, strange and bottom.

Currently, Zieminska leads the "heavy flavor" group of the DZero experiment, which encompasses the study of all particles containing one or more "heavy quarks," including the new tetraquark, dubbed X(5568) for its mass of 5,568 megaelectronvolts, nearly six times the mass of a proton. The DZero experiment is led by Dmitri Denisov, a staff scientist at the U.S. Department of Energy's Fermilab.

"Daria was the lead person on the tetraquark observation and performed calculations, cross-checking and other work required to answer the hundreds of questions of the rest of the team," said Denisov, co-spokesman for the DZero experiment. "She was an active participant in the design and construction of the experiment and in the collection of the data."

A chart compares mesons, composed of two quarks; baryons, composed of three quarks; and the lesser understood tetraquark, composed of four quarks. | Photo by Fermilab

The DZero experiment is also responsible for other fundamental physics discoveries, including, together with the Collider Detector at Fermilab (CDF) experiment, the first evidence of the elusive Higgs boson decaying into bottom quarks.

Other IU scientists engaged in the DZero project include the late Andrzej Zieminski, former professor of physics at IU Bloomington, who also joined the project in 1985, and Rick Van Kooten, IU vice provost for research, who joined in 2002 during "phase 2" of the project, which involved upgrades to the detector partially constructed at IU. Hal Evans, professor, and Sabine Lammers, associate professor, both at IU, also contributed to the upgraded detector.

DZero is one of two experiments collecting data from Fermilab's Tevatron proton-antiproton collider, once the most powerful particle accelerator in the world, officially retired in 2011. Zieminska and colleagues uncovered the existence of X(5568) based on analysis of billions of previously recorded events from these collisions.

As with other discoveries in physics, Zieminska said the new tetraquark’s discovery was a surprise. Alexey Drutskoy, a colleague at Russia's National Research Nuclear University, spotted indications of the tetraquark signal in summer 2015, after which Zieminska joined him in the hunt. Only after performing multiple cross-checks, in collaboration with Alexey Popov, another Russian colleague, did the team confirm they were observing evidence for a new particle.

Although nothing in nature forbids the formation of a tetraquark, four-quark states are rare and not nearly as well understood as two- and three-quark states. Zieminska and colleagues plan to deepen their understanding of the tetraquark by measuring various properties of the particle, such as the ways it decays or how much it spins on its axis.

The discovery of the tetraquark also comes on the heels of the first observation of a pentaquark -- a five-quark particle -- announced last year by CERN's LHCb experiment at the Large Hadron Collider.

Zieminska is also a member of the ATLAS Experiment at CERN, the European Organization for Nuclear Research.

A total of 75 institutions from 18 countries are members of the DZero Collaboration.

Indiana University


Tuesday, February 9, 2016

Weighing the lightest particle


Neutrinos are everywhere. Every second, 100 trillion of them pass through your body unnoticed, hardly ever interacting. Though exceedingly abundant, they are the lightest particles of matter, and physicists around the world are attempting the difficult challenge of measuring their mass.   

For a long time, physicists thought neutrinos were massless. This belief was overturned by the discovery that neutrinos oscillate between three flavors: electron, muon and tau. This happens because each flavor contains a mixture of three mass types, neutrino-1, neutrino-2 and neutrino-3, which travel at slightly different speeds.

According to the measurements taken so far, neutrinos must weigh less than 2 electronvolts (a minute fraction of the mass of the tiny electron, which weighs 511,000 electronvolts). A new generation of experiments is attempting to lower this limit—and possibly even identify the actual mass of this elusive particle.

Where did the energy go?

Neutrinos were first proposed by the Austrian-born theoretical physicist Wolfgang Pauli to resolve a problem with beta decay. In the process of beta decay, a neutron in an unstable nucleus transforms into a proton while emitting an electron. Something about this process was especially puzzling to scientists. During the decay, some energy seemed to go missing, breaking the well-established law of energy conservation.

Pauli suggested that the disappearing energy was slipping away in the form of another particle. This particle was later dubbed the neutrino, or “little neutral one,” by the Italian physicist Enrico Fermi.

Scientists are now applying the principle of energy conservation to direct neutrino mass experiments. By very precisely measuring the energy of electrons released during the decay of unstable atoms, physicists can deduce the mass of neutrinos.

“The heavier the neutrino is, the less energy is left over to be carried by the electron,” says Boris Kayser, a theoretical physicist at Fermilab. “So there is a maximum energy that an electron can have when a neutrino is emitted.”
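
Written as a simple energy balance (standard decay kinematics, not a result of the experiments described here), with E_0 the total energy released in the decay,

    \[ E_{e}^{\max} \approx E_{0} - m_{\nu} c^{2}, \]

so a heavier neutrino pushes the maximum electron energy slightly below the value expected for a massless one.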

These experiments are considered direct because they rely on fewer assumptions than other neutrino mass investigations. For example, physicists measure mass indirectly by observing neutrinos’ imprints on other visible things such as galaxy clustering.

Detecting the kinks

Of the direct neutrino mass experiments, KATRIN, which is based at the Karlsruhe Institute of Technology (KIT) in Germany, is the closest to beginning its search.

“If everything works as planned, I think we'll have very beautiful results in 2017,” says Guido Drexlin, a physicist at KIT and co-spokesperson for KATRIN.

KATRIN plans to measure the energy of the electrons released from the decay of the radioactive isotope tritium. It will do so by using a giant tank tuned to a precise voltage that allows only electrons above a specific energy to pass through to the detector at the other side. Physicists can use this information to plot the rate of decays at any given energy.

The mass of a neutrino will cause a disturbance in the shape of this graph. Each neutrino mass type should create its own kink. KATRIN, with a peak sensitivity of 0.2 electronvolts (a factor of 100 better than previous experiments), will look for a “broad kink” that physicists can use to calculate the average neutrino mass.
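
A minimal numerical sketch of that idea follows; the masses and mixing weights are made-up values chosen only to make the kinks visible, and the endpoint energy is approximate, so none of these numbers are KATRIN parameters or measured quantities. Near the endpoint the decay rate is roughly a weighted sum over the mass states, and each state switches off once there is no longer enough leftover energy to create it, which is what produces the kinks.

    import numpy as np

    # Illustrative beta spectrum near the endpoint E0 for a mixture of neutrino
    # mass states. Masses and weights are invented for illustration only.
    E0 = 18_575.0                           # approximate tritium endpoint energy in eV
    masses = np.array([0.0, 1.0, 3.0])      # toy neutrino masses in eV
    weights = np.array([0.7, 0.2, 0.1])     # toy mixing probabilities (sum to 1)

    def spectrum(E):
        """Approximate differential decay rate near the endpoint (arbitrary units)."""
        eps = E0 - E                        # energy left over for the neutrino
        rate = np.zeros_like(E)
        for m, w in zip(masses, weights):
            allowed = eps >= m              # a mass state is emitted only if eps >= its mass
            rate[allowed] += w * eps[allowed] * np.sqrt(eps[allowed] ** 2 - m ** 2)
        return rate

    E = np.linspace(E0 - 10.0, E0, 1000)
    print(spectrum(E)[:5])                  # the shape of this curve carries the mass information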

Another tritium experiment, Project 8, is attempting a completely different method to measure neutrino mass. The experimenters plan to detect the energy of each individual electron ejected from a beta decay by measuring the frequency of its spiraling motion in a magnetic field. Though still in the early stages, it has the potential to go beyond KATRIN’s sensitivity, giving physicists high hopes for its future.

“KATRIN is the furthest along—it will come out with guns blazing,” says Joseph Formaggio, a physicist at MIT and Project 8 co-spokesperson. “But if they see a signal, the first thing people are going to want to know is whether the kink they see is real. And we can come in and do another experiment with a completely different method.”

Cold capture

Others are looking for these telltale kinks using a completely different element, holmium, which decays through a process called electron capture. In these events, an electron in an unstable atom combines with a proton, turning it into a neutron while releasing a neutrino.

Physicists are measuring the very small amount of energy released in this decay by enclosing the holmium source in microscopic detectors that are operated at very low temperatures (typically below minus 459.2 degrees Fahrenheit). Each holmium decay leads to a tiny increase of the detector’s temperature (about 1/1000 degrees Fahrenheit).
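
In essence this is calorimetry: the temperature jump is simply the deposited decay energy divided by the detector's heat capacity (a generic relation, not the collaborations' specific analysis),

    \[ \Delta T = \frac{E_{\mathrm{decay}}}{C(T)}, \]

which is why the detectors are made microscopic and cooled so aggressively: the smaller the heat capacity C(T), the larger, and hence more measurable, the temperature jump from a fixed decay energy.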

“To lower the limit on the electron neutrino mass, you need a good thermometer that can measure these very small changes of temperature with high precision,” says Loredana Gastaldo, a Heidelberg University physicist and spokesperson for the ECHo experiment.  

There are currently three holmium experiments, ECHo and HOLMES in Europe and NuMECs in the US, which are in various stages of testing their detectors and producing isotopes of holmium.

The holmium and tritium experiments will help lower the limit on how heavy neutrinos can be, but it may be that none will be able to definitively determine their mass. It will likely require a combination of both direct and indirect neutrino mass experiments to provide scientists with the answers they seek—or, physicists might even find completely unexpected results.

“Don't bet on neutrinos,” Formaggio says. “They’re kind of unpredictable.”

Symmetry Magazine

Tuesday, February 2, 2016

Solving Hard Quantum Problems: Everything is Connected





Quantum objects cannot be understood simply as the sum of their parts. This is what makes quantum calculations so complicated. Scientists at TU Wien (Vienna) have now calculated Bose-Einstein condensates, revealing the secrets of the particles’ collective behaviour.

 

Quantum systems are extremely hard to analyse if they consist of more than just a few parts. It is not difficult to calculate a single hydrogen atom, but in order to describe an atom cloud of several thousand atoms, it is usually necessary to use rough approximations. The reason for this is that quantum particles are connected to each other and cannot be described separately. Kaspar Sakmann (TU Wien, Vienna) and Mark Kasevich (Stanford, USA) have now shown in an article published in “Nature Physics” that this problem can be overcome. They succeeded in calculating effects in ultra-cold atom clouds which can only be explained in terms of the quantum correlations between many atoms. Such atom clouds are known as Bose-Einstein condensates and are an active field of research.

Quantum Correlations

Quantum physics is a game of luck and randomness. Initially, the atoms in a cold atom cloud do not have a predetermined position. Much like a die whirling through the air, where the number is yet to be determined, the atoms are located at all possible positions at the same time. Only when they are measured are their positions fixed. “We shine light on the atom cloud, which is then absorbed by the atoms”, says Kaspar Sakmann. “The atoms are photographed, and this is what determines their position. The result is completely random.”

There is, however, an important difference between quantum randomness and a game of dice: if different dice are thrown at the same time, they can be seen as independent from each other. Whether or not we roll a six with die number one does not influence the result of die number seven. The atoms in the atom cloud, on the other hand, are quantum-mechanically connected. It does not make sense to analyse them individually; they are one big quantum object. Therefore, the result of every position measurement of any atom depends on the positions of all the other atoms in a mathematically complicated way.

“It is not hard to determine the probability that a particle will be found at a specific position”, says Kaspar Sakmann. “The probability is highest in the centre of the cloud and gradually diminishes towards the outer fringes.” In a classically random system, this would be all the information that is needed. If we know that in a dice roll, any number has the probability of one sixth, then we can also determine the probability of rolling three ones with three dice. Even if we roll five ones consecutively, the probability remains the same the next time. With quantum particles, it is more complicated than that.

“We solve this problem step by step”, says Sakmann. “First we calculate the probability of the first particle being measured on a certain position. The probability distribution of the second particle depends on where the first particle has been found. The position of the third particle depends on the first two, and so on.” In order to be able to describe the position of the very last particle, all the other positions have to be known. This kind of quantum entanglement makes the problem mathematically extremely challenging.
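
A schematic illustration of that chain-rule procedure is sketched below; it is a generic toy example, not the actual many-body calculation reported in Nature Physics, and the Gaussian "attraction" toward earlier positions is an arbitrary choice made purely to show how each new draw can depend on all previous ones.

    import numpy as np

    # Toy illustration of sampling particle positions one by one from conditional
    # distributions, P(x1) * P(x2|x1) * P(x3|x1,x2) * ...  The coupling of each draw
    # to the mean of the earlier draws is invented for illustration only.
    rng = np.random.default_rng(seed=0)

    def sample_positions(n_particles, coupling=0.5, width=1.0):
        positions = []
        for _ in range(n_particles):
            if positions:
                mean = coupling * np.mean(positions)   # depends on all earlier particles
            else:
                mean = 0.0                             # first particle: bare cloud profile
            positions.append(rng.normal(loc=mean, scale=width))
        return np.array(positions)

    snapshot = sample_positions(1000)                  # one simulated "photograph" of the cloud
    print(snapshot[:5], snapshot.mean())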

Only Correlations Can Explain the Experimental Data 

But these correlations between many particles are extremely important – for example for calculating the behaviour of colliding Bose-Einstein condensates. “The experiment shows that such collisions can lead to a special kind of quantum waves. At certain positions we find many particles, at adjacent positions we find none”, says Kaspar Sakmann. “If we consider the atoms separately, this cannot be explained. Only if we take the full quantum distribution into account, with all its higher correlations, can these waves be reproduced by our calculations.”

Other phenomena have also been calculated with the same method, for instance Bose-Einstein condensates that are stirred with a laser beam so that little vortices emerge – another typical quantum many-particle effect. “Our results show how important these correlations are and that it is possible to include them in quantum calculations, in spite of all the mathematical difficulties”, says Sakmann. With certain modifications, the approach can be expected to be useful for many other quantum systems as well.


Original paper: