
Measuring the size of a very short-lived particle

November 3, 2017
Read time: 5 mins

Photo: Prof. Ananthanarayan

“One meditates on the omniscient, primordial, the controller, smaller than the atom, yet the maintainer of everything; whose form is inconceivable, resplendent like the Sun and totally transcendental to material nature,” reads a verse of the Bhagavad Gita, written between the 5th and 3rd centuries BC.

Atomism, the idea that every object is made of discrete, indivisible particles, had existed for centuries. With the discovery of the atomic nucleus in the early twentieth century, we learnt that atoms were, after all, divisible: they consist of protons and neutrons packed into a nucleus, with electrons surrounding it. The picture grew murkier still with the advent of quantum physics. In experiments like those at the Large Hadron Collider at the European Organization for Nuclear Research (CERN), Switzerland, scientists collide particles at very high energies, much like smashing a watch to study its components. Such high-energy collisions allow scientists to probe the structure and components of the particles that make up everything. By smashing protons together, these experiments revealed a zoo of particles, such as quarks, neutrinos, mesons and pions, and suggested that protons and neutrons are themselves built from elementary particles, that is, particles that cannot be divided into anything smaller.

Today, the Standard Model of particle physics, a theoretical framework that classifies the elementary particles, is the best theory we have to explain how this myriad of particles behaves in nature and what laws govern them. However, the model is far from complete: it is yet to fully explain the force of gravity, and it lacks the precision of some classical theories, like general relativity and thermodynamics, which keeps research in the field active.

In a new breakthrough, researchers from the Indian Institute of Science (IISc), Bangalore, the Horia Hulubei National Institute for Physics and Nuclear Engineering, Romania, and the Physical Research Laboratory, Ahmedabad (one of the authors has since moved to the University of Delhi), have developed a theoretical framework to determine, with high precision, the size of the pion, the particle that binds the constituents of an atom's nucleus together. Their paper, titled ‘The electromagnetic charge radius of the pion at high precision’, was published in the reputed journal Physical Review Letters on 28 September 2017.

“The pion is not an elementary particle, unlike an electron, which is elementary. A pion is made of a quark and an antiquark pair, which are elementary particles,” explains Prof. B Ananthanarayan, Professor and Chairman at the Centre for High Energy Physics (CHEP) at IISc and a co-author of the study.

As per our current understanding, gravity, electromagnetism, the weak nuclear force and the strong nuclear force are the four fundamental forces that govern our Universe. Although gravity remains outside its realm, the Standard Model describes the other three as arising from the exchange of particles collectively called ‘gauge bosons’. The photon governs electromagnetism and is responsible for electric and magnetic fields and for light. The W and Z bosons dictate the weak nuclear force, which is responsible for radioactivity. And a set of eight gluons gives rise to the strong nuclear force that holds protons and neutrons together in the nucleus of an atom. A pion is made of a quark and an antiquark bound together by gluons. Its existence was predicted in the 1930s by the Nobel laureate Hideki Yukawa.

“As it’s a composite particle made up of constituents, a pion is not a point particle, and its diameter needs to be determined precisely. It has an approximate effective diameter of 1.3 femtometres (a femtometre is a quadrillionth of a metre, or 10⁻¹⁵ metres), which is significantly larger than the diameter of a proton. But with a mass of 139 MeV/c², a pion is much lighter than a proton, whose mass is around 938 MeV/c²,” explains Prof. Ananthanarayan. The pion is also extremely short-lived, sometimes lasting only a billionth of a nanosecond before quickly decaying into other particles, like muons.
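To put those figures in perspective, here is the simple arithmetic behind them (an illustrative back-of-the-envelope check, not taken from the paper; the symbols d_π, m_π and m_p for the pion diameter and the pion and proton masses are introduced here only for convenience):

\[
  d_{\pi} \approx 1.3\ \text{fm} = 1.3 \times 10^{-15}\ \text{m},
  \qquad
  \frac{m_{p}}{m_{\pi}} \approx \frac{938\ \text{MeV}/c^{2}}{139\ \text{MeV}/c^{2}} \approx 6.7,
\]

so a proton is roughly seven times as heavy as a pion.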

So how can we measure the diameter of a short-lived particle that moves at nearly the speed of light? A CERN experiment in 1986 made the most direct measurement of the pion’s size. In that experiment, a highly energetic beam of pions was collided with atomic electrons. Some of the electrons merely graze the pion, while others penetrate into the particle and probe it more deeply. “It’s like probing a ball of wool. If we use a very energetic probe, it can go deeper into the woollen ball, giving us more information about the ball. Similarly, we get better information at larger values of the momentum of the probe particle,” remarks Prof. Ananthanarayan.
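A standard textbook estimate (not a calculation from the study) illustrates why a more energetic probe resolves finer detail: by the uncertainty principle, resolving a distance Δx requires a probe momentum of at least about ħ/Δx, so femtometre-scale structure calls for momenta of a few hundred MeV/c:

\[
  \Delta x \, \Delta p \gtrsim \hbar
  \quad\Longrightarrow\quad
  p c \gtrsim \frac{\hbar c}{\Delta x} \approx \frac{197\ \text{MeV fm}}{1\ \text{fm}} \approx 200\ \text{MeV}.
\]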

Using data obtained from this and other experiments, and combining several mathematical and statistical tools available in quantum physics, the scientists determined the charge radius of the pion to be about 0.657 femtometres, with an uncertainty of 0.003 femtometres, corresponding to a diameter of about 1.3 femtometres. “We knew the size of the pion, but not at the required level of precision. Earlier, its size was known to 1.5% accuracy, whereas now, with the new work, we know it to an accuracy of 0.5%,” says Prof. Ananthanarayan.
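The quoted half-a-per-cent precision follows directly from the numbers in the result (simple arithmetic shown here for illustration, with r_π denoting the pion charge radius):

\[
  \frac{\delta r_{\pi}}{r_{\pi}} = \frac{0.003\ \text{fm}}{0.657\ \text{fm}} \approx 0.0046 \approx 0.5\%,
\]

roughly a threefold improvement over the earlier 1.5% uncertainty.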

According to Prof. Ananthanarayan, by setting a new benchmark, this work will push other theorists and experimentalists to improve their own designs and models to achieve the same accuracy and precision. “To study the strong nuclear force in atoms on the computer, scientists developed Lattice Gauge Theory around forty years ago; it computes the basic properties of such interactions. By achieving an accuracy of half a per cent, we are forcing the researchers studying such strong interactions using the lattice to also reach such levels of accuracy,” he explains.

But more importantly, he feels the study reveals fundamental details about the Universe’s building blocks, while also validating the mathematical and statistical tools that have been developed over the years to make sense of our Universe. And with increased precision in our experiments, we can make increasingly accurate predictions.

“This is a regime of physics which is fundamental in nature. The role of fundamental research is to push other frontiers of knowledge, and force experiments to always improve their precision and to make more accurate theoretical predictions”, signs off Prof. Ananthanarayan.