
Looking through the fog

Read time: 4 mins
Bengaluru
30 Jul 2021

Image by Maksim Šišlo via Unsplash

In foggy weather, water droplets in the air scatter light from any source, resulting in poor visibility. Imaging scenes in such conditions becomes difficult. Beacons of light are hard to observe from a distance because the light from these beacons is scattered before it reaches the intended observer. Yet it is precisely in such weather that observing beacons matters most, for example, on aeroplane runways during takeoff and landing, in maritime navigation, on railways, and for vehicular traffic on highways.

The physics of scattering shows that only a small fraction of photons from a light source retain their original direction of motion; the water droplets in fog deflect most of the light into random angles. Scientists have previously tried to combine the physics of scattering with computer algorithms that process the resulting data to improve image quality. However, the improvements are not stark in some cases, and the algorithms require processing large volumes of data, demanding ample storage and significant processing time.
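To see why so few photons survive unscattered, the usual textbook description is an exponential fall-off of the ballistic (unscattered) component with distance. The expression below is the standard Beer-Lambert form, shown only as an illustration; the symbols are not taken from the study.

```latex
% Illustrative only: standard Beer-Lambert attenuation of the unscattered
% (ballistic) light over a path of length L through fog, where \mu_s is the
% scattering coefficient of the fog. Not an equation quoted from the paper.
I_{\mathrm{ballistic}}(L) = I_0 \, e^{-\mu_s L}
```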

A team of researchers has now offered a solution for improving image quality without heavy computation. The team consists of researchers from the Raman Research Institute (RRI), Bengaluru; the Space Applications Centre, Indian Space Research Organisation, Ahmedabad; Shiv Nadar University, Gautam Buddha Nagar; and Université Rennes and Université Paris-Saclay, CNRS, France. The study, published in the journal OSA Continuum, was partially funded by the Department of Science and Technology, Ministry of Science and Technology, Government of India. The technique consists of modulating the light source and demodulating the captured signal at the observer's end.

The researchers demonstrated the technique by conducting extensive experiments on foggy winter mornings at Shiv Nadar University, Gautam Buddha Nagar, Uttar Pradesh. They chose ten red LED lights as the light source and modulated it by varying the current flowing through the LEDs at a rate of about 15 cycles per second.
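As a rough sketch of this modulation step, the snippet below generates a 15 Hz on-off drive signal of the kind one could feed to the LED current. The waveform shape, sampling rate, and duration are assumptions for illustration, not details from the paper.

```python
import numpy as np

# Illustrative only: a 15 Hz on-off drive signal for the LED current.
# The sampling rate, duration, and square waveform are assumptions,
# not details taken from the study.
f_mod = 15.0      # modulation frequency in Hz (from the article)
fs = 600.0        # assumed sampling rate of the drive electronics, in Hz
duration = 2.0    # seconds of signal to generate

t = np.arange(0.0, duration, 1.0 / fs)
drive = (np.sin(2 * np.pi * f_mod * t) > 0).astype(float)  # 1 = LED on, 0 = off
```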

The researchers placed the camera at a distance of 150 metres from the LEDs. The camera captured the images and transmitted them to a desktop computer. Computer algorithms then used the knowledge of the modulation frequency to extract the characteristics of the source, a process called ‘demodulation’. The researchers also showed that if they demodulated the image at a rate different from the one at which they had modulated the light source, the computer could not determine the intensity of the source image and produced gibberish images instead.
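A minimal sketch of what such a demodulation could look like, assuming the camera records a stack of frames at a known frame rate and the algorithm picks out, pixel by pixel, the intensity component oscillating at the modulation frequency. The function below is a generic lock-in style demodulation written for illustration; the array shapes, frame rate, and exact algorithm are assumptions, not the researchers' implementation.

```python
import numpy as np

def demodulate(frames: np.ndarray, frame_rate: float, f_mod: float) -> np.ndarray:
    """Estimate, per pixel, the strength of intensity variations at f_mod.

    frames: array of shape (n_frames, height, width) recorded by the camera.
    frame_rate: camera frame rate in frames per second (assumed known).
    f_mod: modulation frequency of the light source in Hz (about 15 Hz here).
    """
    n_frames = frames.shape[0]
    t = np.arange(n_frames) / frame_rate
    # Lock-in style demodulation: multiply each frame by reference sinusoids
    # at the modulation frequency and average over time. Pixels lit by the
    # modulated LEDs stand out, while unmodulated background light (including
    # scattered ambient light) averages away.
    ref_cos = np.cos(2 * np.pi * f_mod * t)[:, None, None]
    ref_sin = np.sin(2 * np.pi * f_mod * t)[:, None, None]
    in_phase = (frames * ref_cos).mean(axis=0)
    quadrature = (frames * ref_sin).mean(axis=0)
    return np.sqrt(in_phase**2 + quadrature**2)
```

In a sketch like this, demodulating with a reference frequency that does not match the source's modulation makes the reference and the signal drift out of step, so the averages tend toward zero, which is consistent with the gibberish images the researchers describe.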

The team saw a marked improvement in the image quality using the modulation-demodulation technique. The time the computer takes to execute the process depends on the image’s size.

“For a 2160 × 2160 image, the computational time is about 20 milliseconds,” shares Bapan Debnath, PhD scholar at RRI and a co-author of the study.

That is roughly the size of the images containing the LEDs in this study. His colleagues had estimated this processing rate in 2016.

The team repeated the experiment a few times and observed the improvement each time. On one occasion, when the fog varied in intensity during the observation, they did not record a marked improvement in image quality. A strong wind was blowing that day, and they observed trails of fog drifting across the scene. The density of water droplets in the air changed with time, which made the modulation-demodulation technique less effective.

Next, the researchers changed the experimental setup. They used an external surface, a piece of cardboard placed 20 centimetres from the LEDs, to reflect the light towards the camera. The distance between the cardboard and the camera was 75 metres. The modulated light reflected off the cardboard, travelled through the fog, and was then captured by the camera. The researchers showed that their technique still significantly improved the quality of the resulting image.

The team then investigated their technique in the presence of an external source of light, experimenting in sunny conditions. They kept the cardboard and the camera 150 metres apart and used a plastic reflector to direct sunlight into the camera. Even then, after demodulating the source signal, the image quality was high enough to distinguish the LEDs from the strongly reflected sunlight.

The researchers have made a strong case for improving image quality in foggy conditions. The setup is inexpensive, requiring only a few LEDs and an ordinary desktop computer, which can execute the technique within a second. The method could improve aeroplane landings by giving the pilot a good view of the beacons on the runway, significantly better than relying only on reflected radio waves, as is presently the case. In rail, sea, and road transportation, the technique could help reveal obstacles in the path that would otherwise be hidden by fog. Spotting lighthouse beacons could also become easier, though more research is needed to demonstrate the technique's effectiveness in such real-life conditions. Given the significant possibilities, the team is presently investigating whether the technique can be applied to moving sources.


This article has been run past the researchers, whose work is covered, to ensure accuracy.

Editor's Note: This article was first published here.