In a new study, researchers at the Indian Institute of Science (IISc) show how a brain-inspired image sensor can go beyond the diffraction limit of light to detect minuscule objects such as cellular components or nanoparticles that are invisible to current microscopes. Their technique, which combines optical microscopy with a neuromorphic camera and machine learning algorithms, is a major step forward in pinpointing objects smaller than 50 nanometers in size. The results are published in Nature Nanotechnology.
Since the invention of optical microscopes, scientists have strived to surpass a barrier called the diffraction limit, which means that the microscope cannot resolve two objects that are closer together than a certain distance (typically 200–300 nanometers).
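For context, the figure quoted above is consistent with the standard Abbe estimate of the diffraction limit (a textbook formula, not taken from the study itself):

\[ d = \frac{\lambda}{2\,\mathrm{NA}} \]

where \( \lambda \) is the wavelength of light and \( \mathrm{NA} \) is the numerical aperture of the objective. For visible light (\( \lambda \approx \) 500–600 nm) and typical high-quality objectives (\( \mathrm{NA} \approx \) 1.0–1.4), this works out to roughly 180–300 nanometers, which is why conventional optical microscopes cannot separate features much finer than that.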
Their efforts have largely focused on either modifying the molecules being imaged, or developing better illumination strategies—some of which led to the 2014 Nobel Prize in Chemistry. “But very few have actually tried to use the detector itself to try and surpass this detection limit,” says Deepak Nair, Associate Professor at the Centre for Neuroscience (CNS), IISc, and corresponding author of the study.