Researchers have developed a set of interchangeable lens adapters that, mounted in front of a single camera, let it capture images ranging from the macroscopic scale (think landscapes) all the way down to the microscopic (think cells and bacteria), spanning at least six orders of magnitude.
The new prototype, called the Astrobiological Imager, is described in the journal Astrobiology.
“For each scale, there is of course one or even several imagers that are superior to our instrument for that particular scale,” says Wolfgang Fink, associate professor in the department of electrical and computer engineering at the University of Arizona. “However, there is no instrument out there that can go across several orders of magnitude.
“Think of the world’s best decathlete as opposed to the world record holders in each individual discipline. That’s the best analogy. Our camera is the best decathlete.”
For example, HiRISE, the High Resolution Imaging Science Experiment instrument aboard NASA’s Mars Reconnaissance Orbiter, has imaged the Red Planet in unprecedented detail. But as a space-borne instrument, it can only resolve features about the size of a kitchen table and is not capable of microscopic imaging. If the table were set with plates or anything smaller, HiRISE wouldn’t know.
The Astrobiological Imager, on the other hand, could image the table from far away, then move closer to take detailed shots of the dinnerware, and finally zoom in to take high-resolution pictures of a single salt crystal left on one of the plates.
Like a Field Biologist On Earth
For the prototype, Fink and his team modified an $85 point-and-shoot camera with parts adding up to less than $100. Mounted on the camera lens is an adapter ring with a special lens that shortens the camera’s minimum focusing distance, so the camera can be placed directly on an object and still use its built-in autofocus.
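The effect of such a close-up adapter can be estimated with the standard thin-lens approximation: a supplementary lens of power D diopters brings the nearest focusable point much closer. A quick sketch, using illustrative numbers rather than the prototype's actual specifications:

```python
# How a close-up ("diopter") adapter lens shortens the minimum focus
# distance. All numbers are illustrative assumptions, not the
# prototype's actual specs.

def new_min_focus_m(original_min_focus_m: float, closeup_diopters: float) -> float:
    """Thin-lens approximation: focusing powers (reciprocal distances) add."""
    return 1.0 / (1.0 / original_min_focus_m + closeup_diopters)

# A point-and-shoot that normally focuses no closer than 5 cm,
# fitted with a hypothetical +20 diopter close-up lens:
closest = new_min_focus_m(0.05, 20)
print(f"new minimum focus distance: {closest * 100:.1f} cm")  # 2.5 cm
```

Halving the working distance roughly doubles the magnification, which is why stacking stronger adapter lenses pushes the camera toward the microscopic regime.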
“With the newest generation of digital cameras and their better lenses, you can get down to the limit of what is optically resolvable,” Fink says. “In the time since the prototype was assembled, imaging sensors have become smaller and have more densely packed pixels. With a 20-megapixel camera modified in this way, we could get down to a few hundred nanometers. In other words, the optical limit of a light microscope.”
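Fink's "few hundred nanometers" figure is consistent with the diffraction (Abbe) limit of visible-light optics and with the pixel pitch of a 20-megapixel sensor. A back-of-the-envelope check, using illustrative numbers not taken from the paper:

```python
# Back-of-the-envelope check of the resolution limits mentioned above.
# All numbers are illustrative assumptions, not specs from the paper.

wavelength_nm = 550          # mid-visible (green) light
numerical_aperture = 0.9     # a high-quality dry objective (assumed)

# Abbe diffraction limit: smallest resolvable feature of a light
# microscope, d = lambda / (2 * NA)
abbe_limit_nm = wavelength_nm / (2 * numerical_aperture)
print(f"diffraction limit: {abbe_limit_nm:.0f} nm")  # ~306 nm

# Pixel-limited sampling: a ~20-megapixel sensor (assumed 5472 pixels
# wide) imaging a 1 mm field of view; Nyquist sampling needs about
# 2 pixels per resolved feature.
pixels_across = 5472
field_of_view_nm = 1e6       # 1 mm expressed in nanometers
pixel_pitch_nm = field_of_view_nm / pixels_across
nyquist_nm = 2 * pixel_pitch_nm
print(f"pixel-limited resolution: {nyquist_nm:.0f} nm")  # ~366 nm
```

Both limits land at a few hundred nanometers, matching the quoted claim that such a camera approaches the resolving power of a light microscope.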
The idea is to equip a robotic rover exploring another planet with the imaging capabilities of a field biologist on Earth: a pair of eyes, binoculars, a hand lens, a dissecting microscope, and a light microscope.
“The idea is contextual imaging to subsequently zoom in on areas of interest in a nested fashion, until you hit the sweet spot, which you want to image microscopically. For example, to find microbial communities in rock formations.
“Mounted on a rover, our camera would be equipped with a rotating turret containing different adapter lenses. From an astrobiological point of view, you need the context first, so we’d use it in wide-angle mode to look around in search for promising targets, then drive to, say, a rock pile, image individual rocks, then go close to image patches potentially containing life, and then zoom in to produce a microscopic image of anything that might be living on or beneath that rock surface.”
Living Under Rocks
In this fashion, Fink and his team tested their Astrobiological Imager in the Mojave Desert, using it to photograph sandstone outcroppings and scan them for promising patches indicating microbe colonies on the rocks. Moving in closer, they imaged the growth in detail, revealing the close relationship between sand grains and biomass. The team was able to microscopically image a microbial colony living beneath a rock surface.
Equipped with a device that blocks stray light, the imager could use built-in LEDs emitting well-defined light and analyze the reflected light, which would allow researchers to perform a spectral analysis of the sample and get an idea of its chemical composition.
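In practice, this reflected-light idea amounts to estimating a coarse reflectance spectrum: measure the sample's brightness under each LED, normalize by a white calibration target, and compare band ratios. A minimal sketch, in which the LED wavelengths and all intensity values are hypothetical:

```python
# Sketch of coarse reflectance estimation from LED-illuminated images.
# LED wavelengths and all intensity values are hypothetical examples.

# Mean image brightness of the sample under each LED (counts, assumed)
sample_counts = {470: 52.0, 530: 180.0, 625: 95.0}      # nm -> counts
# Same measurement of a white calibration target (assumed)
reference_counts = {470: 200.0, 530: 210.0, 625: 190.0}

# Reflectance = sample signal / white-reference signal, per band
reflectance = {
    wl: sample_counts[wl] / reference_counts[wl]
    for wl in sample_counts
}
for wl, r in sorted(reflectance.items()):
    print(f"{wl} nm: reflectance {r:.2f}")

# A simple band ratio: strong green reflectance relative to blue
# could flag chlorophyll-bearing (photosynthetic) patches.
green_blue_ratio = reflectance[530] / reflectance[470]
print(f"green/blue ratio: {green_blue_ratio:.2f}")
```

Such band ratios are only a rough proxy for composition; a real analysis would need more LED wavelengths and careful calibration, but the principle is the same.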
Fink says he is convinced there will be more multipurpose instruments like the Astrobiological Imager in upcoming space missions. The underlying technology of the adapter-based imaging capability is patented.
“In principle, our imager could be used on a mission like the OSIRIS-REx asteroid sample return mission, which is also led by the UA, but too far along obviously,” he says. “NASA is going toward multiuse instruments wherever possible, and they have to work more in tandem with each other. Our prototype fulfills those requirements.”
Researchers from Washington State University, the Desert Research Institute, Quaternary Surveys, and the Planetary Science Institute contributed to the study.
Source: University of Arizona. Posted by Daniel Stolte-Arizona.