Giving our devices sight has enabled a range of applications in self-driving cars, object detection, and crop monitoring. But unlike animals, artificial vision systems cannot simply evolve to suit their natural environments. The need for dynamic visual systems that can navigate both on land and in water led researchers from MIT, the Gwangju Institute of Science and Technology (GIST), and Seoul National University in Korea to develop a new artificial vision system that closely replicates the vision of the fiddler crab, which can handle both terrains.
The semi-terrestrial species, affectionately known as the calling crab because it seems to beckon with its massive claw, has amphibious imaging ability and an extremely wide field of view, whereas current artificial systems remain limited to a hemispherical view. The new artificial eye, which resembles a small, largely nondescript black ball, makes sense of its input through a combination of materials that process and focus light. The scientists combined an array of flat microlenses with a graded refractive index profile and a flexible photodiode array with comb-shaped patterns, all wrapped onto a three-dimensional spherical structure. This configuration means that light rays from multiple sources always converge at the same spot on the image sensor, regardless of the refractive index of the surroundings.
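The value of a flat lens surface with an internal graded index can be seen in a back-of-the-envelope calculation. The sketch below is not from the paper; the refractive indices and radii are illustrative assumptions. It uses the standard thin-lens lensmaker's equation to show why a conventional curved lens shifts its focus between air and water:

```python
# Illustrative sketch (not from the paper): the thin-lens lensmaker's
# equation shows why a conventional curved lens defocuses underwater.
# Focal length depends on the ratio of the lens index to the surrounding
# index, so moving from air (n ~ 1.00) to water (n ~ 1.33) shifts the
# focus. All numbers below are hypothetical example values.

def focal_length(n_lens, n_medium, r1, r2):
    """Thin-lens focal length (m) for surface radii r1, r2 (m)."""
    power = (n_lens / n_medium - 1.0) * (1.0 / r1 - 1.0 / r2)
    return 1.0 / power

n_lens = 1.5          # typical glass/polymer index (assumed)
r1, r2 = 0.01, -0.01  # biconvex lens with 10 mm radii (assumed)

f_air = focal_length(n_lens, 1.00, r1, r2)
f_water = focal_length(n_lens, 1.33, r1, r2)

print(f"focal length in air:   {f_air * 100:.1f} cm")   # 1.0 cm
print(f"focal length in water: {f_water * 100:.1f} cm")  # ~3.9 cm
```

The focus shifts by several times underwater. A flat outer surface with a graded internal refractive index, as in the crab-inspired design, does the bending of light inside the lens itself, so the external medium matters far less.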
A research paper on this system, co-authored by Fredo Durand, a professor of electrical engineering and computer science at MIT and an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL), along with 15 others, appeared in the July issue of Nature Electronics.
The amphibious and panoramic imaging capabilities were tested in experiments in air and in water by imaging five objects at different distances and from different directions; the system delivered consistent image quality and a nearly 360-degree field of view in both terrestrial and aquatic environments. In other words, it can see both underwater and on land, whereas previous systems were limited to a single hemisphere.
There is more to crabs than meets the eye. Behind their protruding stalk eyes lies a powerful, one-of-a-kind vision system that evolved from living both underwater and on land. The creatures' flat corneas, together with a graded refractive index, counteract the defocusing effects that arise when the external environment changes, a major limitation for other compound eyes. Crabs also enjoy a three-dimensional, omnidirectional field of view thanks to the ellipsoidal shape of their stalked eyes. They evolved to see almost everything at once to avoid attacks on wide-open tidal flats, and to communicate and interact with their mates.
Biomimetic cameras are certainly nothing new. In 2013, a wide-field-of-view (FoV) camera mimicking the compound eye of an insect was reported in Nature, and in 2020 a camera with a wide, fish-eye FoV appeared. Although such cameras can capture large areas simultaneously, it is structurally difficult for them to exceed 180 degrees. More recently, commercial 360-degree cameras have come to market, but these can be bulky, since they must combine images from two or more cameras, and extending the field of view requires an optical system with a complex configuration, which causes image distortion. It is also difficult for them to maintain focus when the surrounding environment changes, such as between air and underwater; hence the impulse to look to the fiddler crab.
The crab proved to be a worthy inspiration. During testing, five objects (a dolphin, a plane, a submarine, a fish, and a ship) were imaged by the artificial vision system at different distances and from different angles. The team also conducted multi-laser-spot imaging experiments, comparing the captured images against simulations. To test the amphibious capability, they immersed the device halfway in water in a container.
A logical extension of the work includes exploring biologically inspired light-adaptation schemes in pursuit of higher resolution and superior image-processing techniques.
"Potential uses range from population monitoring to environmental monitoring," says John A. Rogers, the Louis Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering and Neurosurgery at Northwestern University, who was not involved in the work.
This research was supported by the Institute for Basic Science, the National Research Foundation of Korea, and a GIST-MIT Research Collaboration Grant funded by GIST in 2022.