During mating, both the appearance and the scent of a potential partner matter to Heliconius butterflies. Despite its small size, the butterfly brain must process both types of sensory input simultaneously. In multisensory decision-making, these butterflies outperform current AI technologies, which expend significant energy to achieve similar results.
To fill this gap, a team of Penn State researchers has developed a more advanced and energy-efficient multisensory AI platform. This breakthrough could be a game-changer for robotics and smart sensors that detect potential hazards such as structural defects and chemical leaks.
“If you think about the AI that we have today, we have very good image processors that are visually based, and very good language processors that use audio,” said Saptarshi Das, associate professor of engineering science and mechanics and corresponding author of the study. “But if you think about most animals, and humans too, decision-making is based on multiple senses. AI performs very well with a single sensory input, but it cannot yet make decisions the way animals do, by drawing on multiple senses at once.”
Heliconius butterflies have a distinctive way of choosing a mate: they decide whether another butterfly is a suitable partner by combining visual cues with the chemical cues of the pheromones it emits. Even more remarkable, they do this with tiny brains that use minimal energy.
That stands in sharp contrast to modern computing, which consumes large amounts of energy to do the same thing: these tiny creatures use multiple sensory inputs simultaneously to perform complex computational tasks.
The scientists found a way to mimic this butterfly behavior electronically using 2D materials. Their hardware platform is composed of two such materials: molybdenum disulfide (MoS2) and graphene. The MoS2 element is a memtransistor, a device that combines memory and information processing, and was chosen for its light-sensing ability, which mimics the visual capabilities of butterflies. The graphene element, in turn, acts as a chemitransistor that detects chemical molecules, mimicking the butterfly brain’s pheromone detection.
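Conceptually, the platform amounts to two sensing elements whose output currents combine at a shared node. The toy model below illustrates that idea only; the response curves, function names, and constants are invented for illustration and are not taken from the study:

```python
# Toy model of the dual-sensor platform: a photoresponsive element
# (standing in for the MoS2 memtransistor) and a chemiresponsive
# element (standing in for the graphene chemitransistor).
# All curves and constants here are illustrative, not measured values.

def mos2_photocurrent(light_intensity: float) -> float:
    """Photocurrent grows with light intensity and saturates (toy curve)."""
    return light_intensity / (1.0 + light_intensity)

def graphene_chemicurrent(concentration: float) -> float:
    """Chemitransistor current shifts with analyte concentration (toy curve)."""
    return 0.5 * concentration / (0.2 + concentration)

def combined_output(light: float, conc: float) -> float:
    """The two currents sum at a shared node, integrating both cues."""
    return mos2_photocurrent(light) + graphene_chemicurrent(conc)

print(combined_output(0.0, 0.0))  # no stimulus -> 0.0
print(combined_output(1.0, 1.0))  # both cues present -> larger combined output
```

The point of the sketch is only that a single output reflects both stimuli at once, which is what lets one device stand in for two separate sensor modules.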
“Visual cues and the chemical cues of pheromones determine whether a female butterfly will mate with a male,” said co-author Subir Ghosh, a second-year doctoral student in engineering science and mechanics. “That made us think about how we could give 2D materials the same functionality. By combining photoresponsive MoS2 with chemically active graphene, we saw the potential to create a visuo-chemical integration platform for neuromorphic computing.”
In their experiments, the researchers exposed the dual-material sensor to different colors of light while applying solutions of varying chemical composition, mimicking the pheromones butterflies emit. The goal was to test how well the sensor could integrate information from the photodetector and the chemical sensor, just as mating success in butterflies depends on matching wing color and pheromone strength.
Based on the device’s output response, the researchers concluded that it can seamlessly integrate visual and chemical cues, showing strong potential for sensors that process and interpret different types of information simultaneously.
“We also introduced adaptability into the sensor’s circuitry, so that one cue can play a more important role than the other,” said Yikai Zheng, a fourth-year doctoral student in engineering science and mechanics and co-author of the study. “This adaptability is similar to how female butterflies adjust their mating behavior in response to different scenarios in the wild.”
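That adaptability can be pictured as tunable weights applied to the two cues before a threshold decision. In the actual hardware the tuning happens in circuitry; the sketch below reduces it to two hypothetical weight parameters and a threshold, all of which are invented for illustration:

```python
def mate_decision(visual_cue: float, chemical_cue: float,
                  w_visual: float = 0.5, w_chemical: float = 0.5,
                  threshold: float = 0.6) -> bool:
    """Weighted integration of two normalized cues (0..1) with a threshold.
    The weights are hypothetical knobs standing in for circuit-level tuning."""
    score = w_visual * visual_cue + w_chemical * chemical_cue
    return score >= threshold

# Balanced weighting: a strong visual cue alone does not cross the threshold.
print(mate_decision(0.9, 0.1))  # False (score 0.50)
# Shift emphasis toward vision: the same cues now trigger a positive decision.
print(mate_decision(0.9, 0.1, w_visual=0.8, w_chemical=0.2))  # True (score 0.74)
```

Re-weighting the same inputs changes the outcome, which is the behavioral flexibility the quote describes.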
According to the researchers, dual sensing in a single device is more energy efficient than the way current AI systems operate. Instead of collecting data from separate sensor modules and shuttling it to a processing module, the new approach integrates the two senses in one device, which both reduces latency and minimizes excess energy consumption.
Next, the researchers plan to expand the device to integrate three senses, similar to how crayfish use visual, tactile, and chemical cues to sense prey and predators. The ultimate goal is hardware AI that can handle complex decision-making scenarios in diverse environments.
“Sensor systems could be installed in places such as power plants to detect potential problems, like leaks or system failures, based on multiple sensory cues,” Ghosh said. “A chemical odor, a change in vibration, the visual sign of a weak point: diagnosing such problems relies on multiple senses, not just one, and a system like this would help staff decide quickly what needs to be fixed.”
Reference:
- Yikai Zheng, Subir Ghosh, Saptarshi Das. A butterfly-inspired multisensory neuromorphic platform for integrating visual and chemical cues. Advanced Materials, 2023. DOI: 10.1002/adma.202307380