A group of scientists at Johns Hopkins has proposed a new study to understand the similarities—if any exist—between brain cells and robots in an autonomous swarm.
“This is a very exciting new project from the perspective of a theorist and computational neuroscientist,” said Kechen Zhang, associate professor at the School of Medicine and the study’s principal investigator. “We propose that individual robots in a group can be thought of as neurons in an animal’s brain. They interact with one another to form dynamic patterns that collectively signal locations in space and time, much in the same way brain rhythms do.”
The scientists, from the Johns Hopkins University School of Medicine and Applied Physics Laboratory, will combine research on navigational planning in the brain with research on autonomous robotic swarms to drive advances in both fields. The study will use new findings about how the brain allows an animal to navigate and change its route while moving—called dynamic replanning—to improve swarming algorithms to the point that groups of robots automatically adapt to changes in the environment, much as a rat knows which detour to take around an unexpected obstacle.
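The core idea of dynamic replanning can be illustrated with a deliberately simple sketch: a planner computes a route, an unexpected obstacle appears, and the route is recomputed from the robot's current position. The grid world, the A* planner, and the coordinates below are illustrative assumptions, not the team's brain-inspired swarming algorithms.

```python
# A minimal sketch of dynamic replanning: plan a route, discover an obstacle,
# then replan from the robot's current cell. The grid, the A* planner, and the
# coordinates are assumptions for illustration only.
import heapq

def astar(grid, start, goal):
    """Shortest path on a 4-connected grid; cells marked 1 are blocked."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost[current] + 1
                if nxt not in cost or new_cost < cost[nxt]:
                    cost[nxt] = new_cost
                    # Manhattan-distance heuristic keeps the search admissible.
                    priority = new_cost + abs(goal[0] - nr) + abs(goal[1] - nc)
                    heapq.heappush(frontier, (priority, nxt))
                    came_from[nxt] = current
    return None

grid = [[0] * 5 for _ in range(5)]          # open 5x5 arena
path = astar(grid, (0, 0), (4, 4))
print("initial route:", path)

grid[2][2] = 1                              # an obstacle is discovered mid-route
current = path[2]                           # the robot is partway along the route
print("replanned route:", astar(grid, current, (4, 4)))
```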
In turn, the neuroscientists will examine the replanning behaviors of drone swarms to evaluate their models of how the rodent brain dynamically replans paths, opening new avenues for neuroscience research.
Zhang will work with a multidisciplinary team of co-principal investigators from APL, including applied mathematician Kevin Schultz, neuroscientist Grace Hwang, robotics researcher Robert Chalmers, and STEM program manager Dwight Carr. Joseph Monaco, a postdoctoral biomedical engineering fellow at the School of Medicine, is a theoretical modeler on the study. The project was one of 18 funded in September by the National Science Foundation to conduct innovative research on neural and cognitive systems.
The study had its genesis in November 2017, at a poster session at the Society for Neuroscience conference in Washington, D.C., when APL’s Hwang came upon a poster by Zhang and Monaco describing their work with a type of neuron they had discovered and named the “phaser cell.” Found outside the hippocampus in a deep, centrally located region of the brain, these neurons are critical to how the brain maintains its sense of location during spatial navigation, Zhang and Monaco theorized. An animal’s brain uses the timing of these neurons’ spikes relative to an ongoing brain rhythm (a quantity called phase) to determine its specific location in space. Hwang, a neuroscientist with an interest in brain-inspired robotics, noted that unlike other neurons used in spatial perception, phaser cells create a phase map of the environment that is absolute (oriented to fixed directions such as north and south), not relative (e.g., “on my left”).
“That was the big ‘a-ha’ moment for me,” said Hwang, “because both the phaser cells and some robotic control algorithms use phase to determine location. For the first time, I saw a way to create a cognitive map of the entire environment using phase.”
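To make the phase-map idea concrete, the sketch below encodes a 2D position as a pair of firing phases measured against a shared background rhythm, one phase per world-fixed axis. The linear phase-position relationship, the arena size, and the gain are simplifying assumptions for illustration, not the published phaser-cell model.

```python
# A rough sketch of an absolute, phase-coded map: two idealized "phaser"
# populations shift their firing phase (relative to a shared background rhythm)
# linearly with position along fixed east-west and north-south axes. The linear
# coding, arena size, and gain are assumptions, not the published model.
import math

ARENA = 1.0                        # arena width in meters (assumed)
GAIN = 2 * math.pi / ARENA         # radians of phase shift per meter (assumed)

def encode(x, y):
    """Map an (x, y) position to two firing phases in [0, 2*pi)."""
    return (GAIN * x) % (2 * math.pi), (GAIN * y) % (2 * math.pi)

def decode(phase_x, phase_y):
    """Recover (x, y) from the phases; the result does not depend on heading."""
    return phase_x / GAIN, phase_y / GAIN

phases = encode(0.37, 0.62)
print("phases (rad):", phases)
print("decoded position (m):", decode(*phases))
```

Because the phases are referenced to world-fixed axes rather than to the robot's heading, two robots at the same spot read out the same phases no matter which way they face, which is what makes the map absolute rather than relative.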
https://www.youtube.com/watch?v=wtN3YmoGfDk
Soon after that meeting, Hwang—who works in APL’s Intelligent Systems Center—was awarded APL internal research and development funding for an idea to develop brain-inspired small robots for GPS-denied areas. That proposal grew to include applied mathematicians Clare Lau and Kevin Schultz, and robotics researchers Robert Chalmers and Bryanna Yeh.
“We have neuroscientists here in the Intelligent Systems Center who collaborate with artificial intelligence researchers and roboticists,” said Chalmers. “And APL has an amazing hardware history in swarming vehicles. This is exactly the kind of cross-discipline collaboration the center is designed to encourage.”
Added Zhang: “I have been at Hopkins for over 15 years, and this is my first collaboration with APL at the intersection of neuroscience and robotics. I look forward to a fruitful collaboration at the interplay of neuroscience, engineering, and robotics that will find brain-inspired solutions to controlling distributed groups of robotic agents.”
In early 2018, the project was awarded expanded R&D funding; encouraging simulations produced by Hwang and Schultz in that effort led Zhang and the APL team to submit their proposal, “Spatial Intelligence for Swarms Based on Hippocampal Dynamics,” for an NSF grant. The key realization behind the NSF proposal was the need to include another emergent hippocampal phenomenon, called sharp waves, which has been hypothesized to contribute to navigational planning in mammals. (A new, related project led by Hwang has since received further R&D funding from APL.)
The project also includes development of “Swarming Powered by Neuroscience,” a 16-hour micro-seminar for high school students that will be designed and operated by APL’s STEM Office. Robot swarms will be used to teach students about neuroscience, and the content will eventually be transformed into a course at APL’s STEM Academy.
The team envisions that this new approach to navigation will enable the kinds of tasks that society will increasingly ask robots to perform—disaster relief and search and rescue, in addition to research and defense applications. These tasks require more intelligent spatial coordination among many robots spread over large geographic areas. The team hopes that its research will yield a revolutionary algorithmic framework for autonomous swarming behaviors, as well as inform theoretical advances in understanding the brain.