Jeffrey Krichmar, Ph.D.
November 5 @ 11:00 a.m. - 12:00 p.m.
Free
Join the Center for the Neurobiology of Learning and Memory (CNLM) for a hybrid event featuring Dr. Jeffrey Krichmar, Professor of Cognitive Sciences at the University of California, Irvine.
This event will be held in-person in the Herklotz Conference Center and virtually via Zoom.
Biologically inspired robot navigation
We take inspiration from recent neurophysiological findings to create a flexible navigation system for mobile robots. In the first part of my talk, I will present a neuromorphic path planning algorithm inspired by place cell behavior and experience-dependent plasticity. Our navigation system uses a spiking neural network wavefront planner with e-prop learning to concurrently map and plan paths in large, complex environments. We incorporate a novel mapping method which, when combined with the spiking wavefront planner, allows for adaptive planning by selectively combining costs. Learning is continuous and does not require retraining when the environment changes. The system is tested on a mobile robot platform in an outdoor environment with obstacles and varying terrain. On real and simulated paths, our system outperforms state-of-the-art robot path planners. The spiking wavefront planner is compatible with neuromorphic hardware and could be used for applications requiring low size, weight, and power (SWaP).

In the second part of my talk, I will discuss how we seamlessly move between global perspectives and first-person perspectives, and why this is important for navigation, memory formation, and other cognitive tasks. To understand how a neural system might carry out these computations, we used variational autoencoders (VAEs) to reconstruct first-person perspectives from global map perspectives, and vice versa. Many latent variables in our model responded similarly to neurons seen in recordings, including place cells, head direction tuning, and encoding of distance to objects. These results could advance our understanding of how brain regions support viewpoint linkages and transformations. Currently, we are combining these two modeling approaches into a unified biologically inspired navigation system that can handle dynamic environments.
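To make the first part of the abstract concrete, below is a minimal sketch of the general wavefront-planning idea on a 2-D cost grid: a wave of spikes spreads outward from the goal, each cell's spike delay encodes its traversal cost, and the path is read out by following the earliest spike times back from the start. This is only an illustrative assumption of how such a planner can work, not Dr. Krichmar's actual spiking wavefront or e-prop implementation; the grid, costs, and function names are hypothetical.

```python
# Illustrative sketch of a wavefront path planner on a 2-D cost grid.
# Spike-timing interpretation: each cell is a neuron whose axonal delay
# is proportional to its traversal cost; the first-spike time of a cell
# marks when the wavefront from the goal reaches it.
import heapq
import numpy as np

def spiking_wavefront(cost, start, goal):
    """Return a start->goal path as a list of (row, col) cells."""
    rows, cols = cost.shape
    spike_time = np.full((rows, cols), np.inf)   # first-spike time per cell
    spike_time[goal] = 0.0
    frontier = [(0.0, goal)]                     # event queue of pending spikes

    while frontier:
        t, (r, c) = heapq.heappop(frontier)
        if t > spike_time[r, c]:
            continue                             # this cell already spiked earlier
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                arrival = t + cost[nr, nc]       # delay ~ traversal cost
                if arrival < spike_time[nr, nc]:
                    spike_time[nr, nc] = arrival
                    heapq.heappush(frontier, (arrival, (nr, nc)))

    # Read out the path: from the start, repeatedly step to the neighbour
    # that spiked earliest, i.e. descend the wavefront toward the goal.
    path, cell = [start], start
    while cell != goal:
        r, c = cell
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < rows and 0 <= c + dc < cols]
        cell = min(neighbours, key=lambda rc: spike_time[rc])
        path.append(cell)
    return path

# Toy example: a 5x5 grid where part of the centre column is costly terrain.
grid = np.ones((5, 5))
grid[:4, 2] = 10.0
print(spiking_wavefront(grid, start=(0, 0), goal=(4, 4)))
```

In the talk's system, the per-cell costs would be learned and updated online from experience rather than fixed in advance, which is what allows planning to adapt without retraining.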
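For the second part, the sketch below shows the general shape of a variational autoencoder that maps one viewpoint (e.g. a flattened global-map patch) to another (e.g. a first-person view). The layer sizes, loss weighting, and class name are illustrative assumptions, not the speaker's architecture; it only demonstrates the standard VAE machinery (encoder, reparameterization, decoder, KL-regularized loss) referenced in the abstract. Requires PyTorch.

```python
# Hypothetical viewpoint-transformation VAE: global-map input -> latent
# code -> reconstructed first-person view.  All dimensions are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ViewpointVAE(nn.Module):
    def __init__(self, in_dim=1024, out_dim=1024, latent_dim=32, hidden=256):
        super().__init__()
        # Encoder: global-map view -> parameters of a latent Gaussian.
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)
        # Decoder: latent code -> reconstructed first-person view.
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, out_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterisation trick: sample latents while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def vae_loss(recon, target, mu, logvar, beta=1.0):
    # Reconstruction error plus KL divergence to a unit Gaussian prior.
    recon_err = F.mse_loss(recon, target, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_err + beta * kl

# Toy usage with random "map" inputs and "first-person" targets.
model = ViewpointVAE()
maps = torch.rand(8, 1024)     # batch of flattened global-map views
views = torch.rand(8, 1024)    # corresponding first-person views
recon, mu, logvar = model(maps)
print(vae_loss(recon, views, mu, logvar).item())
```

In the work described in the talk, it is the latent variables of such a model that were compared with neural recordings, with some latents behaving like place cells, head-direction cells, or distance-to-object codes.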