Environmental Situational Awareness using Neuromorphic Vision Sensors and IMU-based SLAM

Supervisors:

Primary supervisor: Dr Saeed Afshar

Description:

Human eyes perform several rapid, jerk-like movements of the eyeball, called saccades, every second. Each saccade generates a large shift in the visual field, yet humans and many other animals reduce their visual perception’s sensitivity to these sudden displacements of visual stimuli through a process called Saccadic Suppression of Displacement (SSD). Biological visual perception therefore remains stable even during ego-motion of the body and head. The working principle of biological SSD is still an open question, but many modern signal processing approaches have been applied to perform Simultaneous Localisation and Mapping (SLAM) using conventional vision sensors. Visual SLAM methods can efficiently estimate the ego-motion of the vision sensor and localise it in a given environment, which allows a stable projection of the visual stimuli onto a 3D representation of the surroundings. Neuromorphic vision sensors capture visual stimuli with higher temporal precision and dynamic range than conventional cameras, and visual SLAM with these sensors is an active area of research that has been successfully applied to localisation and mapping for robots. In this project, we plan to combine visual SLAM methods with active saccadic motions, tracked by the IMU built into the sensor, to generate a stable representation of the sensor’s surroundings and to actively monitor that environment for changes. Building on the space situational awareness methods previously developed at ICNS for star mapping, we will extend them to challenging environments with complex backgrounds.
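
As a rough illustration of the idea, the Python sketch below uses hypothetical camera intrinsics and a rotation-only motion model (illustrative assumptions, not this project's method): it integrates gyroscope readings from the IMU into a rotation estimate and warps event coordinates back into a common reference frame, so the event stream stays stable despite saccade-like ego-motion of the sensor.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    # Hypothetical pinhole intrinsics of the event camera (illustrative values).
    FX, FY, CX, CY = 200.0, 200.0, 120.0, 90.0

    def integrate_gyro(gyro_rad_s, timestamps_s):
        """Integrate gyroscope angular-velocity samples (rad/s, sensor frame)
        into one rotation, assuming piecewise-constant rates between samples."""
        rot = R.identity()
        for i in range(1, len(timestamps_s)):
            dt = timestamps_s[i] - timestamps_s[i - 1]
            rot = R.from_rotvec(np.asarray(gyro_rad_s[i]) * dt) * rot
        return rot

    def compensate_events(events_xyt, rotations):
        """Warp each event's pixel coordinates by the inverse of its estimated
        rotation, re-projecting it onto the reference image plane."""
        warped = []
        for (x, y, t), rot in zip(events_xyt, rotations):
            ray = np.array([(x - CX) / FX, (y - CY) / FY, 1.0])  # back-project pixel
            ray = rot.inv().apply(ray)                           # undo ego-rotation
            warped.append((FX * ray[0] / ray[2] + CX,
                           FY * ray[1] / ray[2] + CY, t))        # re-project
        return np.array(warped)

A full SLAM pipeline would also estimate translation and build the 3D map; the rotation-only model above only captures the saccade-compensation step.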

Figure 2: Indoor visual SLAM using conventional image sensors (Reference).

Outcomes:

This project will investigate neuromorphic vision sensors for Simultaneous Localisation and Mapping (SLAM) of the surrounding environment. It will involve developing algorithms that process data from an actively moving event-based camera equipped with motion sensors to generate a map of the surroundings and simultaneously detect changes in the environment.

  • Develop an algorithm that maps event-based data to the sensor’s location using IMU data. This involves tracking the trajectory and ego-motion of the sensor and estimating the resulting shifts in the visual receptive field.
  • Investigate event-based representations of the stimuli generated by an object that are invariant to the motion of the sensor, and detect changes in the environment by matching these representations (a simple sketch of this matching step follows the list).
  • Build a working prototype of the setup to map real-world scenarios, and investigate the effects of occlusions and changing lighting conditions in the environment.
  • Future work could involve using the generated map for tasks that depend on changes in the surrounding environment, and investigating the generation of event data at the 3D environment level.
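
A minimal sketch of the representation-matching idea, continuing the assumptions of the previous snippet (an illustrative 240×180 sensor and a simple per-pixel threshold, not the project's eventual method): the ego-motion-compensated events are accumulated into a normalised count image, which serves as a roughly motion-invariant representation, and environmental change is flagged where a new observation departs from a stored reference map.

    import numpy as np

    HEIGHT, WIDTH = 180, 240  # illustrative event-sensor resolution

    def event_count_image(warped_events_xyt):
        """Accumulate motion-compensated events into a per-pixel count image
        and normalise it, giving a representation largely invariant to the
        sensor's ego-rotation."""
        img = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
        for x, y, _t in warped_events_xyt:
            xi, yi = int(round(x)), int(round(y))
            if 0 <= xi < WIDTH and 0 <= yi < HEIGHT:
                img[yi, xi] += 1.0
        return img / max(float(img.max()), 1.0)

    def detect_changes(reference_map, observation, threshold=0.3):
        """Flag pixels whose normalised event activity differs from the stored
        reference map by more than a fixed threshold."""
        return np.abs(observation - reference_map) > threshold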

Eligibility criteria:

Experience with C++, Python, MATLAB, or equivalent languages for developing and testing the algorithms. Experience with computer vision algorithms and a strong mathematical background.