Artemis Feasibility Study

Project Title

Feasibility of Event-Based Vision Sensors for Servicing the Artemis Spacecraft

Project Timeline: June 2021 - June 2022
Researchers: André van Schaik, Gregory Cohen, Bharath Ramesh
Partners/Collaborators:

Project Synopsis

The goal of the project was to investigate the potential benefit of fusing data from an event-based vision sensor (EBVS) with data from a grayscale image sensor on the same platform during in-orbit docking and servicing missions. The technical aim was to predict the silhouette, or edge map, of the target as it rotates and moves towards the camera under non-uniform object reflectance, conditions that make edge detection with conventional sensors more difficult.

Project Details

This feasibility project investigated the potential benefit of fusing data from an Event-Based Vision Sensor (EBVS) with data from a grayscale image sensor on the same platform during in-orbit docking and servicing missions. In autonomous applications where rendezvous, docking, or servicing must be performed with non-cooperative vehicles, a neuromorphic camera, which responds to brightness changes and therefore acts like a visual differentiator, is likely to become more important. Based on the simulation studies carried out during this project, we conservatively conclude that combining an EBVS with conventional cameras offers benefits for these missions.
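As a rough illustration of the fusion idea only, and not the pipeline actually used in the study, the sketch below accumulates recent events into an edge-like activity map and blends it with gradient edges from a grayscale frame. The event tuple format (x, y, t, polarity), the decay constant tau, the weighted blend, and all function names are assumptions for illustration.

```python
import numpy as np

def events_to_edge_map(events, shape, tau=0.03, t_ref=None):
    """Accumulate recent events into a per-pixel activity map.

    events: array of rows (x, y, t, polarity); shape: (H, W).
    Events fire where brightness changes, so on a rotating, approaching
    target they concentrate along contrast edges (assumed event format).
    """
    H, W = shape
    edge = np.zeros((H, W), dtype=np.float32)
    if t_ref is None:
        t_ref = events[:, 2].max() if len(events) else 0.0
    for x, y, t, _p in events:
        # Exponentially decay older events so the map tracks the current pose.
        edge[int(y), int(x)] += np.exp(-(t_ref - t) / tau)
    return edge / (edge.max() + 1e-9)

def frame_edge_map(frame):
    """Gradient-magnitude edges from a grayscale frame (simple proxy for an edge detector)."""
    gy, gx = np.gradient(frame.astype(np.float32))
    mag = np.hypot(gx, gy)
    return mag / (mag.max() + 1e-9)

def fuse_edge_maps(event_edges, frame_edges, w_event=0.5):
    """Weighted fusion: event edges can compensate where frame contrast is poor."""
    return np.clip(w_event * event_edges + (1 - w_event) * frame_edges, 0.0, 1.0)
```

The weighted average is the simplest possible fusion rule; in practice the weight could be varied per pixel, for example favouring the event channel in regions of low frame contrast caused by non-uniform reflectance.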

Sample result of our feasibility study: silhouette prediction with the original silhouette in red and the estimated edge map in yellow. We observed that the overall silhouette can be predicted accurately, while the finer details on the spacecraft can be predicted only to within a few pixels, indicated by double edges in certain areas.
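One simple way to quantify the "within a few pixels" observation is a one-directional chamfer-style error between the estimated and reference edge maps. The metric below is an illustrative assumption, not the evaluation used in the study, and the function name is hypothetical.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def mean_edge_error_px(predicted_edges, reference_edges):
    """Mean distance (in pixels) from each predicted edge pixel to the
    nearest reference edge pixel.

    Both inputs are boolean masks of the same shape; double edges in the
    prediction show up as an error of a few pixels rather than zero.
    """
    # Distance from every pixel to the nearest reference edge pixel.
    dist_to_ref = distance_transform_edt(~reference_edges)
    return float(dist_to_ref[predicted_edges].mean())
```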