J. de Curtò i DíAz

March 12, 2021

"End-to-end Real-time Perception in Lunar Rovers"

Hey everybody,

Following up on an earlier post (https://world.hey.com/decurtoidiaz/open-dataset-with-astounding-images-from-the-moon-9fcabd1b), I'm sharing a publication we recently released that makes good use of the CE4 dataset introduced there.

Here are some preliminary results from the segmentation of the lander, which I think are very cool.

vc_de_Curto_i_DiAz.pdf (1.95 MB)


Vulcan Centaur: towards end-to-end real-time perception in lunar rovers.
De Curtò and Duvall. 2020.
https://arxiv.org/pdf/2011.15104

The abstract goes like this; again, if you find the publication useful, please *do* cite it:

We introduce a new real-time pipeline for Simultaneous Localization and Mapping (SLAM) and Visual Inertial Odometry (VIO) in the context of planetary rovers. We leverage prior information about the location of the lander to propose an object-level SLAM approach that optimizes the pose and shape of the lander together with the camera trajectories of the rover. As a further refinement step, we propose interpolation between adjacent temporal samples; that is, synthesizing non-existing images to improve the overall accuracy of the system. The experiments are conducted in the context of the Iris Lunar Rover, a nano-rover to be deployed on lunar terrain in 2021 as Carnegie Mellon's flagship and the first unmanned American rover on the Moon.
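
If you're curious what the object-level idea looks like in code, here is a minimal sketch (my own illustration, not the paper's implementation): the lander is reduced to a circle whose center (pose) and radius (shape) are optimized jointly with the rover's track in a single least-squares problem over odometry and range factors. All measurement values are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy object-level SLAM (illustration only): the rover drives along a
# straight track, the lander is a circle, and we jointly optimize the
# rover's poses with the lander's center (pose) and radius (shape).
T = 5
odom = np.full(T - 1, 1.0)                         # ~1 m forward per step
ranges = np.array([6.21, 5.40, 4.66, 4.00, 3.47])  # distance to lander surface

def residuals(x):
    px = x[:T]         # rover x-positions along the track
    cx, cy, r = x[T:]  # lander center and radius
    res = list((px[1:] - px[:-1]) - odom)          # odometry factors
    pred = np.hypot(cx - px, cy) - r               # predicted surface ranges
    res += list(pred - ranges)                     # observation factors
    res.append(px[0])                              # prior: anchor first pose at 0
    return np.array(res)

x0 = np.concatenate([np.arange(T, dtype=float), [5.0, 3.0, 0.5]])
sol = least_squares(residuals, x0)
print("rover track:", np.round(sol.x[:T], 2))
print("lander center and radius:", np.round(sol.x[T:], 2))
```

The real system of course deals with full 6-DoF poses, an actual shape model of the lander, and visual-inertial measurements; the point here is just the structure of the joint optimization.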

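And for the interpolation step, a rough sketch of the simplest flavor (again an illustration, not the method from the paper): estimate dense optical flow between two adjacent frames and pull pixels halfway along it to synthesize the in-between image. Filenames are placeholders; a real interpolator would also handle occlusions.

```python
import cv2
import numpy as np

# Toy frame interpolation: synthesize an in-between image from two
# adjacent temporal samples (illustration only).
f0 = cv2.imread("frame0.png")   # placeholder filenames
f1 = cv2.imread("frame1.png")
g0 = cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY)
g1 = cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY)

# Dense optical flow from frame 0 to frame 1 (Farneback).
flow = cv2.calcOpticalFlowFarneback(g0, g1, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Backward-warp frame 0 halfway along the flow: the synthesized pixel at
# (x, y) is pulled from roughly (x - 0.5*u, y - 0.5*v) in frame 0.
h, w = g0.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
mid = cv2.remap(f0, map_x, map_y, cv2.INTER_LINEAR)
cv2.imwrite("frame_mid.png", mid)
```
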
Best regards,
De Curtò i DíAz.