MOUSE: Multimodal Multitask Learning for Reliable Robot Situational Awareness in Adverse Environmental Conditions

PI: Eren Erdal Aksoy, Lund University
co-PI: Gustaf Hendeby, Linköping University

We propose MOUSE, a novel adaptive multimodal fusion framework that learns the complementary strengths of each sensor and dynamically selects the most suitable sensor configuration as environmental conditions change. Unlike existing methods, MOUSE jointly learns multiple downstream perception tasks within a unified framework to enhance situational awareness. Furthermore, MOUSE leverages the learned semantic information to improve measurement association and object classification, enabling more reliable tracking, localization, and navigation in challenging and dynamic environmental conditions.
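The abstract does not specify MOUSE's architecture; purely as an illustration of the general idea, the sketch below shows one common way such adaptive fusion can be realized: each sensor's feature vector is weighted by a per-sensor reliability score (softmax-normalized), and the fused representation is shared by several task heads. All names, the gating scheme, and the linear heads are hypothetical placeholders, not the proposed method.

```python
import numpy as np


def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()


class GatedFusion:
    """Toy sketch (not MOUSE itself): reliability-gated multimodal fusion
    with multiple linear task heads sharing the fused representation."""

    def __init__(self, feat_dim, n_tasks, seed=0):
        rng = np.random.default_rng(seed)
        # One placeholder linear head per downstream perception task.
        self.heads = [rng.standard_normal((feat_dim,)) for _ in range(n_tasks)]

    def fuse(self, features, reliability):
        """features: {sensor_name: feature vector};
        reliability: {sensor_name: scalar score, e.g. lower for a fogged camera}.
        Returns the fused feature and the per-sensor fusion weights."""
        names = list(features)
        weights = softmax(np.array([reliability[n] for n in names]))
        fused = sum(w * features[n] for w, n in zip(weights, names))
        return fused, dict(zip(names, weights))

    def predict(self, fused):
        """Run every task head on the shared fused feature."""
        return [float(h @ fused) for h in self.heads]
```

For example, if fog degrades the camera, a low camera reliability score shifts the fusion weight toward lidar or radar, while all task heads still consume a single shared representation, which is the essence of multitask fusion.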

Project number: G2