Research

Active SLAM

Simultaneous Localization and Mapping (SLAM) is a probabilistic inference technique that allows an autonomous robot to estimate its location from onboard camera and inertial sensors and to construct a map of its surroundings in real time. SLAM research has been instrumental in allowing robotics to transition from the factory floor to numerous applications in unstructured environments, such as autonomous transportation, structure inspection, mining, environmental monitoring, and many more. Traditional SLAM algorithms operate in a passive estimation setting, where data is provided to the system but its acquisition is not optimized. Autonomous systems employing SLAM see but do not look. Our central research objective is to develop a novel planning algorithm for SLAM that actively optimizes the robot’s trajectory to obtain more informative data and minimize uncertainty in its environment model.
Relevant publications: [IROS’22] [ACC’22] [IROS’21]
Relevant project page: https://github.com/ExistentialRobotics/ROAM
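To make the trajectory-selection idea concrete, below is a minimal Python sketch of information-driven planning on a toy occupancy grid: each candidate trajectory is scored by the expected reduction in map entropy over the cells it would observe, and the most informative candidate is selected. The grid, the binary sensor model, and the candidate trajectories are illustrative assumptions, not the ROAM algorithm from the publications above.

```python
# Minimal sketch of information-driven trajectory selection for active SLAM.
# Toy occupancy grid and binary sensor model; candidate trajectories are
# scored by expected entropy reduction over the cells they would observe.

import numpy as np

def cell_entropy(p):
    """Shannon entropy (bits) of an occupancy probability."""
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_info_gain(occupancy, visible_cells, p_correct=0.8):
    """Expected entropy reduction over the cells a trajectory would observe."""
    gain = 0.0
    for idx in visible_cells:
        p = occupancy[idx]
        post_entropy = 0.0
        # Average the posterior entropy over the two possible measurements.
        for z, pz in ((1, p * p_correct + (1 - p) * (1 - p_correct)),
                      (0, p * (1 - p_correct) + (1 - p) * p_correct)):
            if pz < 1e-9:
                continue
            like = p_correct if z == 1 else 1 - p_correct
            p_post = like * p / pz          # Bayes update for "occupied"
            post_entropy += pz * cell_entropy(p_post)
        gain += cell_entropy(p) - post_entropy
    return gain

# Toy example: a 1-D "map" of 10 cells whose right half is still uncertain.
occupancy = np.full(10, 0.5)
occupancy[:3] = 0.05                      # already known to be free
candidates = {
    "go_left":  [0, 1, 2],                # revisits known cells
    "go_right": [6, 7, 8, 9],             # explores uncertain cells
}
best = max(candidates, key=lambda c: expected_info_gain(occupancy, candidates[c]))
print("selected trajectory:", best)       # expected: go_right
```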

Vision Tracking

Active target tracking aims to reduce uncertainty about the state of a dynamic target by planning the trajectory of a sensing robot that gathers information about it. The difficulty of the general problem lies in predicting the future target state, optimizing the sensing robot’s trajectory under a limited Field of View (FoV), and accounting for the stochasticity of both the target motion and the sensor observations. The goal of this project is to develop reliable methods for predicting the future target state and to design a control policy that optimizes the robot’s motion to maximize the acquired information under a limited FoV with occlusions. We pursue both model-free and model-based methods to obtain a robust, implementable control policy from a reasonable amount of data.
Relevant publications: [ICRA’25] [L4DC’23] [ICRA’23]
Relevant project page: https://existentialrobotics.org/VisibilityControl/
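The predict-then-plan loop described above can be illustrated with a short Python sketch: a constant-velocity Kalman filter predicts the target, and a discrete set of candidate headings is checked against a cone-shaped FoV around the sensing robot. The motion model, sensor cone, and candidate set are toy assumptions and do not reproduce the methods in the publications above.

```python
# Hedged sketch of one step of active target tracking with a limited FoV:
# predict the target with a constant-velocity Kalman filter, then keep the
# candidate headings that place the predicted target inside the sensor cone.

import numpy as np

dt = 0.5
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])   # constant-velocity target model
Q = 0.05 * np.eye(4)                            # process noise covariance

def predict_target(x, P):
    """One-step Kalman prediction of the target state and covariance."""
    return F @ x, F @ P @ F.T + Q

def in_fov(robot_pos, heading, point, half_angle=np.deg2rad(30), max_range=5.0):
    """Check whether a point lies inside a cone-shaped field of view."""
    d = point - robot_pos
    r = np.linalg.norm(d)
    if r < 1e-9 or r > max_range:
        return False
    bearing = np.arctan2(d[1], d[0])
    err = np.arctan2(np.sin(bearing - heading), np.cos(bearing - heading))
    return abs(err) <= half_angle

# One planning step: predict where the target will be, then pick headings
# (from a discrete candidate set) that keep the predicted position in view.
x = np.array([3.0, 1.0, 0.5, 0.2])   # target state [px, py, vx, vy]
P = np.eye(4)
x_pred, P_pred = predict_target(x, P)

robot_pos = np.zeros(2)
candidates = np.deg2rad(np.arange(0, 360, 45))
good = [np.rad2deg(h) for h in candidates if in_fov(robot_pos, h, x_pred[:2])]
print("candidate headings keeping the predicted target in view:", good)
```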

Safe Autonomy

A central challenge in deploying autonomous robots in practice is guaranteeing the safety of the executed plans. Global information about the environment is hard to acquire in unknown, unstructured spaces; instead, the robot’s on-board sensors provide perception of local information only. We pursue robust safe navigation that reaches a goal without collisions under uncertain robot and environment models, using only local information. The uncertain models can be identified online by combining safe navigation with dynamics learning based on deep neural networks. We demonstrate these control and learning algorithms in hardware experiments on ground robots and quadrotors.
Workshops: [ICRA’25] [ICRA’21]
Relevant publications: [ICRA’25] [L-CSS/ACC’23]
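As a hedged illustration of the local-information safety idea, the sketch below minimally modifies a nominal go-to-goal velocity so that a distance-based barrier condition keeps a single-integrator robot away from a sensed obstacle. The dynamics, the barrier, and all parameters are simplified assumptions, not the controllers developed in the publications above.

```python
# Minimal sketch of a safety-filtered go-to-goal controller, assuming
# single-integrator dynamics and a single obstacle sensed on board.

import numpy as np

def safe_control(pos, goal, obstacle, r_safe=1.0, alpha=1.0, k_goal=0.8):
    """Minimally modify a go-to-goal velocity to satisfy dh/dt >= -alpha * h."""
    u_des = k_goal * (goal - pos)            # nominal go-to-goal command
    d = pos - obstacle
    h = d @ d - r_safe**2                    # barrier value: h >= 0 means safe
    a, b = 2.0 * d, -alpha * h               # linear constraint  a @ u >= b
    if a @ u_des >= b:
        return u_des                         # nominal command is already safe
    # Closed-form least-squares projection onto the single linear constraint.
    return u_des + (b - a @ u_des) / (a @ a) * a

# Simulate: the robot detours around an obstacle on the way to the goal.
pos = np.array([0.0, 0.0])
goal, obstacle = np.array([4.0, 0.0]), np.array([2.0, 0.4])
dt, clearances = 0.1, []
for _ in range(150):
    pos = pos + dt * safe_control(pos, goal, obstacle)
    clearances.append(np.linalg.norm(pos - obstacle))
print("final position:", pos.round(2), "| min clearance:", round(min(clearances), 2))
```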