My work focuses on delivering perception and navigation systems that remain dependable when sensing is sparse, noisy, or degraded, especially in GPS-denied environments.
My research spans machine-learning-based sensor calibration, radar-centric perception, and uncertainty-aware multi-sensor fusion. I have experience developing simulation-heavy autonomy pipelines and scalable dataset workflows, and translating those efforts into deployment-ready modules and clean, maintainable codebases.
A central theme of my work is the integration of classical estimation and control with modern learning-based methods. By pairing structured GNC principles with neural models, I aim to improve robustness, interpretability, and real-world performance under challenging operational conditions.
I am currently a Research Engineer at the Autonomous Vehicle Laboratory (REEF) at the University of Florida, where I develop learning-based calibration and fusion systems to enhance navigation reliability in GPS-denied environments. My work bridges robotics, perception, and learning-based navigation, with a particular focus on radar-only localization, multi-sensor fusion, and resilient autonomy for both aerial and ground platforms.
Previously, I contributed to research at the GAMMA Lab and the Bio-Imaging & Machine Vision Lab at the University of Maryland, working on VR-based driving simulation, trajectory prediction, and computer vision for underwater robotic systems. These experiences shaped my interest in autonomy that must operate beyond controlled settings.
Ultimately, I care about building autonomous systems that work outside ideal conditions: low visibility, harsh environments, and tight computational budgets, where reliability is not a feature but the product itself.

