Hi! My name is
Dew Bhaumik
I enjoy insightful conversations, inspiring others, and bringing futuristic ideas to life!
Hey! I’m Dew, a Mechatronics Engineering student at the University of Waterloo. My goal in life is to change the world through innovation and futuristic technology. Aside from robotics and autonomous technology, my interests range from travelling to diverse countries, learning about various cultures, and sharing stories with passionate individuals, to my obsession with playing and discussing sports such as soccer, basketball, and MMA!
My love for robotics has led me to learn skills such as C++, Python, ROS2, OpenCV, and PyTorch. I’ve gained experience in fields such as Perception/Computer Vision, State Estimation, Motion Planning, and Controls through personal projects, design teams, and six internships at some super cool companies!
For my most recent internship, I joined Tesla Optimus on the Navigation team in Palo Alto, California. As a Robotics Software Engineer, I worked on advancing the mapping and localization stack for humanoid robots. My projects included enhancing SLAM/VIO pipelines to achieve sub-centimeter pose estimation, integrating modern neural feature descriptors like ALIKED and LightGlue, and improving global bundle adjustment with Ceres. I also aligned point clouds to generate sparse feature maps and occupancy grids for relocalization, and transitioned VO to VIO with IMU preintegration and online bias calibration, significantly reducing absolute trajectory error (ATE) and drift. It was an incredible experience seeing my algorithms contribute directly to the future of humanoid robotics.
Before that, I spent four months in Tokyo at Reazon Human Interaction Lab for my fifth internship, where I focused on building robust 6-DoF tracking for humanoid robots. I engineered adaptive Madgwick filters for multi-IMU fusion, achieving sub-degree orientation accuracy, and performed fisheye camera intrinsic calibration alongside AprilTag-based PnP extrinsic calibration. I then combined vision, IMU, and joint encoder data with an EKF to produce accurate, low-latency hand and finger pose estimates. This work enabled the lab’s humanoid platforms to perform precise manipulation, an amazing chance to bring perception and state estimation research into the real world.
For my fourth internship, I worked at Noah Medical, a surgical robotics company focusing on robotic bronchoscopy and medical imaging. As a Computer Vision & Navigation Software Engineer on the Imaging & Navigation team, I worked on localization/state estimation and vision-based navigation with sensor fusion and machine learning!
For my third internship, I worked at Impossible Metals on the Underwater Robotics team as a Robotics Software Engineer, focusing on Motion Planning and Controls. During that time, I was actively involved in advancing control systems: pioneering a Model Predictive Control system for AUVs, tuning a robust PID control system, and engineering a ROS2-based D* Lite motion planning algorithm!
For my second work term, I interned at Canadensys Aerospace on the Lunar Rover Team. As a GNC Robotics Software Engineer, I developed an obstacle detection/avoidance algorithm for Canada's first-ever lunar rover. NASA will launch this rover to the Moon in 2026!
My first work term was at Mission Control Space Services, a space robotics company. In Spring 2023, our work landed on the Moon and we became the first company in the world to demonstrate deep learning AI in lunar orbit!