Experience

Navigation Software Intern

May 2024 - Present

My most recent internship took place at Noah Medical, a pioneering company specializing in innovative surgical robotics. At Noah Medical, I was immersed in cutting-edge projects focused on enhancing the surgical precision of bronchoscopy procedures. As part of the Imaging & Navigation team, I engineered a robust IMU sensor fusion algorithm to improve the localization and state estimation of the state-of-the-art GALAXY surgical robot.


To develop this algorithm, I fused data from the IMU's accelerometers and gyroscopes, improving the accuracy of the robot's 3D pose estimates during procedures. In another project, I used deep learning models and point cloud registration algorithms to implement vision-based navigation and SLAM (Simultaneous Localization and Mapping) for real-time mapping and localization.
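
The production algorithm is proprietary, so purely as an illustration of the core idea, here is a minimal complementary filter that blends gyroscope integration with an accelerometer gravity reference to estimate pitch. The names, constants, and axis convention are my own assumptions; a clinical system would use a far more sophisticated estimator.

```python
import numpy as np

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One update step of a complementary filter for pitch estimation.

    pitch:     previous pitch estimate (rad)
    gyro_rate: angular rate about the pitch axis from the gyroscope (rad/s)
    accel:     (ax, ay, az) accelerometer reading (m/s^2)
    alpha:     blend factor; gyro dominates short-term, accel corrects drift
    """
    ax, ay, az = accel
    # Gravity direction gives an absolute (but noisy) pitch reference
    # (a common x-forward axis convention is assumed here).
    pitch_accel = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    # Integrating the gyro gives a smooth (but drifting) short-term estimate.
    pitch_gyro = pitch + gyro_rate * dt
    # Blend: high-pass the gyro, low-pass the accelerometer.
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```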


Overall, my experience at Noah Medical was incredibly rewarding. I am grateful for the opportunity to work on transformative technologies in surgical robotics and contribute to innovations that are advancing the field of medical technology.


Robotics Software Engineer Intern

September 2023 - December 2023

During my third internship, as a Robotics Software Engineer at Impossible Metals on the Underwater Robotics team, I had the incredible opportunity to work on groundbreaking projects in the realm of autonomous underwater vehicles (AUVs). At Impossible Metals, I played a pivotal role in advancing our Eureka II robot, focusing on precision control, stability improvement, motion planning, driver code, and general system software.


During my time here, I worked on developing a Model Predictive Control (MPC) system for the AUV, leveraging a linearized 6-DOF dynamic system model and a quadratic cost function for precise linear and angular position and velocity control. In parallel, I delved into PID control, fine-tuning a robust system that integrated 12 PID controllers for the same linear and angular position and velocity loops. Using established tuning methodologies, including Cohen-Coon and Ziegler-Nichols, I was able to significantly improve the robot's control system. These enhancements not only optimized the AUV's performance but also showcased our dedication to excellence in control system design.
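
The flight controllers themselves are internal to Impossible Metals, but as a minimal sketch of the pattern, here is a single PID loop plus the classic Ziegler-Nichols gain formulas. All names and gains here are illustrative assumptions, not values from the real system.

```python
class PID:
    """Single-axis PID controller; the Eureka II system described above
    used 12 such loops (position and velocity across the DOF)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def ziegler_nichols(ku, tu):
    """Classic Ziegler-Nichols PID gains from the ultimate gain Ku and
    the oscillation period Tu observed during closed-loop tuning."""
    kp = 0.6 * ku
    return kp, 1.2 * ku / tu, 0.075 * ku * tu
```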


Working alongside a team of dedicated engineers at Impossible Metals, I found immense satisfaction in contributing to projects that push the boundaries of robotics innovation. This experience has not only expanded my technical skill set but also deepened my appreciation for the impactful intersection of robotics and real-world challenges in underwater nodule collection. I am truly grateful for the opportunity to be part of such a visionary team, driving advancements in autonomous underwater vehicles.

Robotics Software Engineer Intern

January 2023 - April 2023

My second internship was at an extraordinary aerospace company called Canadensys, which develops innovative projects ranging from spaceflight cameras to lunar rovers. On the Lunar Rover Team, I developed an obstacle detection/avoidance algorithm for Canada's first ever lunar rover!


To design and program an obstacle avoidance algorithm for this rover, I used C++ and technologies such as the Sentis-ToF-M100 3D LiDAR, OpenCV, and the Eigen C++ library. I developed median filtering, Gaussian filtering, and bilateral filtering algorithms to remove random noise from the LiDAR data. Using least-squares plane fitting, I performed linear regression on XYZ LiDAR data read in from CSV files to estimate the ground plane, then applied rotation matrices and translations to transform points from the ground's reference frame into the LiDAR's reference frame. Based on their heights and the slopes of their surfaces, I detected obstacles and performed semantic segmentation on the LiDAR images to classify terrain features such as boulders and ditches. I also established a TCP server-client system for image transfer between NISA spaceflight cameras, and researched path planning algorithms for obstacle avoidance such as Artificial Potential Field, A* search, Bug 1 & 2, and RRT.
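
The rover code was written in C++ with Eigen, but the ground-plane step translates naturally into a short numpy sketch: fit z = ax + by + c by least squares, then flag points by their signed height above the plane. This is a simplified illustration under my own assumptions, not the flight implementation.

```python
import numpy as np

def fit_ground_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to an Nx3 array
    of LiDAR points (the plane-fitting step described above)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    return a, b, c

def heights_above_plane(points, a, b, c):
    """Signed perpendicular height of each point above the fitted plane;
    points well above it are boulder candidates, well below it ditches."""
    residual = points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)
    return residual / np.sqrt(a**2 + b**2 + 1.0)
```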


Overall, I really enjoyed my experience at Canadensys. I'm really thankful and fortunate I had the opportunity to work alongside brilliant engineers on historical projects like developing obstacle avoidance for Canada's first ever lunar rover!

Robotics Software Engineer Intern

May 2022 - August 2022

My first ever internship was at an incredible space robotics company called Mission Control, and what an experience it was! At Mission Control, I was fortunate to work alongside some of the most brilliant and passionate engineers I've ever met. Our work has been launched to the moon, and in Spring 2023 Mission Control was set to become the first company in the world to demonstrate artificial intelligence on the moon!


During my time at Mission Control, I was lucky to have the opportunity to work on absolutely unreal tech! I participated in a Lunar Mapping Mission where I used a LiDAR sensor and a ZED 2 camera's visual odometry to perform SLAM, creating 2D and 3D maps of a 4,000 ft² moon-yard with RTAB-Map. I also used OpenCV, C++, and a hazard detection algorithm to identify hazards in the rover's view and autonomously stop the rover within 1 m of several hazards with 89% accuracy. Additionally, I used C++ and Eigen to compute the axis of rotation for a PTZ camera, and implemented UR16e robot arm depth and spatial perception by integrating ZED Mini and AXIS stereo cameras into the arm with C++.
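
The PTZ axis-of-rotation work was done in C++ with Eigen; one standard way to recover a rotation axis, shown here in numpy purely as an illustrative stand-in for whatever method the real code used, is to read it off the antisymmetric part of the rotation matrix.

```python
import numpy as np

def rotation_axis(R):
    """Recover the rotation axis of a 3x3 rotation matrix R from its
    antisymmetric part (valid away from 0 and 180 degree rotations)."""
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]])
    return axis / np.linalg.norm(axis)
```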


I genuinely believe that working at Mission Control was the best first work term I could have asked for. I was able to not only discover my love for robotics and autonomous tech and learn various technical skills, but also meet incredible people who are friendly, intelligent, and passionate. Mission Control is a company that I won't forget because of the unbelievable opportunity and experience it gave me. It's a company I'm very thankful for and will recommend it to anyone.

Perception & Localization Engineer

May 2022 - December 2023

WATonomous is a creative student design team at the University of Waterloo with the ambitious goal of developing a fully autonomous Level 5 self-driving vehicle. In 2021, WATonomous placed 2nd in the SAE International AutoDrive Challenge! On WATonomous, I've been a member of the Perception and State Estimation teams.


On the Perception Team, I performed Multi-Modal Object Detection by implementing the BEVFusion sensor fusion algorithm. This algorithm fused 2D images from stereo cameras with 3D point cloud data from LiDAR to create 3D bounding boxes used for detecting pedestrians, vehicles, and traffic signs. Additionally, I developed a ROS2 node in Python that subscribed to ROS2 topics containing 2D images and 3D LiDAR point cloud data from the NuScenes dataset. After performing BEVFusion with PyTorch, this node published bounding box predictions to the detections_3d topic.
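
As a rough skeleton of how such a node wires together (the camera and LiDAR topic names and the model call are placeholders I've invented; only the detections_3d topic comes from the project itself):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, PointCloud2
from vision_msgs.msg import Detection3DArray

class BEVFusionNode(Node):
    """Skeleton of the detection node described above; run_bevfusion()
    stands in for the PyTorch BEVFusion forward pass."""

    def __init__(self):
        super().__init__('bevfusion_node')
        self.latest_image = None
        self.create_subscription(Image, '/camera/image', self.on_image, 10)
        self.create_subscription(PointCloud2, '/lidar/points', self.on_cloud, 10)
        self.pub = self.create_publisher(Detection3DArray, 'detections_3d', 10)

    def on_image(self, msg):
        self.latest_image = msg  # cache the most recent camera frame

    def on_cloud(self, msg):
        if self.latest_image is None:
            return  # wait until both modalities are available
        self.pub.publish(self.run_bevfusion(self.latest_image, msg))

    def run_bevfusion(self, image, cloud):
        # Placeholder for the model call that turns the fused camera and
        # LiDAR inputs into 3D bounding box predictions.
        return Detection3DArray(header=cloud.header)

def main():
    rclpy.init()
    rclpy.spin(BEVFusionNode())

if __name__ == '__main__':
    main()
```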


On the State Estimation team, I developed a ROS2 node in C++ that publishes ego vehicle pose estimates from the NovAtel OEM7 GNSS sensor to a localization node. The localization node performs state estimation by fusing IMU data and wheel odometry with the NovAtel OEM7 pose estimates using an Unscented Kalman Filter. It then calculates the translation and rotation from the initial position using the wheel odometry and IMU data, estimates the current velocity, and localizes the ego vehicle.
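
The full UKF is too involved to reproduce here, but the dead-reckoning half of the pipeline, propagating pose from wheel odometry and IMU yaw rate before the GNSS correction, reduces to a few lines. This is a simplified planar model of my own, not the team's actual motion model:

```python
import numpy as np

def dead_reckon(pose, wheel_speed, yaw_rate, dt):
    """Propagate a planar (x, y, yaw) pose with wheel odometry and IMU
    yaw rate: the kind of prediction a UKF later corrects with GNSS
    fixes such as the NovAtel OEM7 estimates mentioned above."""
    x, y, yaw = pose
    yaw += yaw_rate * dt                 # integrate the IMU gyro
    x += wheel_speed * np.cos(yaw) * dt  # advance along the new heading
    y += wheel_speed * np.sin(yaw) * dt
    return np.array([x, y, yaw])
```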


These experiences at WATonomous have not only honed my technical prowess but also fostered a passion for pushing the boundaries of autonomous vehicle technology.

Perception Engineer

May 2022 - December 2023

UWAFT is a reputable student design team at the University of Waterloo with the goal of winning the EcoCAR Mobility Challenge, a four-year autonomous driving competition between 12 North American universities. On UWAFT, I've been a member of the Connected and Automated Vehicle (CAV) Team as a Perception Engineer.


During my time here, I had the opportunity to work on cutting-edge perception systems for autonomous driving. I developed a semantic segmentation model using a ResNet-based CNN architecture in PyTorch, enabling the classification and segmentation of complex road scenes and giving the vehicle a much richer understanding of scene semantics.
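
UWAFT's exact architecture isn't public, so as a comparable stand-in, here is how a ResNet-backed segmentation model can be set up with torchvision's FCN; the class count and input size are assumptions of mine:

```python
import torch
from torchvision.models.segmentation import fcn_resnet50

# A ResNet-backed segmentation setup using torchvision's FCN head;
# illustrative only, not the team's actual model or class list.
NUM_CLASSES = 19  # e.g., Cityscapes-style road-scene classes (assumption)

model = fcn_resnet50(weights=None, num_classes=NUM_CLASSES)
model.eval()

with torch.no_grad():
    frame = torch.randn(1, 3, 512, 1024)  # stand-in camera frame
    logits = model(frame)['out']          # (1, NUM_CLASSES, H, W)
    labels = logits.argmax(dim=1)         # per-pixel class map
```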


In parallel, I tackled state estimation and localization by integrating IMU data with LiDAR point cloud data through an Unscented Kalman Filter (UKF). This fusion proved critical to achieving reliable positioning in dynamic environments. To further refine pose estimation, I employed the Iterative Closest Point (ICP) algorithm to align 3D point clouds from successive LiDAR scans, substantially improving the system's overall robustness.
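
For reference, a single ICP iteration in its simplest form, nearest-neighbour matching followed by a closed-form (Kabsch/SVD) rigid transform, looks like this minimal numpy/scipy sketch; real pipelines add outlier rejection and convergence checks:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(source, target):
    """One Iterative Closest Point iteration on Nx3 point clouds: match
    each source point to its nearest target neighbour, then solve the
    best-fit rigid transform via the SVD (Kabsch) method."""
    matches = target[cKDTree(target).query(source)[1]]
    mu_s, mu_t = source.mean(axis=0), matches.mean(axis=0)
    H = (source - mu_s).T @ (matches - mu_t)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # fix a reflection, keep a proper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_t - R @ mu_s
    return source @ R.T + t, R, t
```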


These experiences at UWAFT have not only expanded my expertise in autonomous systems but also fueled my passion for advancing the frontiers of automotive technology.