SLAM Drone

This quadcopter integrates a Raspberry Pi companion computer for onboard autonomy and performs visual SLAM as part of Arkane Works. As an Autonomous Systems Engineer at Anoop Singh Robotics, I developed a 250 mm carbon fiber quadcopter for vision-based SLAM, integrating a Raspberry Pi 4, an Arducam IMX708 camera, a SpeedyBee F405 flight controller, and ArduPilot firmware for autonomous waypoint navigation and obstacle avoidance.
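
As a rough illustration of how a Raspberry Pi companion computer can command ArduPilot over MAVLink, the sketch below sends a guided takeoff and one waypoint with pymavlink. The serial device, baud rate, target altitude, and coordinates are placeholder assumptions for the sketch, not values from the actual build.

```python
from pymavlink import mavutil

# Connect to the flight controller (serial device and baud rate are assumptions;
# e.g. the Raspberry Pi UART wired to the SpeedyBee F405 telemetry port)
master = mavutil.mavlink_connection('/dev/serial0', baud=921600)
master.wait_heartbeat()                      # wait until ArduPilot is talking

master.set_mode('GUIDED')                    # ArduPilot guided mode
master.arducopter_arm()
master.motors_armed_wait()

# Take off to 3 m (MAV_CMD_NAV_TAKEOFF: param7 is the target altitude)
master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0,
    0, 0, 0, 0, 0, 0, 3.0)

# Fly to a placeholder waypoint (lat/lon scaled by 1e7, altitude relative to home)
master.mav.set_position_target_global_int_send(
    0, master.target_system, master.target_component,
    mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
    0b0000111111111000,                      # type_mask: use position fields only
    int(-35.3632621 * 1e7), int(149.1652374 * 1e7), 3.0,
    0, 0, 0, 0, 0, 0, 0, 0)
```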

Implemented EKF and particle filter algorithms in Python and MATLAB for real-time pose estimation and mapping, fusing IMU and camera data to minimize drift. Prototyped a desktop SLAM rig with an Arduino and an ultrasonic sensor, using a 2D EKF in Python for landmark-based localization (sketched below).
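
A minimal sketch of the 2D EKF idea behind the desktop prototype, assuming a unicycle motion model and range-only measurements to a known landmark; the noise levels, landmark position, and sensor readings are illustrative values, not the project's actual parameters.

```python
import numpy as np

def predict(x, P, v, w, dt, Q):
    """EKF prediction: propagate pose [x, y, theta] through the motion model."""
    theta = x[2]
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def update_range(x, P, z, landmark, R):
    """EKF update with a single range measurement to a known landmark."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    r = np.hypot(dx, dy)                      # predicted range
    H = np.array([[-dx / r, -dy / r, 0.0]])   # measurement Jacobian (1x3)
    y = np.array([z - r])                     # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain (3x1)
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Toy usage with assumed noise levels and a single landmark
x = np.array([0.5, 0.0, 0.0])                 # initial pose guess
P = np.eye(3) * 0.1
Q = np.diag([0.01, 0.01, 0.005])              # process noise
R = np.array([[0.04]])                        # ultrasonic range variance (~0.2 m std)
landmark = np.array([2.0, 1.0])

x, P = predict(x, P, v=0.2, w=0.05, dt=0.1, Q=Q)
x, P = update_range(x, P, z=1.85, landmark=landmark, R=R)
print("pose estimate:", x)
```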

Applied core SLAM principles: IMU fusion for dead reckoning, Kalman filtering for sensor-noise reduction, and barometer integration for vertical pose estimation to support autonomous hover and homing. Drew on flight dynamics, control (PID tuning), and embedded systems, linking the work to aerospace uses such as drone deployment and guided landing in rocketry.
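
To make the barometer/IMU fusion concrete, here is a minimal 1D Kalman filter sketch for vertical state estimation: IMU vertical acceleration drives the prediction (dead reckoning) and a noisy barometer altitude corrects the drift. The loop rate and noise covariances are assumptions for illustration only.

```python
import numpy as np

dt = 0.02                                   # assumed 50 Hz estimation loop
F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [altitude, vertical velocity]
B = np.array([[0.5 * dt**2], [dt]])         # acceleration input matrix
H = np.array([[1.0, 0.0]])                  # barometer measures altitude only
Q = np.diag([1e-4, 1e-3])                   # assumed process noise
R = np.array([[0.25]])                      # assumed barometer variance (~0.5 m std)

x = np.zeros((2, 1))                        # start on the ground, at rest
P = np.eye(2)

def step(x, P, accel_z, baro_alt):
    """One predict/update cycle fusing IMU acceleration with a barometer reading."""
    # Predict: integrate vertical acceleration (dead reckoning)
    x = F @ x + B * accel_z
    P = F @ P @ F.T + Q
    # Update: correct the accumulated drift with the barometer altitude
    y = np.array([[baro_alt]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy hover scenario: small upward acceleration with noisy barometer readings
for accel_z, baro_alt in [(0.4, 0.1), (0.3, 0.3), (0.1, 0.5)]:
    x, P = step(x, P, accel_z, baro_alt)
print("altitude %.2f m, climb rate %.2f m/s" % (x[0, 0], x[1, 0]))
```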

Figure: CAD design of the SLAM drone frame. Figure: EKF simulation plot.

Designed the frame and mounts in Fusion 360, handled electrical wiring and soldering, tuned the propulsion system (brushless motors and LiPo batteries), and iterated through prototyping and testing. The project demonstrates autonomous navigation techniques relevant to aerospace applications.