Details of my current experiments and projects can be viewed below.
18 November 2015
Wil Selby
The purpose of this project was to develop a basic autopilot utilizing the ArduCopter software libraries, an APM 2.5 board, and a 3DR Quad-C Frame. This video tutorial briefly covers the following topics:
- Quadrotor System Modeling, describing the coordinate system used to develop the model and the derived non-linear equations of motion.
- Model Estimation and Verification, including modeling the accelerometer and gyroscope, modeling the motor system, and estimating the unknown model parameters using measurements from a Vicon motion capture system.
- Control System Design and Simulation, describing the nested position, attitude, and angular rate PID control systems as well as the simulation of the autopilot based on the estimated system model.
- Autopilot Implementation, using the ArduCopter libraries, an APM 2.5 board, and a 3DR quadrotor frame.
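The nested control structure above can be sketched in a few lines: a position loop commands a desired attitude, an attitude loop commands a desired angular rate, and a rate loop produces the motor command. This is only an illustrative single-axis sketch; the gains and time steps are placeholders, not the values estimated for the ArduCopter.

```python
class PID:
    """Minimal PID controller (illustrative sketch; gains below are
    placeholders, not the tuned ArduCopter autopilot values)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Nested loops: position error -> desired attitude -> desired angular
# rate -> motor command. Inner loops typically run at higher rates.
position_pid = PID(kp=1.0, ki=0.0, kd=0.2, dt=0.01)
attitude_pid = PID(kp=4.0, ki=0.1, kd=0.0, dt=0.01)
rate_pid = PID(kp=0.1, ki=0.02, kd=0.0, dt=0.0025)

desired_angle = position_pid.update(setpoint=1.0, measurement=0.0)
desired_rate = attitude_pid.update(desired_angle, measurement=0.0)
motor_cmd = rate_pid.update(desired_rate, measurement=0.0)
```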
This first experiment demonstrated the ability of the ArduCopter to accurately follow pre-programmed GPS waypoints. The ArduCopter was initially flown manually and then switched into auto mode. Once in auto mode, the ArduCopter followed the GPS waypoints, indicated on the map by green dots. The dotted circles indicate the acceptance radius around each waypoint; once inside this circle, the ArduCopter was considered to have reached that waypoint.
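The "inside the circle" test amounts to comparing the great-circle distance between the vehicle and the waypoint against an acceptance radius. A minimal sketch of that check, assuming a haversine distance and a hypothetical 2 m radius (the actual ArduCopter radius is a configurable parameter):

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    R = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))


def reached_waypoint(pos, waypoint, radius_m=2.0):
    """True once the vehicle is inside the waypoint's acceptance circle."""
    return haversine_m(pos[0], pos[1], waypoint[0], waypoint[1]) <= radius_m
```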
In this experiment, a small Unmanned Aircraft System (UAS), represented by the ArduCopter, is used to create a photomosaic image of a user-defined region of interest. This enables a small UAS with a limited imagery payload to act as a wide-area aerial surveillance platform, which is useful when the target is time sensitive and larger imagery platforms are unavailable.
To begin, the user creates a polygon to bound the region of interest. This region is populated with GPS waypoints, and the GPS waypoint mission is sent to the ArduCopter's autopilot. A GoPro Hero 3 is attached to the quadrotor and oriented towards the ground. It is programmed to take pictures at 2-second intervals. Once the UAS has completed the GPS mission, the images are downloaded from the GoPro.
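Populating the region with waypoints typically means generating a back-and-forth "lawnmower" sweep so the camera footprints overlap. A minimal sketch for a rectangular region (the real mission used a user-drawn polygon; the function name and grid spacing here are illustrative assumptions):

```python
def grid_waypoints(lat_min, lat_max, lon_min, lon_max, rows, cols):
    """Generate a boustrophedon (lawnmower) sweep of GPS waypoints
    covering a rectangular region of interest.

    Alternate rows are reversed so consecutive waypoints form one
    continuous path with no long diagonal transits."""
    waypoints = []
    for i in range(rows):
        lat = lat_min + (lat_max - lat_min) * i / max(rows - 1, 1)
        lons = [lon_min + (lon_max - lon_min) * j / max(cols - 1, 1)
                for j in range(cols)]
        if i % 2 == 1:  # reverse every other row
            lons.reverse()
        waypoints.extend((lat, lon) for lon in lons)
    return waypoints
```

With the camera firing every 2 seconds, the row spacing and flight speed together determine how much adjacent images overlap, which matters for stitching the photomosaic.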
ArduCopter Sensor Fusion Utilizing Augmented Reality
July 15, 2013
This experiment was a proof-of-concept demonstration of the utility of augmented reality. GPS waypoints were used to simulate the location of a point of interest determined by some other sensor type. The points could be pre-programmed or updated in real time depending on the mission.

As a GPS location came into view, a visual representation of the point was displayed on the live video. This allows someone monitoring the video sensor to keep a desired target in view without needing to focus on an external source of information. Additionally, telemetry data such as the ArduCopter's speed, heading, altitude, and battery life are displayed on the video in real time.

The program OSDLite was used to read the MAVLink messages from the ArduPilot and overlay them on the video. The video was taken by a GoPro Hero 3 on board the ArduCopter and transmitted via a 900 MHz, 200 mW wireless video system. An EasyCap D60 was used to read the component video output of the video receiver into the computer's USB port.
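Drawing a point of interest at the right place in the live video reduces to two steps: convert the GPS offset between the vehicle and the point into local meters, then project that point through a pinhole camera model into pixel coordinates. The sketch below illustrates the idea only; the function names, fixed camera orientation, and intrinsic parameters are assumptions, whereas a real system would use the calibrated GoPro intrinsics and the vehicle's live attitude.

```python
import math


def gps_to_local_m(ref_lat, ref_lon, lat, lon):
    """Approximate east/north offset (meters) of a point of interest
    relative to the vehicle, via an equirectangular approximation
    (adequate over the short ranges of a small-UAS flight)."""
    R = 6371000.0
    east = math.radians(lon - ref_lon) * R * math.cos(math.radians(ref_lat))
    north = math.radians(lat - ref_lat) * R
    return east, north


def project_to_frame(east, north, down, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Pinhole projection onto a 640x480 image plane, assuming the camera
    looks due north with x=east, y=down, z=north. The intrinsics
    (fx, fy, cx, cy) are hypothetical placeholders.

    Returns pixel coordinates, or None if the point is behind the camera."""
    if north <= 0:
        return None
    u = cx + fx * east / north
    v = cy + fy * down / north
    return u, v
```

The same overlay surface can also carry the telemetry readout: OSDLite's role was essentially to decode the MAVLink stream and render text like speed and altitude at fixed screen positions on each video frame.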