
Experiments

Details of my current experiments and projects can be viewed below.

 


Custom Quadrotor Autopilot Development
November 18, 2015
Wil Selby

The purpose of this project was to develop a basic autopilot using the ArduCopter software libraries, an APM 2.5 board, and a 3DR Quad-C frame. The video tutorial briefly covers the following topics:
- Quadrotor System Modeling, describing the coordinate system used to develop the model and the derived non-linear equations of motion.
- Model Estimation and Verification, including modeling the accelerometer, gyroscope, and motor system, and estimating the unknown model parameters using measurements from a Vicon motion capture system.
- Control System Design and Simulation, describing the nested position, attitude, and angular rate PID control loops (see the sketch below) as well as simulation of the autopilot based on the estimated system model.
- Autopilot Implementation using the ArduCopter libraries, an APM 2.5 board, and a 3DR quadrotor frame.
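As a rough illustration of the nested control structure described above, the following Python sketch cascades position, attitude, and angular rate PID loops for a single axis. It is not the ArduCopter implementation; the gains, output limits, and loop structure below are placeholder values for illustration only.

```python
# Minimal sketch of a nested (cascaded) PID structure for one axis.
# Illustrative only -- not the ArduCopter code; gains and limits are placeholders.

class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, out))


# Outer loop: position error -> desired attitude (tilt angle) command.
# Middle loop: attitude error -> desired angular rate command.
# Inner loop: angular rate error -> motor torque/throttle adjustment.
pos_pid  = PID(kp=0.5,  ki=0.0,  kd=0.1,   out_limit=0.35)  # rad of tilt
att_pid  = PID(kp=4.5,  ki=0.0,  kd=0.0,   out_limit=2.0)   # rad/s
rate_pid = PID(kp=0.15, ki=0.05, kd=0.004, out_limit=1.0)   # normalized torque

def control_step(pos_setpoint, pos, angle, rate, dt):
    angle_cmd  = pos_pid.update(pos_setpoint - pos, dt)   # position loop
    rate_cmd   = att_pid.update(angle_cmd - angle, dt)    # attitude loop
    torque_cmd = rate_pid.update(rate_cmd - rate, dt)     # angular rate loop
    return torque_cmd
```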


ArduCopter Derby Circle Waypoint Mission
May 19, 2013

Wil Selby

This first experiment demonstrated the ability of the ArduCopter to accurately follow pre-programmed GPS waypoints. The ArduCopter was flown manually at first and then switched into auto mode. Once in auto mode, the ArduCopter followed the GPS waypoints indicated on the map by green dots. The dotted circle around each waypoint indicates its acceptance radius; once inside this circle, the ArduCopter was considered to have reached the waypoint.
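The "waypoint reached" check described above can be sketched as a simple distance test against the waypoint's acceptance radius. This is illustrative only; the 10 m radius below is an assumed example value, not the one used in the actual mission.

```python
import math

# Sketch of the waypoint-acceptance check: the vehicle is considered at a
# waypoint once it is inside the waypoint's acceptance circle.
ACCEPT_RADIUS_M = 10.0  # assumed example radius, not from the flight logs

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def waypoint_reached(vehicle_lat, vehicle_lon, wp_lat, wp_lon,
                     radius_m=ACCEPT_RADIUS_M):
    return haversine_m(vehicle_lat, vehicle_lon, wp_lat, wp_lon) <= radius_m
```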

In the video, the larger screen is the display from the ground control station (GCS). The map depicts the locations of the GPS waypoints, and the left side of the screen shows flight measurements such as roll, pitch, and yaw angles. This graphic gives a complete description of the items displayed on the GCS. For this mission, the ArduCopter loitered at each waypoint for 10 seconds before autonomously landing at the last waypoint. All of the telemetry was logged and is available for review and playback through the APM Mission Planner software. In the lower right-hand corner, the video recorded from the GoPro is displayed. For this experiment, the GoPro was mounted on the edge of the forward arm of the ArduCopter.

ArduCopter Autonomous Photomosaic
June 8, 2013

Wil Selby

In this experiment, a small Unmanned Aircraft System (UAS), represented by the ArduCopter, is used to create a photomosaic image of a user-defined region of interest. This enables a small UAS with a limited imagery payload to act as a wide-area aerial surveillance platform, which is useful when the target is time sensitive and larger imagery platforms are unavailable.

To begin, the user draws a polygon to bound the region of interest. This region is populated with GPS waypoints, and the resulting waypoint mission is sent to the ArduCopter's autopilot. A GoPro Hero 3 is attached to the quadrotor, oriented toward the ground, and programmed to take pictures at 2-second intervals. Once the UAS has completed the GPS mission, the images are downloaded from the GoPro.
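As an illustration of how a polygon region of interest could be populated with waypoints (this is not the tool actually used here), the sketch below lays out a lawnmower-style grid clipped to the polygon. The 20 m line spacing and the flat-earth degree conversion are simplifying assumptions that only hold over small survey areas.

```python
import math

# Illustrative sketch: fill a polygon region of interest with a lawnmower-style
# grid of GPS waypoints. Spacing and the meters-to-degrees conversion are
# rough assumptions, valid only for small areas.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test; polygon is a list of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):
            lat_cross = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < lat_cross:
                inside = not inside
    return inside

def survey_grid(polygon, spacing_m=20.0):
    """Cover the polygon with waypoints in alternating (lawnmower) passes."""
    lats = [p[0] for p in polygon]
    lons = [p[1] for p in polygon]
    mean_lat = sum(lats) / len(lats)
    dlat = spacing_m / 111320.0                                    # meters per degree of latitude
    dlon = spacing_m / (111320.0 * math.cos(math.radians(mean_lat)))

    waypoints, reverse = [], False
    lat = min(lats)
    while lat <= max(lats):
        row = []
        lon = min(lons)
        while lon <= max(lons):
            if point_in_polygon(lat, lon, polygon):
                row.append((lat, lon))
            lon += dlon
        waypoints.extend(reversed(row) if reverse else row)
        reverse = not reverse
        lat += dlat
    return waypoints
```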

Once the mission was complete, the image files were uploaded to a computer running Microsoft Windows 7. Using Microsoft's Image Composite Editor (ICE), the individual pictures were stitched together to create a single, large, high-resolution panoramic image of the target area; the output of this process is shown to the right. According to Microsoft's ICE website, "Microsoft Image Composite Editor is an advanced panoramic image stitcher. Given a set of overlapping photographs of a scene shot from a single camera location, the application creates a high-resolution panorama that seamlessly combines the original images." This process works well with images that contain distinct objects and structures. The software runs into trouble with featureless images, such as those of a grassy lawn, because it is difficult to identify corresponding features in adjacent images.
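For readers without access to ICE, a roughly comparable open-source stitch can be sketched with OpenCV. This is a stand-in for illustration only; the experiment itself used Microsoft ICE, and the file paths below are placeholders.

```python
# Sketch of a panoramic stitch using OpenCV instead of Microsoft ICE.
# Open-source stand-in for illustration; file names are placeholders.
import glob
import cv2

images = [cv2.imread(path) for path in sorted(glob.glob("gopro_frames/*.jpg"))]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # SCANS mode suits downward-facing imagery
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("photomosaic.jpg", panorama)
else:
    # Featureless frames (e.g., uniform grass) often fail to match here,
    # mirroring the limitation noted above.
    print(f"Stitching failed with status {status}")
```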

ArduCopter Sensor Fusion Utilizing Augmented Reality
July 15, 2013
Wil Selby

This experiment was a proof-of-concept demonstration to show the utility of augmented reality. GPS waypoints were used to simulate the location of a point of interest determined by some other sensor type. The points could be pre-programmed or updated in real time depending on the mission.

As a GPS location came into view, a visual representation of the point was displayed on the live video. This would allow someone monitoring the video sensor to keep a desired target in view without needing to focus on an external source of information. Additionally, telemetry data such as the ArduCopter's speed, heading, altitude, and battery life were displayed on the video in real time.

The program OSDLite was used to read the MAVLink messages from the ArduPilot and overlay them on the video. The video was taken by a GoPro Hero 3 on board the ArduCopter and transmitted via a 900 MHz, 200 mW wireless video system. An EasyCap D60 was used to read the composite video output of the video receiver into the computer's USB port.
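The sketch below shows one way to pull similar telemetry from a MAVLink stream with pymavlink and format it as an overlay string. It is an illustrative stand-in, not OSDLite; the connection string and the overlay layout are assumptions.

```python
# Illustrative sketch of reading MAVLink telemetry for a video overlay using
# pymavlink. Not OSDLite; connection string and overlay format are placeholders.
from pymavlink import mavutil

# Placeholder: a telemetry radio or UDP stream from the autopilot.
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

while True:
    hud = master.recv_match(type="VFR_HUD", blocking=True)             # speed, heading, altitude
    sys_status = master.recv_match(type="SYS_STATUS", blocking=False)  # battery status, if available

    battery = sys_status.battery_remaining if sys_status else -1
    overlay = (f"SPD {hud.groundspeed:4.1f} m/s  "
               f"HDG {hud.heading:3d} deg  "
               f"ALT {hud.alt:5.1f} m  "
               f"BAT {battery:3d}%")
    print(overlay)  # a real OSD would draw this string onto the video frame
```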

Panoramic Image Created Using Microsoft’s ICE

