Assignment 4 - Kinect
So you have a Parrot AR Drone 2.0, but the smartphone controls just don't feel like enough? Try the Kinect for Windows! We created an interface that lets you move the drone with your body. This control scheme has everything but the actual sensation of flying.
Kinect, Visual Studio 2012, and ARDrone Control
Kinect for Windows
We used the Kinect for Windows to track gestures that were later translated into drone commands. The supported actions include drawing a circle to start/stop the drone, moving left, moving right, moving up, moving down, and going forward and backward.
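The gesture-to-command mapping above can be sketched as a simple dispatch table. This is a minimal Python sketch with illustrative names only; the project itself used the .NET APIs described below, not this code.

```python
# Hypothetical gesture-to-command dispatch. The real project translated
# Kinect Toolbox gestures into DroneControl.Net calls from C#; the
# command strings here are illustrative placeholders.

COMMANDS = {
    "circle": "toggle_takeoff_land",  # circle starts or stops the drone
    "swipe_left": "move_left",
    "swipe_right": "move_right",
    "lift": "move_up",
    "lower": "move_down",
    "thrust_forward": "move_forward",
    "thrust_backward": "move_backward",
}

def gesture_to_command(gesture: str) -> str:
    """Translate one recognized gesture into a drone command string."""
    try:
        return COMMANDS[gesture]
    except KeyError:
        raise ValueError(f"unrecognized gesture: {gesture}")
```

The table holds exactly the seven actions listed above; anything the recognizer reports outside that set is rejected rather than sent to the drone.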
We used the Parrot AR Drone v2 for this project, the latest drone available at the time. One of its important features is self-stabilization, which makes it easier to test. The drone creates its own Wi-Fi network, and unfortunately only one controller can be connected to it at any given time.
Visual Studio, Drone Control.Net API, Kinect Toolbox
Several APIs were used to create this project. You can find the links in the resources section below.
Using the Kinect Toolbox and the AR Drone Control API, we defined 7 gestures to control the drone's movements. We found code that allowed us to control the drone's tilt and rotation angles in order to move it. On startup, the program automatically checks that the computer is connected to the drone's wireless network. Once a connection is detected, the drone reacts to the Kinect gesture feedback. The drone remains connected until the program is stopped or the wireless connection is broken.
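The connect-then-react behaviour just described can be sketched as a small control loop. This is an assumption-laden Python sketch; the `drone` and `kinect` objects and their methods are hypothetical stand-ins for the .NET APIs the project actually used.

```python
# Minimal control-loop sketch (hypothetical interfaces; the project
# used DroneControl.Net on Windows). It mirrors the behaviour above:
# wait for the Wi-Fi link, then forward recognized gestures until the
# program is stopped or the connection drops.

def control_loop(drone, kinect, stop_flag):
    # Block until the computer is on the drone's Wi-Fi network.
    while not drone.is_connected():
        drone.try_connect()
    # React to gesture feedback until stopped or the link is broken.
    while not stop_flag() and drone.is_connected():
        gesture = kinect.next_gesture()  # None when nothing was recognized
        if gesture is not None:
            drone.execute(gesture)
```

Keeping the connection check in the loop condition means a broken wireless link ends the session cleanly instead of sending commands into the void.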
[Image: Wikipedia diagram of flight terms]
To control the drone we started with the DroneControl.Net API. We had issues using the API on a virtual machine: the API looks for a direct Wi-Fi connection to the drone, and in most virtual machines the Wi-Fi is emulated to behave like a wired connection. When used on a native Windows machine, the DroneControl API allowed us to send commands and check the status of the drone.
We also used the Kinect Toolbox for gesture recognition, a shift away from the built-in Skeleton project. While the Skeleton project gave us a lot of flexibility in designing our own gestures, its recognition algorithm was painful for any gesture more complex than movement in a single direction, such as a circle. The toolbox supports complex gestures, which normally have to be recorded, but it comes out of the box with the circle gesture. For this project we needed just one complex gesture, to start and stop the drone.
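To see why a circle is harder to recognize than a single-direction swipe, here is a toy heuristic in Python. This is purely an illustration of the problem, not the Kinect Toolbox's method (the toolbox matches recorded templates): a tracked point counts as a circle when it stays near a common radius around its centroid and sweeps roughly a full turn.

```python
import math

# Toy circle-gesture heuristic (an illustrative assumption, NOT the
# Kinect Toolbox algorithm). A track of (x, y) wrist positions looks
# like a circle when the points keep a roughly constant distance from
# their centroid and the swept angle covers about one full turn.

def looks_like_circle(points, radius_tolerance=0.25):
    if len(points) < 8:
        return False
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    if any(abs(r - mean_r) > radius_tolerance * mean_r for r in radii):
        return False  # not round enough
    # Sum signed angle steps; a full circle sweeps about 2*pi radians.
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    swept = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        step = a1 - a0
        while step <= -math.pi:  # unwrap to (-pi, pi]
            step += 2 * math.pi
        while step > math.pi:
            step -= 2 * math.pi
        swept += step
    return abs(swept) > 1.8 * math.pi
```

Even this crude version needs a centroid, radii, and angle unwrapping, whereas a swipe only needs a displacement check in one direction, which is why prerecorded templates were attractive.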
The toolbox code we found had three pre-built gestures already loaded into the API: a circle, a swipe left, and a swipe right. We used the circle for the Takeoff() and Land() functions of the drone's API, and initially used the swipes to control the yaw axis. We then created our own gestures based on the existing swipe gesture by modifying the appropriate variables. Based on wrist movement, we programmed the pitch axis to tilt in response to backward and forward thrusts of the left hand, and the gaz axis (altitude) to change in response to lift and lower movements of the left hand. We then readjusted the software so the yaw axis no longer maps to a gesture; instead, the left and right swipes of the left hand map to the roll axis, which successfully moves the drone left and right.
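The final axis assignments above can be summarized in one small function. This is a Python sketch of the mapping only; the axis magnitudes and gesture names are illustrative assumptions, not values from the project.

```python
# Hypothetical mapping from left-hand gestures to flight-axis values,
# following the final scheme described above: swipes drive roll,
# forward/backward thrusts drive pitch, lift/lower drive gaz
# (altitude), and yaw is left unmapped. TILT is illustrative.

TILT = 0.25  # fraction of maximum tilt/speed applied per gesture

def gesture_to_axes(gesture: str):
    """Return (roll, pitch, gaz, yaw) for one recognized gesture."""
    axes = {
        "swipe_left":      (-TILT, 0.0, 0.0, 0.0),  # roll left
        "swipe_right":     (+TILT, 0.0, 0.0, 0.0),  # roll right
        "thrust_forward":  (0.0, -TILT, 0.0, 0.0),  # nose down, fly forward
        "thrust_backward": (0.0, +TILT, 0.0, 0.0),  # nose up, fly backward
        "lift":            (0.0, 0.0, +TILT, 0.0),  # climb
        "lower":           (0.0, 0.0, -TILT, 0.0),  # descend
    }
    return axes.get(gesture, (0.0, 0.0, 0.0, 0.0))  # hover otherwise
```

Returning all-zero axes for an unrecognized gesture makes hovering the default, which matters given how touchy the recognition turned out to be.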
We successfully connected the drone to the computer and, through Kinect gesture recognition, controlled the drone's movement in flight. The gesture recognition is touchy, however, and sometimes mis-recognizes gestures, which makes flying the drone glitchy and unpredictable.
Challenges and Limitations
The obvious challenge was the time available, as this was a shorter project. But other challenges arose that we did not anticipate. Of the many video examples we found, only a few came with code or downloads showing how they mapped gestures to drone movements. Once we had found usable code, getting the computer to connect to the drone's Wi-Fi was difficult on an Apple computer through VMware: the wireless connection the API needed was not being created inside the virtual machine, so the drone failed to connect. In the end we set the Mac aside and borrowed a Windows machine to make the drone work.
Other challenges were with controlling the drone itself. It was difficult at first to fly the drone using the manufacturer's control app for Apple or Android devices. The drone takes in a lot of parameters, and plenty of practice is needed to get the hang of it.
In the future it would be interesting to give the interface a record option for gesture recognition. Allowing users to personalize and customize the drone's controls could make the concept more appealing, and perfecting the recognition of each gesture would minimize errors when flying the drone.
Resources
ARDrone Control API on GitHub (to understand front-to-back and vice versa)
Understanding drone movements in flight (to understand drone movement)
AR Drone Kinect control (how to use the phone application)