The Force
Leyla Norooz & Darren Smith


Ever wonder how it would feel to control "The Force"? Well, we've made that possible! ...Sort of. We've given you the ability to control a toy helicopter using only the movements of your arms and body, captured by the Kinect. The information from the Kinect is relayed to the helicopter through an IR emitter connected to an Arduino Uno. Throw that old remote control away!

Video Demo

The Force from Leyla Norooz on Vimeo.


Kinect for Windows
The Kinect for Windows sensor, when used with the Kinect for Windows software developer kit, provides you with everything you need to create innovative applications.
Arduino Uno
The Arduino Uno is a microcontroller board based on the ATmega328. It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz ceramic resonator, a USB connection, a power jack, an ICSP header, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable or power it with an AC-to-DC adapter or battery to get started.
S107 Helicopter
A fun little toy helicopter that is controlled via an IR-emitting controller.
IR LED Emitter
We bought this IR LED emitter, paired with a detector, at RadioShack. For this project, we only used the emitter.
2N3904 Transistor
The transistor was used to drive more current through the IR LEDs than the Arduino alone can supply.


Controlling the Copter through Arduino

First, we concentrated on finding a way to control the copter through the Arduino. Since the copter's remote controls it via IR-emitting LEDs, we tested several IR LEDs to try controlling the copter ourselves. We realized that to drive the IR LED hard enough for longer-range pulses, we needed a transistor to switch the power going to the LEDs. The copter speaks a specific language transmitted over a 38 kHz carrier. Using some code we found online, we were able to control the copter from the Arduino.
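To give a feel for what "transmitted over a 38 kHz carrier" means in practice, here is a minimal sketch of the timing math behind a bit-banged IR carrier. The helper names are ours, not from the code we found online:

```cpp
// One carrier cycle at 38 kHz lasts ~26 us: ~13 us on, ~13 us off.
constexpr unsigned long kCarrierHz = 38000;
constexpr unsigned long kHalfPeriodUs = 1000000UL / kCarrierHz / 2; // ~13 us

// How many on/off cycles fit in an IR "mark" of the given length.
unsigned long cyclesForMark(unsigned long markUs) {
    return markUs * kCarrierHz / 1000000UL;
}

// On the Arduino, a mark would be emitted roughly like this
// (IR_PIN is a hypothetical pin constant):
//
// void mark(unsigned long us) {
//   for (unsigned long i = 0; i < cyclesForMark(us); ++i) {
//     digitalWrite(IR_PIN, HIGH); delayMicroseconds(kHalfPeriodUs);
//     digitalWrite(IR_PIN, LOW);  delayMicroseconds(kHalfPeriodUs);
//   }
// }
```

The helicopter's IR receiver only reacts to light blinking at roughly 38 kHz, which is why the LEDs are pulsed at this rate rather than simply switched on and off.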

Converting Body Positions to Commands

Next, we focused on creating gestures for the Kinect to control the copter. Using the basic skeleton example provided in the Kinect Toolkit, we were able to find the coordinates of the body. We use only the coordinates of the left and right hands and the top of the torso. Throttle is controlled by the average difference between hand height and body height. Turning is controlled by the difference between the left and right hand depths. Forward tilt is controlled by the average difference between hand depth and body depth. Each control is multiplied by a scaling constant and shifted by an offset so that the neutral body position produces a stationary hover and the full range of comfortable movement is usable.
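The mapping above can be sketched in a few lines. The joint structure, scale factors, and offsets below are illustrative assumptions, not the project's actual constants:

```cpp
#include <algorithm>
#include <cstdint>

struct Joint { float x, y, z; };            // skeleton coordinates (meters)
struct CopterCmd { uint8_t yaw, pitch, throttle; };

// Clamp a scaled control value into the helicopter's byte range.
static uint8_t clampByte(float v) {
    return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, v)));
}

CopterCmd gestureToCommand(Joint left, Joint right, Joint torso) {
    // Throttle: average hand height relative to the torso.
    float lift = ((left.y - torso.y) + (right.y - torso.y)) / 2.0f;
    // Turning: difference between left and right hand depths.
    float turn = left.z - right.z;
    // Forward tilt: average hand depth relative to the torso.
    float tilt = ((left.z - torso.z) + (right.z - torso.z)) / 2.0f;

    CopterCmd cmd;
    // Scale by an assumed gain and offset so the neutral pose hovers.
    cmd.throttle = clampByte(lift * 200.0f + 127.0f);
    cmd.yaw      = clampByte(turn * 200.0f + 127.0f);
    cmd.pitch    = clampByte(tilt * 200.0f + 127.0f);
    return cmd;
}
```

With hands at torso height and depth, all three controls come out at the midpoint, i.e. a stationary hover.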

Activation of the helicopter is done by raising your arms to full throttle. This also starts the Star Wars theme song to indicate the state. Crossing your hands stops all helicopter commands and the music. This is most useful in the event of a crash; otherwise the helicopter keeps receiving commands, flails around, and breaks itself.


Up: Hold arms above shoulder height
Down: Hold arms below shoulder height
Left: Hold your left arm close to your body and your right arm away from it
Right: Hold your right arm close to your body and your left arm away from it
Forward: Hold both arms away from your body
Back: Hold both arms close to your body

Controlling the Arduino from the PC

We use serial communication to connect to the Arduino. Our code is based on some tutorial code that establishes an initial connection. That code did not work as written, but the intent was clear. On the PC end, it scans all open COM ports, sending a special message; the Arduino waits for that message and, when it arrives, returns "Hello from Arduino." This lets the PC confirm it has found the correct device, which is handy since device discovery is fully automatic. From then on, the PC sends 5-byte messages. The first four are the same bytes sent to the helicopter (rotation, tilt, throttle, and calibration). The fifth byte is a constant, which the Arduino checks to make sure it is still in sync. Once the Arduino passes the message along to the helicopter, it sends a message back to the PC to indicate it is ready for the next command.
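The 5-byte frame can be sketched as follows. The sync value 0xAB is our placeholder; the write-up only says the fifth byte is a known constant:

```cpp
#include <array>
#include <cstdint>
#include <optional>

constexpr uint8_t kSyncByte = 0xAB;   // assumed value for illustration

// PC side: pack one command frame for the serial link.
std::array<uint8_t, 5> packFrame(uint8_t yaw, uint8_t pitch,
                                 uint8_t throttle, uint8_t trim) {
    return {yaw, pitch, throttle, trim, kSyncByte};
}

struct Command { uint8_t yaw, pitch, throttle, trim; };

// Arduino side: accept the frame only if the sync byte matches;
// otherwise assume the byte stream has drifted and drop the frame.
std::optional<Command> parseFrame(const std::array<uint8_t, 5>& f) {
    if (f[4] != kSyncByte) return std::nullopt;
    return Command{f[0], f[1], f[2], f[3]};
}
```

Checking a fixed trailing byte is a cheap way to detect misalignment: if bytes are dropped, the constant stops lining up and the receiver knows to resynchronize rather than feed garbage to the helicopter.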


Copter Range

Achieving good range when controlling the copter with the Arduino is a known problem, with typical ranges of only 1-2 meters using a standard setup of one LED, a resistor, and the Arduino. We used 3 LEDs in series to increase emission (the stock remote also has 3). We also used a 2N3904 transistor to switch power to the LEDs, since the Arduino cannot output enough current on its own (we got this idea from the Max Power IR LED kit). Finally, to achieve really good range, we shorted the 46 Ω resistor for maximal current. Normally this could damage an LED, but with 3 in series we assumed there would be some reasonable internal resistance, and since we only pulse the LEDs for milliseconds rather than leaving them on, overheating would not be a problem. This worked extremely well in practice.
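A back-of-the-envelope check shows why the resistor was the bottleneck. The 1.3 V forward drop per 940 nm IR LED is a typical datasheet figure we assume here, not a measured one:

```cpp
constexpr float kSupplyV = 5.0f;      // Arduino 5 V rail
constexpr float kLedDropV = 1.3f;     // assumed drop per 940 nm IR LED
constexpr float kResistorOhm = 46.0f; // the resistor we eventually shorted

// Current through a series LED string limited by a resistor.
float ledCurrentAmps(int ledsInSeries, float resistorOhm) {
    float headroom = kSupplyV - ledsInSeries * kLedDropV;
    return headroom > 0 ? headroom / resistorOhm : 0.0f;
}
// With the resistor in place: (5 - 3 * 1.3) / 46, roughly 24 mA --
// well below the ~100 mA pulsed rating typical IR LEDs tolerate,
// which is why shorting the resistor buys so much extra range.
```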

Serially connecting to Arduino

When sending serial data to the Arduino, we were sending it much faster than the Arduino could read it. This caused the serial buffer to overflow and start dropping data, ruining our byte alignment. Our solution was to have the Arduino send a "waiting" message to the PC once it was ready for the next command. Even though we got this working, it was a much bigger headache than we anticipated.
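The handshake can be modeled as simple stop-and-wait flow control. The names and the ready byte 'R' below are our assumptions; the queue stands in for the serial port:

```cpp
#include <deque>
#include <cstdint>

constexpr uint8_t kReady = 'R';       // assumed "waiting" marker

struct FakeLink {                      // stands in for the serial link
    std::deque<uint8_t> toArduino, toPc;
};

// PC side: send the next command only after the device signalled ready.
bool trySend(FakeLink& link, uint8_t cmd, bool& readySeen) {
    if (!readySeen) return false;      // sending now would overflow
    link.toArduino.push_back(cmd);
    readySeen = false;                 // wait for the next ready marker
    return true;
}

// Arduino side: consume one command, then report ready for the next.
void arduinoStep(FakeLink& link) {
    if (link.toArduino.empty()) return;
    link.toArduino.pop_front();        // would be forwarded to the copter
    link.toPc.push_back(kReady);
}
```

The point is that the PC never has more than one unacknowledged command in flight, so the Arduino's 64-byte serial buffer can never overflow.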

Installing Windows

Both of us had huge trouble getting the Kinect working on our Macs. DirectDraw did not work under VirtualBox or VMware, making the Kinect unusable, and Boot Camp would fail while installing Windows. Eventually we gave Parallels a shot, and it worked rather painlessly.

Debugging Kinect

Outputting to the console from the C# draw-skeleton code did not work (it is probably possible, but not obvious how). To work around this, we drew all debug information onto the same canvas as the skeleton, which took us a bit of time due to our unfamiliarity with C#. Once we did, we misused the C# String.Format function, so the Y and Z coordinates always appeared identical (actually X, Y, and Z all did, but we didn't notice that at first). We humorously misdiagnosed this bug as the skeleton example not providing Z (depth) coordinates, and frantically searched the internet for how to get a 3D skeleton!

Creating Gestures

Creating gestures did not cause too much trouble. One difficulty was that the z (depth) coordinate is distance from the Kinect, not distance in the room. If the Kinect sits lower than you, raising your arms makes them appear farther away even though you did not move them backward. Our solution was simply to physically raise the Kinect; the ideal solution would be to calibrate and transform the coordinates.
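The "ideal solution" would amount to rotating the skeleton coordinates by the sensor's tilt so that z measures depth in the room rather than distance from the camera. A minimal sketch, where the tilt angle is an assumed calibration input:

```cpp
#include <cmath>

struct Point3 { float x, y, z; };

// Rotate a Kinect-relative point about the x axis by the sensor tilt,
// leveling the coordinate frame so z no longer mixes with height.
Point3 compensateTilt(Point3 p, float tiltRadians) {
    float c = std::cos(tiltRadians), s = std::sin(tiltRadians);
    return {p.x, c * p.y - s * p.z, s * p.y + c * p.z};
}
```

With a tilt of zero the transform is the identity, and with the sensor pitched upward it removes the apparent depth change caused by raising your arms.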

We also had trouble with the helicopter responding when we did not want it to, so we created a gesture to toggle control on and off. A single toggle proved problematic due to accidental activation, so we instead created two gestures: one to turn control on and one to turn it off.

Turning the copter On

Even with the standard remote controller, the copter often (and seemingly at random) takes over 10 seconds to "lock on" to the controller signal. Once it responds, it remains responsive. This is still a mystery to us, but we observed it on all copters and controllers.

Breaking the copter

While creating the video, two of our three helicopters broke! Their lower blades would no longer spin; the copters were more fragile than we anticipated. In one case, the wire to the motor was severed (it seemed like a design flaw, as the wire was unprotected). This appeared unrepairable because the wire is very thin and was severed right where it passes through a small plastic hole to the motor. The other copter just had a gear slide out of place, which could be pushed back into position.


Fork us!