Title: Friendly Waving and Dancing Bunny


CMSC838F: Assignment 4 (11/21/12)


Zahra Ashktorab, MS Human-Computer Interaction

Allan Fong, PhD Computer Science


Video

Project Video

GitHub

https://github.com/shirini721/Kinect_

Motivation

It is the joy of every child, and quite possibly every adult, to see an animal react to a wave or gesture at the zoo. Children tirelessly wave at pandas, elephants, and other animals in the hope of a glimmer of a reaction. Zoos offer visitors this unique experience: sometimes animals do follow people, wave, or engage with them in other ways. When animals react to one's movement and gestures, the experience becomes more fun and interactive. One could even argue that a main reason people visit the zoo is to witness such interactions with the animals. With available technologies, one can recreate this anticipated reaction. For this assignment, we were inspired and motivated by this type of motion- and gesture-based interaction with animals. We built a bunny system that mimics and responds to people's movement and gestures. We wanted to build a system that is fun and interactive, where people get immediate interaction feedback and can also "discover" new gesture-based interactions.

Description

Our system is composed of three servo motors, Legos, a stuffed bunny, an Arduino, an Xbox Kinect, and Arduino and Processing code; see Figures 1 and 2. The bunny has three main features: it can track you, wave back at you, and dance if given a "high five."
IMG_4697.JPG
Figure 1: Components for the bunny system
IMG_4709.JPG
Figure 2: System set-up


First, we used Legos and two Futaba S3003 servo motors to create a bunny "skeleton" with actuating arms, see Figures 3 and 4. We chose the S3003 motors because they were small, light, and had sufficient torque (44.4 oz-in to 56.9 oz-in).
IMG_4692.JPG
Figure 3: Two servo motors to move the bunny's arms
IMG_4699.JPG
Figure 4: Legos were used as arm extensions


With this setup, we were able to make the arms do a combination of gestures. For example, we programmed the bunny to wave back to a person. The bunny will also go into a dance mode in response to a person’s “high five”. During the wave, the bunny’s right arm rotates 60 degrees. During the dance, both of the bunny’s arms rotate 90 degrees asynchronously, see Figure 5.
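The two motions are simple angle sequences. A minimal sketch of how they could be generated, in plain C++ standing in for our Arduino code (the angle values come from the text above; the step size and function names are illustrative assumptions):

```cpp
#include <vector>
#include <utility>

// One (rightArmAngle, leftArmAngle) pair per motion step, in degrees.
// Rest position is 0 degrees for both arms.
using Step = std::pair<int, int>;

// Wave: the right arm sweeps up 60 degrees and back down.
std::vector<Step> waveSteps() {
    std::vector<Step> steps;
    for (int a = 0; a <= 60; a += 15) steps.push_back({a, 0});
    for (int a = 60; a >= 0; a -= 15) steps.push_back({a, 0});
    return steps;
}

// Dance: both arms sweep 90 degrees asynchronously -- while one arm
// rises, the other falls.
std::vector<Step> danceSteps() {
    std::vector<Step> steps;
    for (int a = 0; a <= 90; a += 15) steps.push_back({a, 90 - a});
    for (int a = 90; a >= 0; a -= 15) steps.push_back({a, 90 - a});
    return steps;
}
```

On the Arduino, each step would be written to the two arm servos with a short delay between steps to control the speed of the gesture.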
motions.png
Figure 5: Bunny motions


Next we used a Hobbico CS-72 servo motor to create a rotating platform for our bunny to sit on, see Figure 6. We chose the CS-72 because it had the highest torque (131 oz-in) of any motor we could find in the lab. The base structure was made of Lego, and the rotating platform was a tempered hardboard panel, sturdy enough to support the bunny structure and light enough for the base motor, Figure 6. After testing the base motor with the bunny setup mounted on the platform, we found that we had to add more weight to the base structure to prevent it from rotating with the bunny. We added washers and an additional base plate to increase the weight of the base and the frictional forces, preventing the base from moving during actuation. Furthermore, we discovered that the platform would tilt slightly because of its uneven weight distribution. As a result, we built 4 Lego pillars connected with (paper) guide rails to help support and stabilize the platform as it rotates, see Figure 7. We also slightly "beautified" the bunny platform with fabric to make it look more inviting, see Figure 8.
IMG_4696.JPG
Figure 6: Tempered hardboard panel used as platform
IMG_4700.JPG
Figure 7: Base needed extra weights and guide rails
IMG_4713.JPG
Figure 8: "Beautification" of the platform


This setup allows us to rotate the bunny to track a person moving across its field of view. We limited the base to 60 degrees of rotation because that range provided the most realistic "tracking" response and did not add too much stress on the connecting wires.

The servo motors were connected to an Arduino Uno; see Figure 9. The output commands from the Arduino depended on the serial input it received from the Kinect and Processing code.
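The Arduino's job is to turn each incoming serial byte into a servo action. A sketch of that dispatch logic in plain C++ (the byte protocol shown here is an illustrative assumption, not our exact encoding: 200 triggers the wave, 201 triggers the dance, and any value in the 50-110 range is taken as a base angle for the tracking platform):

```cpp
#include <string>

// Interpret one byte received from Processing over the serial link.
// Command bytes are chosen outside the 50-110 angle range so the two
// message kinds cannot collide.
std::string interpretCommand(unsigned char b) {
    if (b == 200) return "wave";   // reciprocate a detected wave
    if (b == 201) return "dance";  // "high five" / click detected
    if (b >= 50 && b <= 110) return "track";  // rotate base to angle b
    return "ignore";               // anything else is noise
}
```

In the real sketch, the Arduino loop reads `Serial.available()` bytes and calls the corresponding motion routine instead of returning a string.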
circuit_schem.png
Figure 9: Circuit summary



Kinect and Processing

To detect movement, we used an Xbox 360 Kinect connected to an Apple computer. We installed OpenNI, NITE, and SensorKinect on the computer [1]. For our environment and programming language, we used Processing [3] and the available SimpleOpenNI wrapper library for Processing [2].

Detecting and recognizing gestures

The OpenNI/NITE middleware package recognizes several gestures: "wave", "raise hands", and "click". For our project we used the built-in "wave" and "click" gestures. Given our motivation, the most obvious gesture to support was the "wave", so we programmed the bunny to reciprocate a wave. We also decided that a dancing bunny is both entertaining and related to our motivation, so we implemented dancing as well. The dance is triggered when a user "clicks", a gesture similar to a "high five". Though this gesture is hidden and less intuitive, users can perceive it as a way to "unlock" a new interaction. During some demos, users were pleasantly surprised when their "high five" gesture triggered music and made the bunny dance.

Detecting a person

Using the Kinect and Processing, we were also able to track a single user moving across the room. As the Kinect tracks a person, Processing receives the tracking data and sends commands to the Arduino over the serial port. The person's position is mapped to an angle between 50 and 110 degrees, which the base motor connected to the Arduino follows. As a person moves across the room within the Kinect's range, the person's center of mass is detected and their position is sent to the bunny. Users moving across the room enjoyed this interaction because, as they moved, they got immediate feedback from the bunny moving with them.
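The position-to-angle conversion is a linear mapping, in the spirit of Processing's map() function. A sketch in plain C++ (the 640-pixel image width and the clamping behavior are assumptions for illustration; the 50-110 degree output range is from the text):

```cpp
// Map the tracked center-of-mass x coordinate (assuming the Kinect's
// 640-pixel-wide image) to a base-servo angle between 50 and 110
// degrees, clamping out-of-range input to the image edges.
int positionToAngle(int x) {
    if (x < 0) x = 0;
    if (x > 639) x = 639;
    return 50 + (x * 60) / 639;  // linear interpolation across 60 degrees
}
```

A person at the left edge of the frame drives the base to 50 degrees, at the right edge to 110 degrees, and positions in between scale linearly.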

Discussion

We were able to build an interactive bunny system that responds to people's movements and certain gestures. Although the timeframe of this assignment was shorter than that of the other assignments, we still wanted to make something fun that could potentially become part of the HCIL or Hackerspace. We set up the bunny system in the HCIL and observed the reactions of various people as they walked by. We got some positive feedback on the "cuteness" and creativity of the system. We think the bunny can be a fun and interactive part of the HCIL, especially for kids and those who are kids at heart.

Challenges and Limitations
We faced several challenges during this assignment. In our first iteration of the dancing bunny, we wanted the bunny to "twist" while moving its arms. However, several people commented that it looked "creepy", so we limited the dance to just the arms. We also had to smooth the platform's staccato rotation by adjusting the size of the position increments, but there is a trade-off between smoothness of motion and position stability. Even so, the platform rotation was not as smooth as we would have liked; this is a known issue when working with hobby servo motors.
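The smoothing approach behind that trade-off can be sketched simply: instead of commanding the servo straight to the target angle, step toward it a few degrees per update. Smaller steps look smoother but lag behind a moving person; larger steps keep up but stutter. A plain C++ sketch (the 2-degree step size is a hypothetical choice, not our tuned value):

```cpp
// Move the base angle toward the target by at most `step` degrees per
// update, instead of jumping directly -- trading responsiveness for
// smoother motion.
int stepToward(int current, int target, int step = 2) {
    if (current + step < target) return current + step;  // still rising
    if (current - step > target) return current - step;  // still falling
    return target;  // within one step: snap to target
}
```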

Another challenge was that people who walked too quickly past the Kinect would sometimes not be detected. The Kinect also had difficulty tracking a person when multiple people were in its field of view. Furthermore, the Kinect would sometimes take a few seconds to detect a person's presence, or fail to detect it at all, depending on the background noise. We did our best to work around these challenges on the Processing end and with the open resources available online [1-2].

Another difficulty we ran into when having other people try our system is that people have different styles of waves and high fives. When we told people to wave, some would wave in front of their faces, and others would wave too slowly or too quickly. As a result, their waves were not immediately detected, causing some to give up. Similarly, people's high-five gestures varied greatly and were not always recognized.

Future Work
To make our system more interactive, we could improve the wave and "high-five" detection algorithms. We could also put up a sign by the bunny illustrating the specific gestures it recognizes, though this would eliminate the fun discovery component of the project. We could also increase the number of gestures the bunny recognizes; for example, if a person raises both hands, the bunny could go into a different type of dance. We could also add degrees of freedom to the bunny's arms, allowing it to perform different arm/hand gestures or mimic people. There are endless possibilities for making this system more fun and interactive.

Links to Resources, Related Works and Additional Inspiration

[1] http://developkinect.com/resource/mac-os-x/install-openni-nite-and-sensorkinect-mac-os-x
[2] http://code.google.com/p/simple-openni/
[3] http://processing.org/
[4] Bear waving video http://www.youtube.com/watch?v=O6Xo21L0ybE
[5] "Interactive" lion https://www.youtube.com/watch?v=6fbahS7VSFs
[6] Controlling a robot arm with Kinect and Arduino http://vimeo.com/31698679
[7] Head-following Kinect Robot Arm http://vimeo.com/31739856