Auto Pan and Zoom


Motivation

The motivation for this application came from interactive displays in museums and other public places. Most such displays require the visitor to walk up, press a few buttons, and scroll through pages to read the descriptions of artifacts. What if the display automatically detected when a person was close and presented greetings and content proactively rather than reactively? The content could zoom according to the user's proximity to the display and pan according to the user's orientation. Even better, the display could dim when no visitor was nearby, to conserve power, and light up automatically when it detected someone in its vicinity. The same principle could be applied to video game interfaces, for instance to let players peer around a wall or look out of a window in a first-person shooter.


Thus, as a prototype to demonstrate our idea of an interactive display, we built an application that displays content whose font size and background color vary with the user's proximity to the display and the ambient light in the room, and that pans the content according to the user's orientation.

Hardware

autopanzoom1.jpeg
As shown in Figure 1, the system consists of four sensors (three IR distance sensors and one light sensor), an Arduino Uno, and a laptop. One IR distance sensor sits on top of the laptop and detects the user's proximity to the screen. The other two IR distance sensors, placed on the left and right sides of the laptop, detect the user's orientation with respect to the screen, i.e., whether the user's head is inclined more to the left or to the right. The light sensor measures the ambient light in the room. All four sensors are connected to analog input pins of the Arduino Uno, which relays their readings to the laptop over its serial connection. Figure 2 shows a close-up view of the Arduino circuit.
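Sharp-style IR distance sensors output an analog voltage that is nonlinear in distance, so the raw ADC values need converting before use. The helper below is a sketch of that conversion on the laptop side, using a commonly cited approximation for GP2Y0A21-class sensors read through the Uno's 10-bit ADC; the sensor model and constants are assumptions and would need calibrating against the actual hardware. The averaging helper damps the jitter these sensors exhibit.

```python
def ir_raw_to_cm(raw):
    """Approximate distance in cm from a 10-bit ADC reading of a
    Sharp GP2Y0A21-style IR sensor. The 4800/(raw - 20) curve and
    the model itself are assumptions; calibrate per sensor."""
    if raw <= 20:                 # below the usable output range
        return float("inf")
    return 4800.0 / (raw - 20)

def smooth(readings):
    """Average a short window of readings to damp sensor jitter."""
    return sum(readings) / len(readings)
```

For instance, a raw reading of 500 maps to 10 cm under this curve, and averaging a handful of consecutive readings gives a much steadier value than any single sample.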

Figure 1. Overview of the system

autopanzoom2.jpeg
Figure 2. Closeup view of the Arduino circuit

YouTube Video


How the system works

As demonstrated in the video, the text zooms in as the user approaches the laptop screen. The user can also see the entire text, which is too large to fit in the window, without scrolling, because the content pans according to the user's orientation. The background and text colors change with the ambient light in the room: when there is plenty of light, the scheme is lighter, and as the ambient light dims, the colors switch to a scheme that reads more effectively in poor light.
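The three behaviors above boil down to simple mappings from sensor values to display parameters. The sketch below shows one plausible way to write them; all thresholds, ranges, and the pan scale factor are illustrative guesses, not the values used in the project.

```python
def font_size(distance_cm, near=20.0, far=80.0, min_pt=12, max_pt=48):
    """Zoom: closer user -> larger text (thresholds hypothetical)."""
    d = min(max(distance_cm, near), far)
    t = (far - d) / (far - near)      # 1.0 at near, 0.0 at far
    return round(min_pt + t * (max_pt - min_pt))

def pan_offset(left_cm, right_cm, max_px=200):
    """Pan: shift the content based on the difference between the
    left and right IR readings (positive diff -> user nearer the
    left sensor). Scale and clamp are illustrative."""
    diff = right_cm - left_cm
    return max(-max_px, min(max_px, int(diff * 10)))

def color_scheme(light_level, threshold=300):
    """Color: pick a light or dark scheme from the raw ambient-light
    reading; the threshold is a guess to be calibrated."""
    return "light" if light_level >= threshold else "dark"
```

With these example thresholds, a user 20 cm from the screen would get 48 pt text, while at 80 cm it would shrink to 12 pt.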

Challenges

The major challenge we encountered as a team was settling on a common language and platform for this assignment; as a result, we have both Java and Python code. Additionally, the sensors themselves are not well calibrated: readings fluctuate even when the object in front of them is quite still, which makes any precise calibration on the software side difficult.

What we liked about the assignment

Overall, this assignment was fun to do and pushed us to think creatively and bring out the innovation in us.

Future Work

Since this is a prototype, it demonstrates only the basic functionality of such a system. The system could be enhanced by varying the brightness of the LCD display and showing flashy graphics according to the user's proximity. Moreover, the display could switch off when it detects that no user is nearby, rather than after a fixed timeout, to conserve power.
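The proposed power-saving behavior could reuse the same proximity reading. The sketch below maps distance to a brightness level: full brightness when a user is close, a linear dim-down with distance, and off beyond a wake-up range. All thresholds are hypothetical and would be tuned for the installation.

```python
def brightness(distance_cm, wake_cm=100.0, full_cm=30.0):
    """Proposed behavior: 1.0 (full) when the user is within full_cm,
    0.0 (off) beyond wake_cm, linear dimming in between.
    Thresholds are illustrative assumptions."""
    if distance_cm >= wake_cm:
        return 0.0                # nobody nearby: screen off
    if distance_cm <= full_cm:
        return 1.0
    return (wake_cm - distance_cm) / (wake_cm - full_cm)
```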

GitHub Code