BIO Web of Conferences 1, 00094 (2011) DOI: 10.1051/bioconf/20110100094

This paper presents a novel approach to juggling training using a sensor system based on one of the most commercially available, sophisticated and accessible input devices: the Wii Remote controller. This platform, and in particular its infrared camera, is used to develop a real-time sensor system for hand motion tracking, integrated into a virtual reality interface. The quality of our algorithm was tested through this 3D virtual interface, whose purpose is to give the user visual feedback of his/her hand positions and to create the sensation of juggling.


Introduction
Juggling is an ancient tradition. From as early as the Ancient Egyptian civilization to modern days, it has been considered a skill that involves moving objects for entertainment or sport [3]. The most traditional form of juggling is toss-juggling, in which the juggler tosses three or more balls into the air to catch and toss again. This is the most common style of juggling, but there are many others, such as the cascade, the shower and the fountain. With three balls, most neophytes attempt to juggle in the shower pattern (around in a circle), although the cascade pattern, in which the juggler's hands alternate throwing the balls to each other in a figure-eight motion, is far easier. It often takes just hours or days to learn to juggle three balls. But as practitioners refine their sense of touch and toss and add more balls, the learning time increases: it can take weeks or months for four balls, and months or a year for five [1]. Recently, researchers found that individuals who practice juggling show a transient and selective structural change in brain areas associated with the processing and storage of complex visual motion [2]. It is now well known that virtual environments can be powerful training platforms for real-life tasks. Our principal objective is the analysis and design of a cost-effective sensor system that helps teach and train the juggling essentials at the most effective technological cost: the Light Weight Juggling (LWJ) system. We chose the Wii Remote controller device (Fig. 5, photo g), impressively cost-effective at just $40 US. Among its other very favourable characteristics are its high-resolution, high-speed IR (infrared) camera and its Bluetooth connectivity [4]. We use this device to implement a sensor system based on the IR sensor to track the hand movements, integrated into VR (virtual reality), where the user can train with their hand movements replicated by an avatar. Among the advantages of the cost-effective LWJ system, the fact that it does not need any calibration to adapt to different users should also be highlighted. In this paper, we discuss the different experiments that we performed with different Wii IR sensor prototypes to find the best implementation for our system. The quality of our algorithm was tested through a 3D virtual interface, whose purpose is to give the user visual feedback of his/her hand positions and to create the sensation of juggling [11]. To create this illusion, we used two virtual hands and coloured balls in a simulator. The tosses and catches are deduced from the user's hand acceleration, velocity and position during the interaction (Fig. 1).

Light weight juggling (LWJ) sensor system design
In this study we focused on the above-mentioned IR sensor technology and developed a prototype based on the IR camera tracker as part of an LWJ system with a VR interface. The next few paragraphs identify and explain the important stages in the development of the Wii Remote platform solution for the LWJ system that we designed.

The Wii Remote as an IR sensor
Before developing the Wii Remote IR prototype, a careful analysis of the two following components was made in order to find a methodology that would not require much computational power while remaining cost-effective.

The Wii Remote infrared camera
The Wii's IR camera has proven to be excellent for tracking infrared sources. In the tip of each Wii Remote is an IR camera sensor manufactured by PixArt Imaging (Fig. 5, photo g). The camera chip features an integrated multi-object tracking (MOT) engine that provides high-resolution, high-speed tracking [4]. It tracks up to four simultaneous IR light sources independently and outputs the coordinates and strength of each tracked object. The Wii Remote camera detects sources up to 5 meters away, and its angle of detection is around 22°. If the IR light moves beyond this angle, the camera loses the IR signal. When the IR light is close to the Wii camera, its signal is stronger than when it is further away; for detection, however, the opposite holds: the signal is detected more easily when the IR light is farther from the camera.

The IR light
The IR light detected by the Wii camera can be produced by LEDs. Knowing this, it is possible to compute the position of the light in 2D or 3D. The library used for the Wii Remote provides an array of four elements to store the four lights separately. This array is shown on the left side of Figure 2; the right side of the figure shows the four signals (from two sensor bars) detected by the Wii Remote camera. When an IR signal is lost, its cell in the array becomes empty. If a new signal is found, it is added into this cell. This situation can create confusion between the IR lights, and consequently between the hand positions. One of the factors that causes this problem is the visibility angle of the LEDs [9], which varies depending on the kind of LEDs used, ranging from 30° to 50°. If an LED moves beyond this angle, it is not detected by the Wii camera.
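As an illustration, the four-slot behaviour described above, and the confusion it can cause, can be sketched as follows. The `IRSlotArray` class and its method names are illustrative only, not the actual Wiiuse API:

```python
# Sketch of the four-element IR source array described above (hypothetical
# names; the real Wiiuse library exposes a similar fixed-size structure).

class IRSlotArray:
    """Fixed array of four cells, one per trackable IR source."""

    def __init__(self):
        self.slots = [None] * 4  # None = empty cell (signal lost)

    def update(self, detections):
        """detections: list of (x, y) points currently seen by the camera."""
        # Empty the cell of any point that is no longer detected.
        seen = set(detections)
        for i, p in enumerate(self.slots):
            if p is not None and p not in seen:
                self.slots[i] = None
        # A newly found signal is added into the first empty cell, which is
        # why a reappearing light can land in a *different* cell and be
        # confused with the other hand.
        known = {p for p in self.slots if p is not None}
        for p in detections:
            if p not in known:
                try:
                    free = self.slots.index(None)
                except ValueError:
                    break  # all four cells occupied
                self.slots[free] = p
                known.add(p)
```

This toy model reproduces the failure mode discussed in the text: after a light drops out and a new one appears, the cell contents no longer match the original hand assignment.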

Strategy of the solution
Section five discusses the different experiments carried out to decide on the final prototype. After these experiments, we concluded that the most suitable prototype to obtain the hand positions is a sensor array with a strong IR light at each end. Instead of this sensor array, the ultra or the standard Wii sensor bar could also be used. However, our system gives better results when each light is produced by five LEDs, which are enough to avoid signal confusion. Furthermore, using two IR lights on an array to obtain the hand positions, rather than just one, gives the LWJ system some extra advantages:
1. It gives better results when computing the hand location.
2. It makes it possible to estimate the hand rotation (roll, pitch and yaw angles) easily.
3. The relative distance between the hands can be computed, whereas with one IR light only the hand location can be computed.

The hand tracker control
In order to implement the LWJ system based on the Wii Remote platform, we identified three variables to control and track the hand motion: position, velocity and acceleration. The position is obtained from the IR sensor; its first derivative is estimated to obtain the hand velocity, and its second derivative to obtain the hand acceleration.
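The derivative chain just described can be sketched with simple finite differences. The sampling period assumes the camera's 100 Hz refresh rate; the function names are illustrative (the original implementation is a Simulink state machine):

```python
# Finite-difference sketch of the chain described above:
# position -> velocity (first derivative) -> acceleration (second derivative).
# DT assumes the Wii camera's 100 Hz refresh rate.

DT = 1.0 / 100.0  # seconds between IR samples

def velocity(p_prev, p_curr, dt=DT):
    """Backward difference of two consecutive (x, y) positions."""
    return ((p_curr[0] - p_prev[0]) / dt,
            (p_curr[1] - p_prev[1]) / dt)

def acceleration(p0, p1, p2, dt=DT):
    """Second-order central difference of three consecutive positions."""
    return ((p2[0] - 2 * p1[0] + p0[0]) / dt**2,
            (p2[1] - 2 * p1[1] + p0[1]) / dt**2)
```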

Computing the position of the hands
A Simulink state machine was built to estimate the hand positions from the IR sensor array by recursive estimation. The hand position was estimated as the center of each sensor array, between its two lights:

P(x, y) = (A(x, y) + B(x, y)) / 2    (1)

where P(x, y) is the center point between the two lights on the sensor array, and A(x, y) and B(x, y) are the positions of the two lights on the (x, y) axes.

C(x, y) = A(x, y) + ΔC(x, y)    (2)

In equation (2), A(x, y) is the previously computed center point and ΔC(x, y) is the IR signal vector from the previous center point A(x, y) to the current center point C(x, y).
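The two estimation steps above can be sketched in a few lines; the function names are illustrative, and the actual computation runs inside the Simulink state machine:

```python
def center_point(a, b):
    """Midpoint P(x, y) between the two array lights A and B (the text's
    center-point estimate for one sensor array)."""
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

def next_center(prev_center, delta):
    """Current center C(x, y) = previous center A(x, y) + displacement
    vector from the previous to the current center."""
    return (prev_center[0] + delta[0], prev_center[1] + delta[1])
```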

Avoiding IR light confusion
When the user is juggling, hand movement is fast and continuous; in this situation the possibility of confusing the light points is significantly reduced. Even so, to prevent any remaining confusion, the following check was implemented: knowing the two light positions (X1, Y1) and (X2, Y2), the system estimates the distance between these two IR lights. This distance, ±15%, is the constant reference against which the newest estimated distance is recursively compared, keeping or correcting the best light couple according to their distance.
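A minimal sketch of this distance check, assuming a simple exhaustive search over candidate pairs (the function names and the pairing strategy are illustrative; the paper only specifies the ±15% reference test):

```python
import math

def light_distance(p1, p2):
    """Euclidean distance between two (x, y) light positions."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def best_pair(points, ref_dist, tol=0.15):
    """Among all candidate light pairs, keep the one whose separation is
    closest to the reference distance, provided it lies within +/-15%
    of that reference; return None when no pair qualifies."""
    best, best_err = None, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = light_distance(points[i], points[j])
            err = abs(d - ref_dist)
            if err <= tol * ref_dist and (best_err is None or err < best_err):
                best, best_err = (points[i], points[j]), err
    return best
```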
The captured IR signals are scaled up to real human size, since the original signals obtained from the Wii Remote camera are smaller than human size (see Figure 7). This makes the VR environment more natural and comfortable for the user, which is very beneficial since the user is already making an extra effort focusing on his/her juggling movements during the interaction.

Specifications of the LWJ prototype
Knowing the pros and cons of the Wii camera used to detect the IR light, users of this system should follow specific protocols. These are:
1. The user has to be in front of the Wii Remote.
2. The distance between the camera and the user should be between three and four meters. (We estimated this distance (c) by triangulation (see Figure 3), since we know the camera position (B), its 22° angle of detection, and the user's initial position (A).)
3. The user should hold the sensor array and stay in front of the Wii camera.
It is also worth remarking that, in order to obtain good performance from the system, it is important to keep the batteries fully charged in both the Wii Remote controller and the sensor array.
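To illustrate the geometry behind the three-to-four-meter recommendation: given a detection angle, the usable lateral range at a given distance follows from elementary trigonometry. The sketch below assumes the ~22° figure is the half-angle of the camera's ~45° horizontal field of view; the function name is illustrative:

```python
import math

def lateral_range(distance_m, half_angle_deg=22.0):
    """Approximate usable horizontal width (meters) at a given distance
    from the Wii camera, treating 22 degrees as the half-angle of its
    roughly 45 degree horizontal field of view (an assumption)."""
    return 2.0 * distance_m * math.tan(math.radians(half_angle_deg))
```

At three meters this gives a lateral range of roughly 2.4 m, comfortably wider than a juggler's hand span, which is consistent with the recommended working distance.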

Implementation of the embedded sensor
In order to enable the PC to handle the IR data obtained by the Wii Remote camera from the sensor bar, a coded "C" S-function was designed. The function relies on the Wiiuse library [5]. The S-function sends the IR data to a state machine which recursively estimates the user's hand positions, acceleration and velocity. These values are then sent to another Stateflow chart which recursively estimates the tosses and catches: the tosses are triggered by hand acceleration and the catches by hand position. The use of the Embedded Coder and Real-Time Workshop helped to package everything into a single executable module. By sending the information over a UDP protocol to the virtual interface, the LWJ application allows the user to juggle (three-ball cascade) in a virtual reality interaction [11]. The Simulink setup provides a flexible framework for handling the recognition in real time. The embedded master receives the embedded sensor data through a common socket connection and allows the virtual camera to coherently detect movement in the designed scenario. The models used in the VR interface of this project were designed in 3D Studio Max. The virtual interface application was developed in XVR, which makes it easy to interoperate with third-party software [12].
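The toss/catch triggering described above can be sketched as a per-sample decision rule. The thresholds and function names are assumptions for illustration; the original logic lives in a Stateflow state machine, not in Python:

```python
# Sketch of the event triggers described above: tosses are triggered by
# hand acceleration, catches by hand position. Threshold values are
# hypothetical, not taken from the paper.

TOSS_ACCEL = 30.0   # upward hand acceleration threshold (m/s^2), assumed
CATCH_HEIGHT = 1.0  # hand height at which a falling ball is caught (m), assumed

def detect_event(hand_y, hand_accel_y, ball_falling):
    """Return 'toss', 'catch', or None for one sampling step."""
    if hand_accel_y > TOSS_ACCEL:
        return "toss"   # triggered by hand acceleration
    if ball_falling and hand_y >= CATCH_HEIGHT:
        return "catch"  # triggered by hand position
    return None
```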

Test Setup
A Wii Remote controller device was used as a camera to capture the IR signals from two IR sensor arrays held by the user (Figure 4). The IR camera sensor of the Wii Remote provided location data with a resolution of 1024 × 768 pixels, more than 4 bits of dot size (light intensity), a 100 Hz refresh rate, and a 45° horizontal field of view [4]. The Wii Remote camera communicated with the computer via a Bluetooth adapter, which provided a suitable operating range (from 1 m to 100 m, depending on the device). A 3D television with a screen of 200 × 150 cm was used to display the images of the 3D virtual interface. The process that creates 3D images is known as stereoscopy and creates the illusion of depth. The method used for our system was autostereoscopy: without the use of glasses, a 3D effect is achieved by the screen producing two images, one for each eye [6, 7, 8].

Discussion and experimental results
In order to find the best solution for the LWJ system using the Wii Remote platform, we tested different methods. At the beginning we tried to implement a sensor system based on acceleration, obtained from two hand-held Wii Remotes (Fig. 5, photo g). The acceleration was passed through a low-pass filter to remove the noise and integrated twice to obtain the position. Figure 6, in red, shows the position obtained with this method. An extra smoothing algorithm was used to clean the remaining noise of the position (see Figure 6, in blue). Nevertheless, even with all this processing, the position was still not clean. Therefore, we focused our attention on the Wii Remote's IR camera (Fig. 5, photo g). The Wii camera lost the IR signals from the sensor bar when we tried to implement the system in the traditional way (a static sensor bar in front of the user and two hand-held Wii Remotes). This signal loss was due to the small angle of detection of the LEDs. For this reason, the traditional method was discarded and a reversed method was used: three Wii Remotes, one as a camera and one in each hand of the user (i.e., two hand-held Wii Remotes). Each hand-held Wii Remote had a mounted sensor array with two LEDs, one placed at each of its ends (Fig. 5, photo b). The two hand-held Wii Remotes were used to obtain the hands' acceleration, which was integrated to output the velocity, while the hand positions were obtained from the IR sensor arrays. After some tests with this prototype, we noticed that the Wii camera constantly lost the IR signal: the IR light produced by one LED was not strong enough to be detected by the camera. Moreover, we realized that the two hand-held Wii Remotes could be omitted, since the position obtained from the IR sensor could be derived once to obtain the hand velocity, and twice to obtain the hand acceleration. This allowed us to develop a lighter algorithm with less computational power, since the LWJ system could now be based only on the IR sensor platform and discard the use of the accelerometers.

To address the problem of IR light signal loss, we tried increasing the amount of light by adding more LEDs to the array (see Fig. 5, photos h, i, j and k). Several prototypes were tested, made with different numbers, placements, kinds and colors of LEDs. Even though some of these prototypes worked, in most cases the light emitted by the LEDs was still not strong enough to be reliably separated by the Wii camera over a longer interaction. Here we noticed that blue LEDs are more easily detected by the Wii camera than white ones. Next, we experimented with two ping-pong balls as light diffusers for the LEDs (see Fig. 5, photos c, d and f). The balls were tested with different numbers, placements, kinds and colors of LEDs inside them. This idea was particularly attractive because the user could then hold the balls while juggling with virtual balls. However, after some testing we realized that the Wii camera did not detect the diffused light; it only detected the LED lights inside the ball. The same test was done with a Microsoft Kinect IR camera, which is able to detect even the human skeleton [10] (see Fig. 5, photo e), and it was able to detect the diffuser. Although this time we focused our experiments on the Wii Remote camera, we will consider the Kinect IR camera for future work, as it can also be easily adapted to our system. Finally, we tested two wireless Wii sensor bars (the ultra and the standard bar) (see Fig. 5, photo a). The ultra sensor bar can be, in some cases, a bit too large for juggling, but its IR light was better detected by the IR camera. In contrast, the standard sensor bar, which is closer to human hand size, did not provide an IR light that was as easily detected, but it could still be used for our system. The results of the user's hand tracking using the ultra and the standard bar are shown in Figure 8, where the hand tracking is represented by the pink and blue colors.

If we want to design our own sensor array to produce IR light, it has to fulfill the following two important requirements:
1. It must be comfortable for the user (easy to grasp, not heavy, without many uncomfortable cables, etc.).
2. It must be designed with enough power (batteries) to produce a strong IR light.
Our decision to use two commercial sensor bars rather than our own sensor array was made because the latter did not satisfy all the previously mentioned conditions. Our sensor array (see Fig. 5, photos b and h) was powered by two AA batteries, while a ping-pong ball used four watch batteries (see Fig. 5, photos c, d, e and f); these were not enough to produce the strong IR light we needed. However, commercially made sensor bars are available that need only four AA batteries; these are well designed and meet our original objective, since they remain accessible at a low price, ranging from $7 to $29 US.
Figure 5: Different tested prototypes for the implementation of the LWJ system.
The present system was tested by 8 persons, two of whom were juggling experts. All found the system interesting and usable. For the beginners, it was easier to start juggling using the present system and interface than in real juggling. The system was designed for the UM1 human movement science laboratory, where it will be used to perform different experiments related to the learning process and hand motion behaviour in juggling.

Conclusions
With the presented sensor system, it is possible to develop many interactive VR applications. The LWJ system was designed in the context of training human skills in VR. We developed a novel and cost-effective sensor system to teach novices how to juggle the three-ball cascade. The LWJ system is oriented toward improving the users' learning process and providing some upper-limb exercise.

Acknowledgment
The authors gratefully acknowledge the contributions of Benoit Lange, Vittorio Lippi and Megan O'Brien.

Figure 2 :
Figure 2: Detection of 4 points of light by the Wii Remote camera, taken from two sensor bars.

Figure 3 :
Figure 3: Top view of the IR prototype designed for the LWJ system.

Figure 4 :
Figure 4: The full system setup.

Figure 6 :
Figure 6: The graphic in red shows the hand tracking position obtained by integrating the acceleration twice, while the graphic in blue shows the same signal output by the smoothing algorithm.

Figure 7 :
Figure 7: The graphic in red shows the hand tracking obtained from the Wii IR camera, while the graphic in blue shows the hand tracking scaled to human size. In this figure we can also see that the IR signals are much cleaner than those obtained from the acceleration (see Figure 6).

Figure 8 :
Figure 8: The user's hand tracking (pink and blue colors) during 30 s, obtained from two sensor bars (the ultra (left side) and the standard bar (right side)). The signals in cyan, yellow, red and green are the IR light signals of the sensor bar used to estimate the hand positions.