Kinect AAL

Kinect-based ambient intelligence in AAL applications


The Microsoft Kinect motion sensor was released at the end of 2010. Its infrared projector and camera make it possible to sense the observed scene and track the movement of individuals in three dimensions. Kinect-based, real-time, three-dimensional tracking is a new approach in AAL (Ambient Assisted Living) applications; the main objectives of this work were therefore to survey the applicability of the Kinect sensor in AAL systems and to design an application that solves basic AAL tasks such as fall detection, gesture recognition, and ADL (Activities of Daily Living) recognition.

The development

In the first part of the work, I described the main aspects of designing AAL systems. Next, I examined the general capabilities of the sensor through preliminary experiments. Finally, I studied the available AAL systems and presented the preliminary design of a possible Kinect AAL system.

In the second part of my work, I presented the steps of system design and implementation. Gesture recognition builds on an existing system that uses the DTW (Dynamic Time Warping) algorithm. ADL activity recognition is based on a multilevel model: the first level is a manually defined virtual map (context) created from the real-world scene; the second level represents the gesture and movement information extracted from the sensor; and the third level represents the actual ADL activities. The system tracks the user and maps their movement onto the virtual map, which acts as a filter that narrows down the set of possible ADL activities. After determining the user's velocity, position, orientation, and gestures, the system can recognize the actual ADL activities.
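To illustrate the DTW matching step, the following is a minimal sketch of the classic dynamic-programming DTW distance. It is not the thesis's implementation: real gesture templates would be sequences of 3-D joint positions from the Kinect skeleton stream, while this sketch uses scalar samples for brevity.

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic Time Warping distance between two 1-D sequences.

    cost[i][j] holds the minimal cumulative cost of aligning
    seq_a[:i] with seq_b[:j]; the result is cost[n][m].
    """
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch seq_b
                                 cost[i][j - 1],      # stretch seq_a
                                 cost[i - 1][j - 1])  # step both
    return cost[n][m]

# A recorded template and a slightly time-shifted live sample align
# cheaply, while an unrelated movement yields a much larger distance.
template = [0.0, 0.2, 0.9, 1.0, 0.3, 0.0]
sample   = [0.0, 0.1, 0.3, 0.9, 1.0, 0.2, 0.0]
other    = [1.0, 1.0, 1.0, 1.0]
```

Because DTW warps the time axis, the same gesture performed faster or slower still matches its template, which is what makes it suitable for user-independent gesture recognition.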

Fall detection is a common AAL problem addressed by many AAL applications, but with widely varying capabilities and precision.
The fall detection system presented in this work measures the velocity, the height of the center of mass, and other temporal parameters, making it possible to recognize falls even in extreme conditions, such as falling behind objects (being almost completely occluded), falling onto objects such as chairs, and long-lasting motionlessness. The Kinect AAL system uses a fuzzy inference engine for data fusion as well as for generating ADL activities and emergency events.
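As a rough illustration of fuzzy data fusion for fall detection, the sketch below combines three cues named in the text: low center-of-mass height, fast downward movement, and prolonged motionlessness. The membership functions, thresholds, and the min-based (Mamdani-style) AND are illustrative assumptions, not the rule base of the thesis's inference engine.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fall_score(com_height_m, downward_speed_mps, still_seconds):
    """Toy fuzzy fusion of fall cues (illustrative parameters).

    Returns a score in [0, 1]; a value near 1 means all three fall
    cues are strongly present at once.
    """
    low_height = tri(com_height_m, -0.1, 0.2, 0.6)    # center of mass near floor
    fast_drop  = tri(downward_speed_mps, 0.5, 2.0, 4.0)
    long_still = min(still_seconds / 10.0, 1.0)       # saturates at 10 s
    # Mamdani-style AND: the weakest cue limits the overall score.
    return min(low_height, fast_drop, long_still)
```

A graded score rather than a hard threshold lets the system keep reporting a likely fall even when one cue is degraded, for example when the person is partially occluded and the height estimate is noisy.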