Little table, talk to me

Integrated radar technology makes it easier to care for the elderly

In our aging society, dementia is becoming ever more common. Severely ill patients often communicate only through facial expressions and gestures. A new remedy is a table that uses the latest radar technology to turn commercially available objects on its surface into interactive communication devices. It can be used in occupational therapy for people with dementia as well as in individual care situations. The idea came from the radar specialists at Fraunhofer IZM. As part of the DAYSI project, IZM is working with partners from the fields of nursing, furniture manufacturing and software development to create a table that can be controlled via gestures and that communicates in written, visual and audio form.

© Volker Mai | Fraunhofer IZM
Tables equipped with gesture control not only help in elderly care but can also create virtual worlds at trade fairs or in medical technology.

The first day in a care facility or even in an adapted home is often not easy for people with dementia and their relatives. Life is reduced to one room with a chair, a bed, a table and a sideboard, plus a small bathroom. Thanks to the ongoing DAYSI project, however, these pieces of furniture will soon be able to communicate with those affected by dementia and support them in their daily lives and their care.

The basis is a simple table equipped with radar and communication technology. A voice control system can be built into an artificial flower, for example, or a small projector into a picture frame. A database integrated invisibly into the table provides caregivers with pictures, songs and other information for more personalized interaction. Such a table can already be used during the intake interview: using speech recognition, the system identifies frequently recurring behavioral patterns, such as a person addressing their parent who is suffering from dementia, and responds with an adapted reaction. Support personnel will also be able to enable or disable sensitive information via a security query.

The Fraunhofer Institute for Reliability and Microintegration IZM is developing the entire surface for the project. This includes the hardware for the radar sensor, wireless communication and wireless charging of the individual components. By integrating these elements, the table becomes an interactive interface on which everyday objects serve as gesture-controlled means of communication. The individual components are the contact points to a mini-computer integrated into the tabletop. Additional external components, such as projectors, cameras and a voice recognition system, can be connected via the table's wireless interfaces and incorporated into everyday objects, such as vases or picture frames, as needed. All these items are connected by means of automatic pairing technology, so that they link directly to the computer when used. In addition to the various wireless communication interfaces and a charging device, gesture recognition will be integrated in cooperation with the software developer Creonic. Other partners in the project include Contag AG, which ensures the reliable construction of the hardware according to Fraunhofer IZM's packaging and interconnection technology; the Böhm Group, which installs the necessary hardware in the table; and Evangelische Altenhilfe Duisburg GmbH and Charité Universitätsmedizin Berlin, which test and use the table under real-life conditions.
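The pairing flow described above can be pictured roughly as follows: every everyday object registers its capabilities with the table's mini-computer as soon as it is used, so the system can later address whichever object offers a given function. This is only an illustrative sketch; all class and method names are assumptions, not the project's actual software.

```python
class TableHub:
    """Stands in for the mini-computer integrated into the tabletop."""

    def __init__(self):
        self.devices = {}

    def register(self, device_id, capabilities):
        # Called when an object connects to the table's wireless interface.
        self.devices[device_id] = set(capabilities)

    def devices_with(self, capability):
        # Find every connected object offering a function, e.g. "projection".
        return sorted(d for d, caps in self.devices.items() if capability in caps)


hub = TableHub()
hub.register("picture_frame_1", ["projection"])
hub.register("artificial_flower_1", ["voice_control", "audio"])
print(hub.devices_with("projection"))  # ['picture_frame_1']
```

The point of such a registry is that objects like vases or picture frames can be added or removed at any time without reconfiguring the table.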

At about the halfway point in the project's progress, the project partners agree: “The DAYSI interface will revolutionize the field of patient care. These systems can significantly simplify the care of those suffering from dementia. Other fields of application are also conceivable, such as in the gaming industry,” said project manager Christian Tschoban from Fraunhofer IZM, whose team came up with the idea to utilize such technology in tables.

Currently, the consortium is working on the hardware for a prototype of the interface. The demonstrator will then be installed in the table by the project partners, and its feasibility, user acceptance and possible communication errors will be tested by Charité Universitätsmedizin Berlin and Evangelische Altenhilfe Duisburg GmbH in various scenarios. For example, a projector is used to simulate family members or friends who can communicate and interact with those affected by dementia via the table. Gesture recognition is used to assess whether this has a calming effect; if it does not, the nursing service is informed. So that the system can also support therapeutic activation, gaming capabilities have been integrated that can be controlled through the coils embedded in the table, through the projector, and even through gesture recognition.
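The escalation logic in that scenario can be sketched in a few lines: gesture recognition yields some measure of agitation over the course of a projected interaction, and if the person has not calmed down by the end, the nursing service is notified. The scores, threshold and function below are purely hypothetical assumptions for illustration, not the project's actual algorithm.

```python
CALM_THRESHOLD = 0.4  # assumed agitation score below which the person counts as calm

def assess_session(agitation_scores, threshold=CALM_THRESHOLD):
    """Decide 'calm' vs. 'notify_nursing' from a session's gesture-derived scores.

    agitation_scores: chronological values in [0, 1] from gesture recognition.
    The session counts as successful if the final readings fall below the threshold.
    """
    if not agitation_scores:
        return "notify_nursing"  # no data: err on the side of a check-in
    recent = agitation_scores[-3:]  # look at how the session ended
    calmed = sum(recent) / len(recent) < threshold
    return "calm" if calmed else "notify_nursing"


print(assess_session([0.9, 0.7, 0.5, 0.3, 0.2]))  # calm
print(assess_session([0.6, 0.7, 0.8]))            # notify_nursing
```

Averaging only the last few readings reflects the idea that what matters is whether the interaction ultimately calmed the person, not how agitated they were at the start.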

Nursing experts are also investigating the extent to which it helps people with dementia to suddenly experience their previously passive environment more interactively. Meanwhile, Fraunhofer researchers are testing the functionalities of the integrated radar technology and the miniaturized system; until now, such systems have only existed on a much larger scale. Fraunhofer IZM has over 28 years of expertise in the miniaturization and reliable performance of even the smallest systems. The DAYSI project has been funded by the German Federal Ministry of Education and Research for three years and will run until the end of 2021. Until then, all project partners will be working on the interactive everyday assistance system for people suffering from dementia, which is to be used in care facilities and in home environments once the project ends.

In addition to elderly care, the Fraunhofer researchers, together with the Technical University of Berlin, are also seeking to collaborate with Garamantis. While the company's multi-touch tables have so far functioned capacitively like a cell phone, the interactive worlds will be operated using gesture control in the future. The first project proposals for developing 3D gesture control for multi-touch environments using novel multi-radar sensor architectures for virtual reality environments have been submitted.

These provide a basis for further fields of application such as autonomous driving, or as an aid for elderly people and people with impairments, such as the visually impaired.
