Speaking is a complicated matter. To turn thoughts into acoustic signals, humans coordinate more than 100 muscles, whose tiny contractions produce the different speech sounds. Learning to communicate through speech takes years of practice in childhood. Yet acoustic communication can also be a barrier: for people who cannot speak, in situations where loud environmental noise makes normal conversation impossible, or where speaking aloud in a quiet environment would disturb others. Tanja Schultz has overcome one such hurdle: she can transform muscle signals into speech.
"We transform the movements of the facial muscles into text by reading individual speech sounds from the movements of the articulation muscles and assembling them into words," explains computer science professor Tanja Schultz. An algorithm developed by her team, together with detailed knowledge of anatomical and neuronal relationships, makes this 'silent speech communication' possible. The silently spoken text is output either on the recipient's computer screen or, via a synthetic voice, over the telephone. "This is very interesting for security specialists, especially in quiet environments. The technology can also be used for confidential conversations that must not be overheard," says Schultz, who has headed the Cognitive Systems Lab at the Institute for Anthropomatics at KIT since 2007.
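The pipeline Schultz describes (muscle movements → individual speech sounds → words) can be sketched in miniature. The following is purely illustrative, assuming invented phoneme templates and a toy nearest-template classifier; it is not the lab's actual algorithm, and all names and data are made up for the example.

```python
import numpy as np

# Hypothetical per-phoneme "template" feature vectors (e.g. mean muscle
# activation per EMG channel), standing in for a trained model.
PHONEME_TEMPLATES = {
    "h": np.array([0.9, 0.1, 0.2]),
    "e": np.array([0.2, 0.8, 0.1]),
    "l": np.array([0.1, 0.3, 0.9]),
    "o": np.array([0.5, 0.5, 0.5]),
}

def extract_features(frame):
    """Reduce one window of raw muscle-signal samples to a feature
    vector (here: mean absolute activation per channel)."""
    return np.mean(np.abs(frame), axis=0)

def classify_phoneme(features):
    """Nearest-template classification: pick the phoneme whose
    template lies closest to the observed features."""
    return min(PHONEME_TEMPLATES,
               key=lambda p: np.linalg.norm(PHONEME_TEMPLATES[p] - features))

def decode(frames):
    """Map a sequence of signal frames to a string of speech sounds."""
    return "".join(classify_phoneme(extract_features(f)) for f in frames)

# Synthetic frames, each noisily matching one template, spelling "hello".
rng = np.random.default_rng(0)
frames = [PHONEME_TEMPLATES[p] + rng.normal(0, 0.05, (10, 3))
          for p in "hello"]
print(decode(frames))  # recovers the phoneme string from the frames
```

A real silent-speech system would of course use many electrode channels, learned statistical models, and a language model to form words, but the frame → features → phoneme → text structure is the same.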
"I want to teach computers to understand people."
While multilingual muscle-to-speech recognition is already in prototype use, the conversion of other biosignals is still in its infancy. Schultz says: "Ultimately, almost all human movements can be traced back to neuronal brain signals. We are trying to measure and evaluate these signals directly in order to build useful applications." Such a brain-computer interface holds promise for a variety of medical conditions: direct translation of thoughts into speech could enable people with locked-in syndrome to communicate at all. Another device from the computer science team, the workload indicator, continuously outputs the brain's mental workload, allowing it to detect and signal user overload and stress.
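The workload indicator's behavior (a continuous workload value plus an overload warning) can be sketched as follows. This is a minimal illustration assuming a precomputed stream of workload estimates between 0 and 1; the smoothing factor and threshold are invented, not the KIT team's actual parameters.

```python
def workload_indicator(samples, alpha=0.5, overload_threshold=0.75):
    """Exponentially smooth a stream of workload estimates (0..1)
    and report (smoothed_value, overload?) for each sample.

    alpha and overload_threshold are illustrative assumptions."""
    smoothed = samples[0]
    out = []
    for x in samples:
        # Exponential moving average damps momentary spikes.
        smoothed = alpha * x + (1 - alpha) * smoothed
        out.append((round(smoothed, 3), smoothed > overload_threshold))
    return out

# Hypothetical workload estimates rising under stress, then recovering.
readings = [0.2, 0.3, 0.6, 0.9, 0.95, 0.9, 0.4]
for value, overloaded in workload_indicator(readings):
    print(value, "OVERLOAD" if overloaded else "ok")
```

In a real system the input would come from ongoing analysis of brain signals rather than a fixed list, but the principle of continuously tracking a smoothed load value against a threshold is the same.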
Tanja Schultz is entering a futuristic field of research: transferring emotions and moods into automated applications. The household robot of the future should not start vacuuming just when its owner comes home stressed. This requires a sense for human emotions and the ability to adapt activities accordingly. "The goal is to implant empathy into machines and thus enable them to react appropriately to us humans," says Schultz. "Until now, we humans have had to subordinate our wishes and needs to the possibilities of technology. My vision is that technology adapts to us humans by means of empathic technologies."