
Body Reaction as Password

08/22/2019

Forgot your password? Everyone probably knows that feeling. Authenticating yourself by fingerprint or iris scan is a practical alternative, but biometric procedures also have drawbacks: body characteristics cannot be changed, and they are on public display. The Software Technology Institute paluno of the UDE is researching a new approach that combines biometrics and password protection. The project is funded by the German Research Foundation (DFG) for three years.

The researchers in the Human-Computer Interaction group are focusing on a new class of biometric authentication methods: Functional Biometrics. The system a person wants to log in to generates an input signal and transmits it to the body. This signal can be, for example, an auditory, electrical, or tactile stimulus that triggers a user-specific reaction (e.g., the reflection of the audio signal or a muscle reaction). The system measures the response signal and compares it to a previously stored response.
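The comparison step can be illustrated with a minimal sketch. This is not the project's actual algorithm; it merely assumes that the measured body response and the response stored at enrollment are both sampled signals, and that a simple similarity score with a threshold decides acceptance (the function names, sample values, and threshold are all illustrative assumptions):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two response signals (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def verify(measured_response, enrolled_response, threshold=0.95):
    """Accept the user if the measured response matches the stored one.

    threshold is a hypothetical tuning parameter trading off false
    accepts against false rejects.
    """
    return cosine_similarity(measured_response, enrolled_response) >= threshold

# Illustrative data: the enrolled user's response to the stimulus,
# a noisy re-measurement of the same user, and another person's response.
enrolled   = [0.0, 0.8, 1.0, 0.6, 0.2]
same_user  = [0.05, 0.78, 0.97, 0.62, 0.18]  # small measurement noise
other_user = [1.0, 0.1, 0.3, 0.9, 0.5]       # different body reaction

print(verify(same_user, enrolled))    # True: response matches enrollment
print(verify(other_user, enrolled))   # False: response does not match
```

In a real system, the raw sensor signal would first be filtered and reduced to stable features before any such comparison, and the decision model would be trained per user.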

In effect, a biometric password is generated with the user's body. Professor Dr. Stefan Schneegaß on the advantage of this technology: "Like a fingerprint, the body's reaction differs from person to person. However, it does not leave traces that are accessible to everyone."

The aim of the project is to investigate the potential of functional biometric approaches and to test which sensors and actuators are fundamentally suitable for authentication on computers and smartphones. In addition, the scientists want to develop models and algorithms for research demonstrators that allow test subjects to authenticate themselves automatically.

For more information, please contact Prof. Dr. Stefan Schneegaß.
