Problematic face and voice recognition
Automatic recognition of voices, speech and faces is increasingly finding its way into everyday life. This does not bring only advantages. Researchers in Switzerland want to rein in these applications.
Biometric data is highly sensitive information because it can uniquely identify people. For facial, voice and speech recognition to be used beneficially, clear legal frameworks are needed. In a study, researchers at the Foundation for Technology Assessment (TA-Swiss) make a series of recommendations aimed at fostering trustworthy use of biometric technologies.
Technologies such as voice, speech and facial recognition have made enormous progress in recent years. Facial recognition, for example, can increase security in public spaces because it facilitates the search for missing or fugitive persons. Constant real-time analysis of the data, however, can also impair personal freedom and lead to surveillance.
Promote social debate
The research group believes that all facial recognition systems should be evaluated regularly by independent experts and that, where possible, the corresponding reports should be made available to the general public. The personnel who use these technologies should also be trained regularly so that they are aware of the measures needed to safeguard the rights of the population. Just as important is an ongoing social debate on the opportunities, risks and ethical challenges of facial recognition by the police and on its democratic legitimacy.
Data processing only on the device
Among its most important recommendations, TA-Swiss counts the demand for more transparency about the purposes and processing of personal data. Manufacturers should also obtain explicit and informed consent from users for every program function and for all subsequent changes, and users should be able to delete their own data easily. In addition, data should be processed directly on the device rather than in the manufacturer's cloud.
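What on-device processing can look like is illustrated by the following minimal sketch in Python. The function names and the toy embedding are hypothetical stand-ins (not part of the study or any real product); the point is that the raw image and the enrolment template stay on the device and only a yes/no decision is exposed:

```python
import numpy as np

def compute_face_embedding(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a locally stored embedding model.
    The raw image is reduced to a fixed-length vector on the device itself."""
    features = image.astype(np.float32).ravel()[:128]
    return features / (np.linalg.norm(features) + 1e-9)

def authenticate_on_device(image: np.ndarray, enrolled_template: np.ndarray,
                           threshold: float = 0.8) -> bool:
    """Compare the probe embedding with the enrolment template stored on the
    device and return only a yes/no decision; neither the image nor the
    template ever needs to be uploaded to a cloud service."""
    probe = compute_face_embedding(image)
    similarity = float(np.dot(probe, enrolled_template))
    return similarity >= threshold

# Example: enrol a reference image once, then verify a capture locally.
reference = np.random.rand(64, 64)
template = compute_face_embedding(reference)
print(authenticate_on_device(reference, template))  # True for the same image
```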
In this respect, the study underscores that biometric data is personal data requiring special protection. An analysis of biometric data can, for example, also reveal highly personal information such as a person's current state of health, and more and more such data is being accumulated about individuals. With artificial intelligence, there is a risk that certain groups of people will be discriminated against on the basis of their gender, skin color or age. In addition, so-called false positives can never be ruled out entirely in the recognition process, because the analyses are based on probabilities.
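The last point can be made concrete with a small numerical sketch: recognition systems compare a similarity score against a threshold, and because the score distributions of matching and non-matching pairs overlap, some non-matches inevitably land above the threshold. The distributions below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented similarity scores: "genuine" pairs show the same person,
# "impostor" pairs show different people. Real systems produce overlapping
# distributions like these, just with different shapes.
genuine_scores = rng.normal(loc=0.75, scale=0.10, size=10_000)
impostor_scores = rng.normal(loc=0.45, scale=0.10, size=10_000)

threshold = 0.60  # the system declares a "match" above this score
false_positives = np.mean(impostor_scores >= threshold)  # wrongly matched
false_negatives = np.mean(genuine_scores < threshold)    # wrongly rejected

print(f"false positive rate: {false_positives:.1%}")
print(f"false negative rate: {false_negatives:.1%}")
# Raising the threshold lowers false positives but raises false negatives;
# no threshold eliminates both, which is why false alarms cannot be ruled out.
```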
The complete TA-Swiss study analyzes a total of eight application examples, including authentication by voice in telephone banking, violence prevention in sports stadiums, early detection of physical and mental illness, emotion recognition, attention analysis in schools, and everyman identification.
Source: TA-Swiss, editorial office