SECURITY

Affective computing meets EU data protection law

Credit: Public Domain

Affective computing (AC), sometimes called "emotional AI," makes it possible to process emotional data automatically. But is EU data protection law fit for purpose when applied to such AC approaches?


In his article, published in the journal International Data Privacy Law (Oxford University Press), Andreas Häuselmann, external Ph.D. candidate at eLaw—Center for Law and Digital Technologies, examines the automated processing of emotional data through AC approaches in the context of EU data protection law.

Although AC approaches are already used in practice, the topic remains highly underrepresented in privacy and data protection law research. Because emotional data is personal, sensitive, and intimate by nature, the question arises how this new category of data fits into the framework of EU data protection law. The researcher argues that this framework is not necessarily fit for purpose when applied to AC approaches. For example, he concludes that emotional data is currently not considered sensitive from a legal point of view, despite its sensitive nature. Furthermore, Häuselmann highlights that AC approaches are in tension with the transparency and accuracy principles and raise concerns when applied in the context of important decisions (e.g., recruitment, border controls).

The article forms part of Häuselmann's external Ph.D. research at eLaw and aims to raise awareness and stimulate further research on the topic, not only in the domain of privacy and data protection law, but also regarding the broader social consequences of AC technology, including ethical aspects and impacts on consumer law.