Data Protection Impact Assessment Case Study for a Research Project Using Artificial Intelligence on Patient Data

Gizem Gültekin Várkonyi, Anton Gradišek

Abstract


Advances in artificial intelligence, smart sensors, data mining, and other fields of ICT have resulted in a plethora of research projects aimed at harnessing these technologies, for example to generate new knowledge about diseases, to develop systems for better management of chronic diseases, and to assist the elderly with independent living. While the algorithms themselves can be developed using anonymized or synthetic data, a pilot study is often a key component of a research project, and such studies unavoidably involve actual users and their personal data. Although one of the derogations stipulated in Article 89 of the GDPR concerns data processed for scientific purposes, in a broader interpretation the GDPR still applies to such processing. Since the computer scientists and engineers working on research projects may not always be fully familiar with all the details of the GDPR, close collaboration with a lawyer specialized in European data protection legislation is highly beneficial for the success of a project. In this paper, we consider a hypothetical research project in which an engineer deals with sensitive personal data and a lawyer conducts a Data Protection Impact Assessment (DPIA) to ensure the legality and quality of the research.

Povzetek (Slovenian abstract): The paper discusses a data protection impact assessment for a hypothetical research project in which artificial intelligence methods are used to analyse users' medical data.








DOI: https://doi.org/10.31449/inf.v44i4.3253

This work is licensed under a Creative Commons Attribution 3.0 License.