
Data Protection and Biometrics: Scanned by International Aid Organisations

20th October 2021 by Karl Steinacker and Katja Lindskov Jacobsen

NGOs and UN agencies have collected sensitive personal data of millions of people in the global South. With due prudence?

Aid and charitable organisations (hereinafter referred to as NGOs) have gone digital in recent years. One aspect of this involves collecting and processing large amounts of data about the very people they assist and interact with. Such data is often very personal and may include health and educational data, family relations, and religious and political beliefs. These data sets identify a person by name, address, picture, and increasingly also by biometric features.

“Trust is the fuel of our societies,” says the contemporary historian and philosopher Yuval Harari. This remark raises an important question, namely how the digital data practices of NGOs affect this vital issue of trust. Can their policies and practices stand public scrutiny, and are they worthy of our trust? Can we be sure that personal and often sensitive data has been obtained voluntarily from each of the data subjects NGOs gather such data from? Is it right to assume that these people have a clear understanding of the purpose, benefits, and risks related to the sensitive data they allow NGOs to collect, store and process?

Data Protection and Sovereignty in the Global South

In general, it can be said that NGOs fall under the jurisdiction of the country where they operate. That might reassure the public in the countries of the North, where data protection laws have been passed and are being enforced. However, on a global scale, as more and more governments enact data protection laws, we are in the paradoxical situation that NGOs often face new obligations to disclose personal data in support of national sovereignty and security interests. Countries like Turkey and Rwanda have modelled their data protection laws on the European GDPR and demand that personal data of their citizens and residents is kept exclusively on servers on their territory. But NGO data is sometimes also shared with donors to obtain funding or as part of specific data-sharing agreements. And each contractual arrangement with commercial service providers, be it for outsourced IT services such as storage and data analysis, or for simple bank and mobile payment transfers, discloses the identity of, and information about, aid recipients.

Those who manage the personal data, the senior NGO managers and data protection officers, are aware of the issues at stake. They will mention the noble mandate and the non-profit character of their organisation. Some will point out that they have internal data protection policies and regulations. The international NGOs in particular, that is, the Western institutions led and funded from the global North, will claim that they follow the GDPR, the current international gold standard in data protection, even in countries outside of Europe.

It is certainly true that the data of their institutional and individual donors, including those recruited on the high street to donate ten dollars or euros or so every month, is kept and processed in a GDPR-compliant manner. The NGOs know that they have to gain and maintain the trust of that important segment of the public, since their financial and reputational survival is at stake.

Refugees in Chad have to reveal biometric data to obtain an ID card. – CC-BY-NC-ND 2.0 EU Civil Protection and Humanitarian Aid

Open questions regarding digital rights

But what about the people at the margins, those data subjects in the global South who have surrendered their personal data to NGOs because they need protection and assistance in the face of war, natural disaster, or poverty? In such cases, specific questions need to be asked and addressed, such as:

  • Are these data subjects informed in a transparent and understandable manner about (real and possible) data use as far as mandatory data fields are concerned?
  • Have people, where informed consent is asked for, been educated and genuinely informed in relation to the specific context, so that they can make that decision meaningfully? What alternatives are offered to them should they decide, for example, not to register their biometric data with an NGO?
  • Are the consent rules and practices easy and possible to fully implement? Can consent be revoked?
  • Do individuals have access to their data? Can they query that data and demand corrections, and which recourse options are open to them? Is deletion of data even possible, and is there a right to be forgotten?
  • Are data subjects always informed about data breaches and data sharing arrangements, especially with state authorities?

Unfortunately, answers to these and other questions are in most cases likely to fall short of even minimum standards of data protection. Organisational culture and practices in many NGOs often fail to put the necessary focus on and resources into conducting impact and risk assessments in a particular context before decisions are made to collect, store and process personal data. Financial audits are common while data audits remain an exception. Operations security processes and strategies for protecting critical and sensitive data are rare within organisations and absent at an inter-organisational level. Self-policing is the norm, rather than submitting to independent oversight.

Co-operation with Palantir causes a stir

Special mention must be made of the large specialised UN agencies, such as IOM, WFP and UNHCR. These organisations have, like no other non-governmental bureaucracy, amassed personal data files on tens of millions of people around the globe. Their data subjects surrender their biometric imprints (commonly a fingerprint or an iris scan) for a bar of soap, a sack of rice or a cash transfer, but also for a residence permit or the opportunity to be resettled in another country. Unfortunately, their policies for safeguarding such sensitive biometric data, and importantly the implementation of data protection, remain largely opaque. Take the example of the data-sharing agreement that UNHCR concluded with the government of Bangladesh. Human Rights Watch alleges that personal data, including biometrics, collected jointly by UNHCR and the Bangladesh authorities from Rohingya refugees who have fled genocide in Myanmar, has been transmitted to the government of Myanmar. Or take WFP, which announced that it had signed a contract with Palantir, a company specialised in data analytics and part of the data-military-industrial complex in the United States. Palantir was recently criticised by Amnesty International for its failure to conduct human rights due diligence around its contracts. These and other data-related treaties, contracts, and practices of UN agencies escape public scrutiny and cannot be challenged in any national court because of the agencies’ diplomatic immunity. Yet various UN agencies routinely roll out large-scale data collection programmes in many countries whilst failing to ensure appropriate oversight bodies and recourse procedures for their data subjects.

When, two months ago, western military and civil organisations evacuated their personnel from Afghanistan, large amounts of sensitive personal data, including biometric data, were left behind. Only time will tell whether that data has been adequately protected and cannot be abused. Closer to home, the German Red Cross received the 2018 edition of the Big Brother Award from a civil society organisation for its digital system of asylum shelter management. The Red Cross software instituted humanitarian surveillance and total control of asylum seekers and refugees: movement tracking to and within the shelter, and detailed recording of medical checks, food consumption, relationships, religious and ethnic affiliations and much more. The question is not whether similar systems exist in other settings and places elsewhere in the world, but rather how many of them exist.

ICRC and Oxfam work on biometric policies

Looking ahead, we see the contours of different trends. Some civil society actors recognise the emergence of unforeseen risks relating to the use of personal data that has been collected in many different contexts. They now increasingly advocate for intensified discussion of approaches to the responsible use of personal data, in particular biometric data. Indeed, the ICRC and Oxfam have defined red lines for the use of biometrics and discuss their data policies and practices with the people they protect and assist.

On the other hand, however, the trend towards surveillance and biometric overkill continues. The most worrying example is the use of DNA by the US immigration enforcement agency ICE. Here a precedent is set whereby biological features are used not only to uniquely identify an individual but also to establish the biological relationship between several persons. Meanwhile, UNICEF and the Gates Foundation are looking into ways of using biometrics for infant and child healthcare in Africa. Newly established start-ups propose introducing biometrics for schoolchildren in Africa so as to monitor their school attendance.

But there is resistance too. Political organisations working among Rohingya refugees living in camps in Bangladesh are campaigning among their fellow refugees not to hand over their personal data, notably biometrics, to the United Nations and the Bangladesh authorities, out of fear that such sensitive personal data could end up in the hands of the military dictatorship in Myanmar. During the Covid-19 pandemic, the UN Agency for Palestinian Refugees in the Near East (UNRWA) failed to introduce biometric ration and cash cards: the Palestinian refugees would have none of it and simply refused to cooperate.

Biometric Surveillance in a Civil War

And so it is not surprising that personal data can also become a weapon. A case in point is Yemen, where two competing governments are fighting a civil war for control of the country. Since 2019, WFP has rejected the lists of beneficiaries put together by the government based in the capital Sana’a, demanding that it be put in charge of biometrically registering those eligible to receive food aid. The competing government, supported by the US and Saudi Arabia, allowed WFP to biometrically register food aid recipients in the areas under its control. As a result, the bulk of food aid goes to the areas where WFP was able to register the population.

Accordingly, widening current debates and including a broad range of stakeholders will be crucial as we move forward. When it comes to sensitive personal data and trust, charitable intentions, progressive mandates, and a non-profit posture are noble starting points but unfortunately count for little. What happens to the personal data of the so-called data subjects is what matters. For years NGOs have been discussing concepts like do no harm and accountability. Many claim that they are accountable to the people they assist, protect, and seek to empower, rather than to their donors. Now the opportunity has come to implement and showcase meaningful accountability: give those who are marginalised and in need of protection and assistance the digital rights they are entitled to.

This article was first published in German by Netzpolitik.org on 8 October 2021.

Karl Steinacker

Digital Advisor

International Civil Society Centre

Karl Steinacker is currently the Digital Advisor of the International Civil Society Centre. He studied political science at the Free University of Berlin and international law at Cambridge University. He then spent three decades working for the United Nations (UNDP, UNRWA, DPKO, UNHCR) in the fields of development, peacekeeping and refugee protection. At the UN Refugee Agency, he held positions in Africa and at its Headquarters and was responsible for Registration, Statistics, and Data and Identity Management as well as for Camp Coordination/Camp Management.

Katja Lindskov Jacobsen

Senior Researcher

Centre for Military Studies at the Institute of Political Science at the University of Copenhagen

Katja Lindskov Jacobsen holds a PhD in International Relations from Lancaster University and works as a Senior Researcher at the Centre for Military Studies at the Institute of Political Science at the University of Copenhagen. The focus of her research is on security policy and interventions.


Join Making Voices Heard and Count at the UN’s High Level Political Forum

6th July 2020 by Peter Koblowsky

Event Title: Community-driven data as transformative means for accelerated action and SDG delivery
Call link: https://bit.ly/2VFzz6s
Wisembly: https://app.wisembly.com/hlpf2020#stream
Date: 9 July 2020
Time: 8.00-10.00am (EDT)
Facilitation: Wolfgang Jamann, International Civil Society Centre

Download the Agenda

Note:
We will record this event! Room capacity is limited to 300! We advise our audience to arrive at the virtual room before the official starting time. Moderators will be online in the room as of 7.45am.

Peter Koblowsky

Senior Partnership Manager - Leave No One Behind

International Civil Society Centre

Peter joined the Centre in January 2013, back then as a trainee. He completed the traineeship in the advocacy & campaigning office of World Vision Germany. Peter now coordinates the Leave No One Behind project and contributes to the development and implementation of various other strategic formats. Before joining the Centre, Peter worked for various organisations and think tanks in the development sector as an expert in multi-stakeholder processes. He studied at the University of Bonn and graduated with an MA in Political Science with a focus on multi-actor advocacy for climate policy.