Posts with the tag "Digital"

Navigating the Digital Landscape: Key Insights from the International Civic Forum 2024

12th December 2024 by Eva Gondor and Rachel Wilkinson
Every year, we bring together leaders from civil society and other sectors to discuss key civic space issues, build connections, and identify opportunities for collaboration at the International Civic Forum (ICF). This year we focused on how digital technology is transforming civic space and explored what digital advancement means for the operating space of civil society.

ICF 2024: a participant taking a photo of a speaker

Collaborating with our partner, the Civic Tech Innovation Network (CTIN), with whom we run our Digital Dialogues event series, we teamed up for their annual conference, the Civic Tech Innovation Forum (CTIF). At CTIF we hosted a masterclass on how civil society can apply strategic foresight to address misinformation and disinformation. The ICF took place immediately after the CTIF, fostering connections between the two events and providing a unique platform to share knowledge and insights.

During the ICF, we examined how digital trends – ranging from digital authoritarianism and disinformation to the growing role of artificial intelligence (AI) – are reshaping civic space. Participants explored the challenges and opportunities that technology presents, shared experiences and best practices, and engaged in a foresight exercise to prepare for possible civic space futures. 

ICF 2024: panel presentation

Key Learnings and Reflections from the ICF 2024

1. Impact of digital technology on (shrinking) civic space

Civic space is increasingly under threat, with the rise of government surveillance, internet shutdowns, and restrictions on digital freedoms. Digital technology plays a dual role – it can contribute to the erosion of civic freedoms but also provide new tools for civil society to fight back. 

For example, technology can help expose human rights violations, amplify voices, and facilitate global advocacy and coalition-building. However, it also opens the door for new forms of digital repression, such as targeted disinformation campaigns and government surveillance. As we move forward, civil society organisations must engage with technology critically and actively manage its risks, while recognising its potential to drive positive change.

ICF 2024: group discussion

2. The need to prioritise and upskill on cybersecurity

Civil society organisations are now among the most targeted groups for cyberattacks. Whether large or small, they must prioritise cybersecurity to protect sensitive data and safeguard digital infrastructures. They need to stay informed about emerging digital threats and continually upskill staff to address them. This is not just a technical issue – it’s a fundamental aspect of protecting the work we do and the communities we serve. 

3. Bridging the digital divide and enabling accessibility

While digital technology offers incredible opportunities, we cannot overlook the global digital divide. Over 3 billion people remain offline, and many others face restrictions due to financial barriers or internet shutdowns. For civil society organisations looking to use technology to support and expand civic space, it is essential to ensure that digital solutions are inclusive and accessible. This includes offering offline versions of apps, ensuring data protection, and providing access to free tools like VPNs to overcome censorship. 

ICF 2024: Robert Dersley collecting content for the graphic illustration

4. Need for collective learning and collaboration

A central theme throughout the ICF was the importance of collaboration. Technology is constantly evolving, and staying ahead of emerging trends requires continuous learning and knowledge-sharing. This way, we can build on each other’s efforts instead of reinventing the wheel. From managing digital risks to understanding the impacts of AI, civil society organisations must take collective responsibility for building a more tech-savvy and resilient sector.

5. Artificial intelligence (AI) as a double-edged sword

AI has significant potential to support human rights and democracy, but it also comes with considerable risks. From privacy violations to the amplification of disinformation, the ICF underscored the need for vigilance. Civil society organisations must explore AI’s potential, for example for fact-checking and verifying information. At the same time, they must actively mitigate its risks, such as biased algorithms, and ensure that its deployment promotes fairness, inclusivity, and respect for human rights.

ICF 2024: presentation discussion

6. Anticipating possible futures for civic space

Strategic foresight helps civil society organisations anticipate and prepare for a range of future scenarios, allowing them to navigate potential changes in civic space more effectively. By exploring scenarios for 2030 – from civic space collapse to its transformation – participants identified first steps that civil society can take to respond to emerging digital trends. This proactive approach enables civil society organisations to strengthen resilience and adaptability, ensuring they are ready for multiple possible futures and can safeguard civic freedoms in an ever-changing environment. 


Final thoughts

The ICF emphasised that civil society has a responsibility to engage with digital technology and to critically assess its impact, both positive and negative, on civic space. The digital tools we engage with should be used to amplify voices, protect human rights, and strengthen civic space. By understanding the digital landscape, advocating for the responsible use of AI, strengthening cybersecurity, and closing the digital divide, civil society organisations can ensure technology remains a force for positive change rather than one that undermines civic space. Through ongoing exchange, upskilling, collaboration, and strategic foresight, civil society can adapt to the digital age and continue defending and expanding civic space for a fairer and more resilient future.


Participants at the International Civic Forum 2024

Photos by Zibusiso Manzini-Moyo


International Civic Forum 2024 Graphic Recording

Day 1
Illustration by Robert Dersley, 2024


Day 2
Illustration by Robert Dersley, 2024

Robert Dersley in action, graphic recording

Eva Gondor

Senior Project Manager

International Civil Society Centre

Eva leads on the Centre's civic space work - the Solidarity Action Network (SANE) aimed at strengthening resilience of and solidarity among civil society actors, and the International Civic Forum (ICF), our annual civic space platform to network and identify opportunities for collaboration. Prior to joining the Centre she worked at the Robert Bosch Stiftung (Foundation) in Stuttgart where she managed the foundation’s projects focusing on civil society and governance in Turkey, the Western Balkans, and North Africa.

Rachel Wilkinson

Programme Manager – Futures and Innovation

International Civil Society Centre

Rachel leads the Futures and Innovation programme at the Centre. She is responsible for managing the portfolio of projects and events as well as leading and developing the Scanning the Horizon strategic peer learning platform. Rachel has more than 15 years of experience working in the third sector, on a national and international level, working for various ICSOs in international development and human rights in both London and Berlin.


From LogFrames to Logarithms – A Travel Log

5th June 2023 by Karl Steinacker and Michael Kubach

Karl Steinacker and Michael Kubach are digital experts based in Berlin, Germany. This article is a reflection on their recent teaching assignment, “Digital Transformation and Social Justice”, at the Alice-Salomon University in Berlin, Germany.

During the 1990s, the world of public administrations and civil society organisations (CSOs) was awash with Change Management Teams. In particular, humanitarian and development organisations were tasked to professionalise their work, to overcome charity and to create impact. The backbone of the process was the introduction of the Logical Framework, or LogFrame. This tool was originally developed for the US military and then adapted by NASA for its space programmes. After it was adopted by USAID and GTZ, an entire nomenclature for the humanitarian and development sectors was built around it: Results Based Management.

Even one of the authors proudly joined a Change Management Team and worked, as the buzzwords of the time had it, on Modernisation and Innovation. At the top of the list of taboo concepts stood the term Welfare. In fact, the task was to implement neo-liberal policies at the level of the people living in extreme poverty. In this context, the idea of unconditional support to ensure that members of a society can meet basic human needs, such as food and shelter, was simply rejected. The measure of support was not what a society could possibly afford but a revised paradigm of social justice aiming to reward effort and measure effectiveness. Management manuals were updated with an array of re-defined terminology, stretching from authority to responsibility and, most importantly, accountability, to ensure everybody's compliance with these new policies.

Our journey into this realm of non-profit business administration would last for many years. The first task was to develop indicators: performance and impact indicators, at times SMART, later KPIs, but above all measurable. Thus, datafication started on Excel sheets. For organisations mandated to protect and assist individuals, group statistics were no longer deemed sufficiently precise. Registration systems had to change too: individual registration was to replace all systems that had been developed and used to provide collective support to families or other groups.

Think tanks and govtech consulting firms were always eager to help, for a fee: the digitalisation and datafication of the social sphere would replace blanket statistical categorisations and allow for a more precise documentation of the individual. This idea of fine-tuning corresponds to the sense of justice of many people working in the aid sector: a detailed analysis is much fairer than stereotypical evaluations[1].

While the early days of social statistics were characterized by the attempt to define meaningful categories, the growth of personalized data means that individuals are no longer counted as part of one big class, but increasingly as social-statistical singularities. The use of metrics and algorithms allows previously overlooked, hidden or willfully ignored differences to be identified, and hence utilized.[2]

Twenty-five years on, we find ourselves in a university classroom. Students, soon to enter the job market in the social sector, are confronted with our professional experience as we discuss digital transformation and social justice[3]. But the learning curve is steep for both students and lecturers. For the latter, who have stepped back from limited areas of responsibility in a single set of organisations, the bigger picture emerges: yes, individual registration is indeed an empowerment tool, providing an identity to somebody who is otherwise invisible, excluded, unbanked, and unable to participate in society and the (digital) economy. However, it also allows for humanitarian[4] and social surveillance – not as an abstract possibility but as an everyday reality.

Today, authorities all over the world are experimenting with predictive algorithms. That sounds technical and innocent, but as we dive deeper into the issue, we realise that the real meaning is rather specific: fraud detection systems in social welfare payment systems[5]. In the meantime, the hitherto banned terminology has made its comeback: welfare and social safety nets have, for some years now, been en vogue again. But in the centuries-old Western tradition, welfare recipients must be monitored and, if necessary, sanctioned, while those who work and contribute must be assured that there is no waste. So it comes as no surprise that even today's algorithms focus on the prime suspect: the individual fraudster, the undeserving poor.

Fraud detection systems promise that the taxpayer will no longer fall victim to fraud and that efficiency gains can be re-directed to serve more people. The true extent of welfare fraud is regularly exaggerated[6], while the costs of such systems are routinely underestimated. A comparison of the estimated losses and the required investments rarely takes place. It is the principle of detecting and punishing fraudsters that prevails. Other issues do not rank high either, for example how to distinguish between honest mistakes and deliberate fraud. And the more time caseworkers spend entering and analysing data in front of a computer screen, the less time and inclination they have to talk to real people and to understand the context of their lives at the margins of society.

Thus, it can be said that hundreds of thousands of people are routinely being scored. Take Denmark: here, a system called Udbetaling Danmark was created in 2012 to streamline the payment of welfare benefits. Its fraud control algorithms can access the personal data of millions of citizens, not all of whom receive welfare payments. In contrast to the hundreds of thousands affected by this data mining, the number of cases referred to the police for further investigation is minute[7].

In the Dutch city of Rotterdam, the data of 30,000 welfare recipients is analysed every year in order to flag suspected welfare cheats. However, an analysis of its scoring system, based on machine learning, showed systemic discrimination with regard to ethnicity, age, gender, and parenthood[8]. It revealed evidence of other fundamental flaws making the system both inaccurate and unfair. What might appear to a caseworker as a vulnerability is treated by the machine as grounds for suspicion. Despite the scale of data used to calculate risk scores, the output of the system is no better than random guesses. However, the consequences of being flagged by the "suspicion machine" can be drastic, with fraud controllers empowered to turn the lives of suspects inside out[9].
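
To make the mechanism more concrete, here is a deliberately simplified, purely hypothetical sketch in Python of how such a risk scorer operates. The feature names, weights and threshold are invented for illustration and are not taken from the Rotterdam system; the point is only to show how attributes a caseworker might read as signs of vulnerability can, once weighted by a model, become grounds for suspicion.

# Illustrative sketch only: a toy risk scorer loosely inspired by the kind of
# system described above. Feature names, weights and the threshold are
# hypothetical; they are NOT the Rotterdam model.

from dataclasses import dataclass

@dataclass
class Recipient:
    speaks_dutch: bool      # proxy feature that can correlate with migration background
    age: int
    is_single_parent: bool

# Hand-set weights stand in for coefficients a machine-learning model might
# learn from historical fraud investigations.
WEIGHTS = {"no_dutch": 0.6, "young": 0.3, "single_parent": 0.4}
FLAG_THRESHOLD = 0.7        # hypothetical cut-off for referral to fraud controllers

def risk_score(r: Recipient) -> float:
    score = 0.0
    if not r.speaks_dutch:
        score += WEIGHTS["no_dutch"]
    if r.age < 30:
        score += WEIGHTS["young"]
    if r.is_single_parent:
        score += WEIGHTS["single_parent"]
    return score

def flagged(r: Recipient) -> bool:
    return risk_score(r) >= FLAG_THRESHOLD

# A caseworker might read these attributes as indicators of vulnerability;
# the scorer reads them as grounds for suspicion.
population = [
    Recipient(speaks_dutch=True,  age=45, is_single_parent=False),
    Recipient(speaks_dutch=False, age=27, is_single_parent=True),
    Recipient(speaks_dutch=False, age=52, is_single_parent=True),
]

for p in population:
    print(p, "-> flagged" if flagged(p) else "-> not flagged")

In a real system the weights would be learned from historical investigation data, which is one way that bias in who was investigated in the past can feed into who gets flagged in the future.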

As reported by the World Bank, the recent Covid-19 pandemic provided a great push to implement digital social welfare systems in the global South. In fact, for the World Bank the so-called Digital Public Infrastructure (DPI), enabling "Digitizing Government to Person Payments (G2Px)", is as fundamental for social and economic development today as physical infrastructure was for previous generations[10]. Hence, the World Bank finances, around the globe, systems modelled on the Indian Aadhaar system, under which more than a billion persons have been registered biometrically. Aadhaar has become, for all intents and purposes, a pre-condition for 800 million Indian citizens to receive subsidised food and other assistance.

Important international aid organisations are not behaving differently from states. The World Food Programme alone holds data on more than 40 million people in its SCOPE database. Unfortunately, WFP, like other UN organisations, is not subject to data protection laws or the jurisdiction of courts. This makes the communities they have worked with particularly vulnerable.

In most places, the social sphere will become the metric, with algorithms determining the operational conduit for delivering, controlling and withholding assistance, especially welfare payments. In other places, the power of algorithms may go even further, as part of trust systems, creditworthiness assessments, and social credit. Such social credit systems for individuals are highly controversial as they require mass surveillance, since they aim to track behaviour beyond financial solvency. The social credit score of a citizen might suffer not only from incomplete or inaccurate data, but also from the assessment of political loyalties and conformist social behaviour.

Hence, the question becomes urgent: what is social justice in a metric society, what role will CSOs play in this emerging environment, and what will be their raison d'être?

Indian CSOs chose a pragmatic approach and engaged to ensure that these systems work in favour of those in need. These CSOs have analysed and strategised and, on the basis of their principles and values, endeavour to improve the newly emerging systems. Thus, some are engaged in advocacy, political and even legal battles to ensure that data protection measures are enacted and implemented. Others assist vulnerable individuals to register and to claim their entitlements, turning low-resolution into high-resolution citizens[11]. Yet others run educational programmes to teach users about the new digital environment and their rights within it.

Reflection and discourse might need to go even further and look at transparency and other societal issues, in particular with regard to algorithms. The city of Rotterdam was the only one of several cities that agreed to a third-party review of the algorithms deployed for fraud prevention purposes. Hence, there is a vast area still to be charted: from the preservation of confidentiality, copyright and intellectual property rights to the demand for transparency where appropriate.

Lately it has been suggested that anti-racism or decolonisation dimensions will fundamentally change the long-term strategic thinking of important civil society organisations and that this would require developing concrete performance metrics and progress indicators[12]. This shows that the issue will not go away: should those advocating and working on counterpower[13] be using the same methodologies and tools as those currently holding power?

Our own personal story from LogFrame to Logarithms provides a number of lessons. The most important one is to understand the basic concepts underpinning the digital transformation of our societies: power, social welfare in a market economy, individuals and their identity, and networks and platforms. Accepting that there is no escaping the megatrend of digital transformation, neither for the individual nor for the societies we live in, means that engaging with these new technologies remains imperative. For the organisations that constitute civil society, even more is at stake: it is a race for relevance. Wherever human rights and social justice issues are at stake, whether an organisation focuses on advocacy or service delivery, what is needed are concepts and strategies to actively shape the ongoing transformation, based on a clear idea of values and rights. Failing to do so will leave it to the govtech industry to devise and implement its solutions unchecked.

[1]  Steffen Mau, The Metric Society, 2019, p. 167.

[2] Ibid.

[3] Alice Salomon Hochschule, Berlin, Digital Transformation and Social Justice as part of the ICM programme.

[4] Mark Latonero, Stop Surveillance Humanitarianism, The New York Times, July 11, 2019.

[5] Suspicion Machines – Unprecedented experiment on welfare surveillance algorithm reveals discrimination, in: Lighthouse Reports, 6 March 2023.

[6] Consulting firms talk about welfare fraud of up to nearly 5 per cent of benefits spending, while some national auditors' offices estimate it at between 0.2 and 0.4 per cent. Ibid.

[7] AlgorithmWatch, Automating Society Report 2020, October 2020, pp. 46

[8] Suspicion Machines, see footnote 5 above

[9] Ibid.

[10] World Bank, Identification for Development (ID4D) and Digitalizing G2P Payments (G2Px) 2022 Annual Report (English), p. 2.

[11] Ranjit Singh, Steven Jackson, Seeing Like an Infrastructure: Low-resolution Citizens and the Aadhaar Identification Project, in: Proceedings of the ACM on Human-Computer Interaction, Vol. 5, No. CSCW2, Article 315, October 2021.

[12] International Civil Society Centre, Sector Guide #2: Strategic Decision-Making in a Whirly World, July 2021, p. 26.

[13] Democratic Data: Developing Digital Counterpower, a discussion with Salomé Viljoen.

Karl Steinacker

Digital Advisor

International Civil Society Centre

Karl Steinacker is currently the Digital Advisor of the International Civil Society Centre. He studied political science at the Free University of Berlin and international law at Cambridge University. He then spent three decades working for the United Nations (UNDP, UNRWA, DPKO, UNHCR) in the fields of development, peacekeeping and refugee protection. At the UN Refugee Agency, he held positions in Africa and at its Headquarters and was responsible for Registration, Statistics, and Data and Identity Management as well as for Camp Coordination/Camp Management.

Michael Kubach

Digital Expert

Fraunhofer IAO

Since 2013, Michael Kubach has been researching issues around digital identity and trust, taking a socioeconomic, user-oriented perspective at the Fraunhofer IAO – Team Identity Management. Michael works or has worked in several European and national cooperative research projects, such as the EC-funded projects ESSIF-TRAIN and LIGHTest (on trust infrastructures) and FutureID (federated identity management). Moreover, he consults for international corporations and NGOs on identity and trust infrastructures as well as blockchain/DLT topics. Michael holds a PhD in economics from Georg-August-University Göttingen. He studied politics and administrative science as well as management in Konstanz, Göttingen and Lille.


The Future of Global Mobility

14th April 2022 by Karl Steinacker and Dr Steffen Angenendt

Our Digital Advisor Karl Steinacker has published an article jointly with Steffen Angenendt of the German Institute for International and Security Affairs addressing the structural bias that travellers from the Global South face. The article was first published in German by Stiftung Wissenschaft und Politik (SWP); this is the English version of SWP-Aktuell 10/2022, translated by Tom Genrich.

Why We Need a Debate about Multilateral and Digital Solutions to Prevent the Global South from Being Excluded from International Travel

The Covid-19 pandemic has greatly reduced international travel. The economic, social and human consequences of border closures and travel restrictions cannot be fully estimated yet, but they are dramatic. The gap is widening between countries of the Global North, which want to control travel and prevent unregulated mobility, and those of the Global South, which are demanding more legal mobility for their citizens. The freedom to travel is a desirable good that all should be able to access, and is also the object of political negotiations. Unilateral decisions should be complemented or superseded by international agreements between countries about common rules and procedures for a trust-based system. In the meantime, countries should modernise their visa processes and build digital identification systems that create trust. This applies to Germany as well, especially since the coalition government has decided to speed up the issuing of visas.

Many European Union (EU) citizens now have to show their vaccination certificate to board a plane or cross a border. Proof of vaccination status as a requirement for entry is nothing new: travel to tropical countries, for example, has long been conditional on being vaccinated against yellow fever, which had to be proved by a paper document. However, electronic documentation, for instance in the form of smartphone apps, is gaining in importance now – and not just for vaccinations. Travellers to North America are familiar with such systems, which were introduced after the attacks of 11 September 2001 as part of the Smart Borders Initiative.

International Efforts to Create “Smart” Borders

These US security measures contain new security standards for travel documents, the systematic recording of flight passenger data (PNR), the introduction of an electronic entry permit (ESTA), an entry and exit register (EES) with biometric visas, and a screening system to prevent the boarding and arrival of terrorism suspects. Nevertheless, to facilitate travel, the US has initiated the Trusted Traveller programmes. These include, among others, the Global Entry programme, under which pre-approved travellers who are considered a low security risk by the authorities can have their interview after arrival instead. Participants in the programme can also use – like US nationals or green card holders – the PreCheck programme, which speeds up security checks at US airports.

The US is undoubtedly a pioneer in "intelligent" border efforts; however, since 2008 the European Commission has been driving forward its own visa information system, which works with biometric data, as well as an entry and exit register and a system for flight passenger data. The European Travel Information and Authorisation System (ETIAS) is expected to be operational in 2022. Similar to the US's ESTA, it is intended for conducting security checks on travellers from currently over 60 countries that do not need a visa for the Schengen area. The electronic Entry Exit System (EES) should also be put into service soon, and will automatically monitor the travel movements of third-state nationals at the external borders of the Schengen area. This IT system is intended to match those entering with those exiting and thus catch potential visa overstayers. To this end, data sets including biometrics will be established for first-time arrivals in the Schengen area.

For decades, public health played only a minor role in international mobility management. The Covid-19 pandemic and associated containment measures, such as the designation of high-risk areas and virus variant areas, have changed this for an unforeseeable duration. Over the course of the Covid pandemic, international mobility has collapsed: in 2019, the year before the pandemic, the global aviation industry carried more than 4.5 billion passengers. In the first year of the pandemic, over 108,000 travel restrictions linked to Covid-19 were imposed across the world. The number of flight passengers fell by 60 percent. Numbers of new international migrants also remained well below previous estimates until mid-2020, and worldwide the pandemic is believed to have reduced their total by 2 million.

Such travel restrictions risk generating new distortions and a further widening of the gap between the Global North and the Global South. The (legitimate) wish to protect populations in industrialised countries against the risk of infection can lead to a blanket disadvantaging of and discrimination against people from countries which do not have the means to vaccinate their populations as successfully as the countries of the Global North, despite all the difficulties, have done. Moreover, the governments of developing nations also want to protect their populations against the risk of infection. But they have fewer options – especially because of their inadequate access to vaccines.

In the context of the pandemic, questions over the legitimacy of mobility controls are particularly pressing. The most recent reform proposals to the Schengen rules, which the EU intends to apply in the event of a pandemic to impose common entry bans, can therefore appear ambiguous. The planned formalisation could reinforce entry bans but also force the authorities to justify them.

The Issuing of Visas as the Key to International Mobility

At the heart of international mobility is the issuing of visas. This primarily serves to regulate tourism and business travel, as well as justifications for stays (such as labour migration, family reunification and asylum). To pre-empt undesired immigration, EU members and many other destination countries pursue a parallel strategy. They resort to unilateral measures and technologies to control numbers, and they use political pressure to induce countries of origin to cooperate on reducing irregular migration. However, many countries of origin now make their cooperation in migration regulation conditional on destination countries taking seriously their demands for easier international mobility (for instance through free-of-charge visas or a generalised visa waiver for their citizens). This is the case with Turkey, for example, which has long been pressing the EU for a visa exemption and has repeatedly linked its demand to the threat of suspending its collaboration on migration control.

In 2020 EU member embassies and consulates received around 3.5 million visa applications for stays of up to three months in the Schengen area, a marked Covid-linked drop compared to 2019, when there were approximately 17 million applications. Of the applications submitted in 2020, around 85 percent were granted, slightly fewer than in 2019 (88 percent). However, rejection rates for certain regions of origin, especially sub-Saharan Africa, were far higher. And these numbers only offer a partial picture of the mobility blockages since they only take into account processed applications; the many visa applications that were abandoned or not even submitted for reasons of hopelessness are not included in the statistics.

Mobility for Germans looks entirely different. German travellers are at the very top of global passport rankings; they have outstanding travel opportunities and access to almost all countries without needing to go through visa procedures. The Henley Passport Index currently has Germany in joint second place with South Korea; their citizens can enter 190 states without a visa. Only the passports of Singapore and Japan have higher rankings (192 states). Citizens of African, Middle Eastern and South Asian countries have a noticeably different experience. African passports generally permit visa-free travel to only 20 to 25 percent of countries, mostly to neighbouring African nations.

In her thesis on the birthright lottery, Ayelet Shachar, the former director of the Max Planck Institute for the Study of Religious and Ethnic Diversity in Göttingen, points to the privileges and disadvantages that result from acquiring citizenship. She argues that the acquisition of such political membership today corresponds to the acquisition of private property in times past.

Public Order or Structural Discrimination?

The EU’s 2001 Regulation on visas stipulates that the visa requirement is the standard mechanism of its mobility system and that any potential exemption is merely a unilaterally granted exception and a privilege. The Regulation, last revised in summer 2021, also explains that all abuse of visa exemptions by nationals of a third country must be combated “where they [the nationals] pose a threat to the public policy (ordre public) and the internal security of the Member State concerned”. In the event of inadequate cooperation by countries of origin, visa exemptions can be suspended. Thresholds have been set for this, for instance where a country of origin rejects over half of the EU’s readmission applications, or where fewer than four percent of asylum applications from the country are approved.

The Regulation indicates that greater migratory pressure is to be avoided, and it is obvious that EU members consider the European mobility regime as an instrument to regulate migration. The Regulation, which entered into force in 2009 and was last revised in 2019, also allows the issuing of visas to be used as an instrument to reward good cooperation on readmissions, or to sanction inadequate cooperation.

Nanjala Nyabola, a Kenyan commentator on current affairs and politics who gave one of the opening speeches at the May 2019 Future Affairs Forum on the digital revolution organised by the German Foreign Office in Berlin, considers the visa regime of developed countries to be structural racism, aiming to exclude the populations of the Global South from global mobility. The issuing of visas, she argues, reflects neo-colonial structures. It would seem that large swathes of the elites in Africa, the Arab world and South Asia share Nyabola’s position. The local press – and travel blogs – criticise not only the visa policies of the Global North but also the procedures employed in embassies and at borders. These commentators raise accusations of institutional racism and racial profiling. Many complain about the humiliating nature of the procedures that applicants and those wishing to travel have to undergo. Such perceptions are counterproductive for winning over the Global South to the essential cooperation needed to meet the challenges of migration and forced displacement.

Elements of an International Framework

Discriminatory mobility regulations are used to control migration not least because countries mistrust the identity documents and visa decisions of other governments. The Israeli historian and political analyst Yuval Noah Harari speaks of trust being the most important capital of any human society, referring to small communities as well as countries and international politics as a whole. To build trust, we need to strengthen multilateralism.

Two elements are required for a set of rules that shape international mobility: multilaterally negotiated objectives, strategies and procedures; and digital technologies that facilitate mobility – as long as they are not an end in themselves but pursue political objectives. This need for a set of political rules derives inter alia from the risk that technologies might be used as substitutes for such rules – for instance, when important actors such as the US rely on specific techniques, these are subsequently adopted by international bodies such as the International Civil Aviation Organisation (ICAO), and other countries then have to follow suit to avoid being left behind.

Building Block 1: The Political Definition of Goals

Cross-border mobility is still a policy area in which – except for zones with internal freedom of movement, such as the Schengen area – national sovereignty is given greater importance than multilateral efforts for joint regulations that are beneficial to all. At the centre of current mobility regulations is unilateralism – which also falls short of the goal of the Global Compact for Safe, Orderly and Regular Migration, adopted by Germany in 2018, to improve international cooperation on migration.

Lessons on changes in mobility systems can be learned from history, especially from the political transformation in Europe from the Cold War to German reunification in 1989. This transformation was also inspired by a political declaration of intent. In 1975 the representatives of 35 countries from West and East signed the Final Act of the Conference on Security and Cooperation in Europe (CSCE) in Helsinki, which stipulated many simplifications of cross-border mobility, especially for family contact. It also wanted to enable travel applications for personal or professional reasons, and promote conferences, youth exchanges and tourism.

Of course, this specific historical constellation cannot simply be transferred to today’s circumstances: the Warsaw Pact countries did not want to grant their citizens freedom of movement, whereas the West saw free movement in particular as the leverage for change, chiefly for inter-system contact. This change in turn was meant to lead to détente and the dismantling of threat perceptions. Indeed, when the Iron Curtain fell, the result was not total freedom of travel; the latter remained linked to visas and (transitional) rules for work permits.

Nevertheless, a process similar to the Helsinki Conference – which would ideally result in a Global Compact for International Mobility – could be a useful complement to existing international law and political agreements on migration and displacement. The process could be based on the idea of a modernised and rule-based international mobility that is a win-win for all participating countries – and thus counter the accusation that the North is only interested in reducing irregular immigration.

In such a mobility regime, signatory countries would, as in the Helsinki Final Act, commit to facilitating international mobility for nationals of all countries, including those of the Global South, and in the process exclude discrimination. Countries would also politically commit themselves to developing and applying technological innovations – particularly digital identity and trust systems – so as to create the technical conditions for a global mobility that is both legal and in accordance with the rules.

Building Block 2: Digital Technologies

This kind of rule-supported international mobility system could only be implemented using digitalisation. Currently, 19th and 20th century tools continue to determine international travel control – with a great deal of effort, uncertainty, vulnerability to forgery and corruption: passports and visas; stamps and stickers; personal interviews and paper pushing. The experiences from the Covid pandemic could be helpful in modernising the system. For example, the rapid introduction of a digital vaccine certificate (although not forgery-proof) in the EU member states showed that digitalisation can contribute to upholding freedom of movement.

However, the pandemic has also revealed that current instruments are too unwieldy to react to rapidly changing framework conditions, and that trust in traditional regulatory instruments is low. Any new mobility system must therefore modernise certification and identification instruments.

Such modernisation is necessary in Germany as well, as the National Regulatory Control Council (NKR) regularly makes clear, inter alia in its annual monitoring reports on the state of digitalisation in Germany, published since 2016. The NKR also illustrates how this could be achieved at the national level: by calling on the administration to make data-based decisions and to act as a service provider that responds to needs and allows the state’s performance to be measured qualitatively from the perspective of those concerned.

Of course, the NKR has a domestic mandate and its recommendations refer to Germany. Nevertheless, if its key concern – namely to attend to the interests of those affected – is transferred to international mobility rules, it becomes clear that the current fixation on nationality as the decisive criterion for the issuing of visas is problematic. If visa decisions were instead primarily founded on other characteristics – such as profession, qualifications, age, integrity and health – then the international mobility regime would be more efficient as well as fairer.

The mobility regime of the future must be based on digital trust systems which can certify the identity of persons and attributes in a forgery-proof manner. An example is the EU’s digital vaccination pass, despite the concerns we have raised above: it confirms the identity of the vaccinated person, that the vaccine has been approved, that the vaccination was carried out by authorised medical staff and that the issuer of the certificate is competent to do so.
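
The basic building block behind such a pass is a digital signature over the certified attributes. The following Python sketch, using the widely available cryptography library, illustrates the principle only; the issuer name, payload fields and key handling are simplified assumptions, and real schemes such as the EU certificate add standardised encodings, key directories, expiry and revocation on top of this primitive.

# Minimal sketch of the signature check underlying a digital credential such
# as a vaccination certificate. All names and fields below are hypothetical.

import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# 1. A competent issuer (e.g. a health authority) holds a signing key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()   # distributed to verifiers

# 2. The issuer signs a statement binding an identity to an attribute.
credential = {
    "subject": "Jane Doe, passport X1234567",
    "attribute": "vaccinated with an approved vaccine",
    "issuer": "Example Health Authority",
}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# 3. A border official or airline verifies the signature with the issuer's
#    public key; any tampering with the payload makes verification fail.
try:
    issuer_public_key.verify(signature, payload)
    print("credential accepted: attribute was asserted by the trusted issuer")
except InvalidSignature:
    print("credential rejected: payload or signature has been altered")

The verifier only needs the issuer’s public key and the signed payload, which is why forgery resistance rests on protecting the issuer’s signing key and on verifiers knowing which issuers to trust.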

The German government is already promoting such “trust systems” through its research policy. They are intended as the future foundation for trustworthy digital interactions and to secure access to digital services, such as telemedicine and digital prescriptions, the gig economy, online banking and e-government. These intentions primarily concern German citizens, but the federal government should also pursue this approach for travellers from third countries.

The Role of Private Service Providers

Private companies have already taken on important functions in international mobility management. This does not have to be a contradiction of the state’s sovereignty and control. On the contrary, states have brought in the private sector for support in providing their consular services but also in securing their borders. This concerns airlines in particular: under threat of substantial fines for omission, they carry out the relevant mobility control tasks for states at whose airports they land. Commercial migration services providers, such as CIBT (from the US) and the market leader VFS Global (founded in India, domiciled in Dubai), are now established, and are commissioned by states to assume parts of visa processing or delivery functions. Technology companies offer the requisite hardware, software and data analysis.

In all cases, these companies act as data brokers, which means that they possess large amounts of data on rejected and approved visa applications, regular travellers and irregular migrants. This can lead to controversy since it is unclear under what jurisdiction these service providers might fall, and what legal avenues might be available for complaints. However, the EU has at least pointed out that its General Data Protection Regulation (GDPR) also applies to companies that carry out visa services for nationals of third countries, and that these companies have to ensure an appropriate level of protection for personal data.

The Canadian and Dutch governments, along with the technology company Accenture and the World Economic Forum (WEF), are currently testing the Known Traveller Digital Identity system (KTDI) for flight passengers. Travellers using this system register biometric and cryptographic data about themselves, for instance on their mobile phones. On request and at their own discretion, these travellers then grant the authorities access to their verified personal biometric, biographical and historical travel data to enable them to conduct risk assessments and pre-screening. KTDI allows journeys to be depicted and traced, with travellers interacting with authorities and private companies via mobile devices by making available historical and real-time data (“identity attributes”).
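
The idea that travellers release verified data “on request and at their own discretion” can be illustrated with a small, hypothetical wallet sketch in Python. The attribute names and the consent flow below are assumptions for illustration only; the actual KTDI pilot relies on verifiable credentials and cryptographic proofs rather than plain data structures.

# Illustrative sketch of selective disclosure from a traveller-held "wallet".
# Attribute names, the consent flag and the request flow are all hypothetical.

from typing import Iterable

class TravellerWallet:
    def __init__(self, attributes: dict[str, str]):
        self._attributes = attributes          # stays on the traveller's device

    def disclose(self, requested: Iterable[str], consent: bool) -> dict[str, str]:
        """Return only the attributes the traveller agrees to share."""
        if not consent:
            return {}
        return {k: v for k, v in self._attributes.items() if k in requested}

wallet = TravellerWallet({
    "name": "Jane Doe",
    "nationality": "NL",
    "visa_history": "3 Schengen visas, no overstays",
    "biometric_template": "<stored locally, not released by default>",
})

# A border authority asks for a subset; the traveller decides what to release.
released = wallet.disclose(requested=["name", "visa_history"], consent=True)
print(released)   # biometric_template is not part of the disclosure

The design choice this sketch highlights is that the full data set remains with the traveller, and each disclosure is scoped to an explicit request and an explicit consent decision.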

The project for a hotel check-in for business travellers – which the German government is using to test the construction of an infrastructure for the secure exchange of identity attributes, not just for the digital identities of humans but equally of institutions and objects (Internet of Things) – pursues similar objectives. Pilot projects would clarify whether this approach can also be applied to nationals of third countries.

It will be indispensable to integrate the private sector into any international mobility regime. Airlines, hotel chains, banks and financial service providers as well as insurers should act jointly with consulates and registry offices on a platform that not only offers and processes services for travellers from third countries under internationally agreed rules, but that can also issue verified identity attributes.

Ethical Issues

The most pressing ethical issue that concerns all “wallet” applications in which individuals save information on their identity derives from the imbalance in power between the representatives of state bodies and the owners of the data, in this case the travellers. It is debatable whether the latter ultimately retain control over their own personal data and whether they will be allowed to pass on the data exclusively of their own free will and at their own discretion – given immigration and control practices that demand personal data and at times empower agents to ask for passwords or even download the contents of mobile phones, computers or other devices.

This imbalance of power undoubtedly exists. It must be countered through binding and actionable rules. This also applies to the data of travellers stored by authorities and private service providers. Today they are inaccessible for those concerned (the data subjects): biometric data are stored at the consulate or by the company to which this service has been outsourced. The traveller’s digital self is controlled by others.

In the United Kingdom civil society organisations have sued the Home Office, accusing the artificial intelligence algorithms employed by this ministry of being racist and discriminatory. Such risks must not be underestimated. It is also true, however, that digital processes, when correctly conceived, can reduce the influence of discriminatory prejudices in decision-making.

An ethical debate about the details of the mobility system becomes necessary precisely when digital technologies are to be employed. One criticism will be that a digital identity and trust system based on the imbalance of power between poorer and richer nations will not facilitate the mobility of people from the poorer countries. Objections such as this must be taken seriously since people wishing to travel will probably continue to be rejected – and the suspicion of discriminatory prejudices, whether by natural persons or by algorithms, cannot be eliminated.

Beyond this, further fundamental issues will have to be clarified. How to deal with the tension between the travellers’ agency over their data and the requirement that a trust system, to be effective, must contain enough data? When has anyone attained “sufficient trust” – or will the expansion of certification and identification create a “spiral of distrust”, in other words an uncapped need for ever more, even better verified but never sufficient data? How to define a “key area” in which no data are collected, such as bank accounts? How to prevent an accountable person’s control being transferred to a machine when artificial intelligence is used and the decision-making process is automated? Are the guarantees contained in Article 22 GDPR sufficient? And how to design effective objection and appeal processes?

The fundamental ethical issues also include the objection that millions of people in the Global South will probably not attain a sufficiently high degree of trust due to their inadequate socioeconomic resources and limited access to digital resources, and will therefore continue to be excluded from travel to Germany and the EU. A counter-argument is the fact that in Germany and other OECD countries the unequal distribution of resources also constrains citizens’ mobility: a high passport ranking is of little use to those who cannot afford to travel. Ultimately the objective is to develop technical solutions that correspond to the political modernisation targets; safeguard individual rights from an ethical perspective; prioritise data protection; comply with Chapter 3 of the GDPR; and endow everyone with rights, such as the right to transparent decisions, appeal, correction and compensation.

Political Objections

There needs to be a debate about political issues as well as ethical ones. They include the argument that governments tend to view decisions about access to the national territory as central to their activity and do not want to give up any competency in this regard. The counter-argument runs as follows: a modernised travel system would not make decisions about longer stays, for instance for work purposes; such decisions would remain the countries’ “domaine réservé”. At least the debate would address the problem that a travel regime which is frustrating for the Global South would be counterproductive since it would thwart cooperation on other topics as well. In contrast, a transparent process aiming to facilitate global mobility for everyone would put cooperation before unilateralism and create new opportunities for cooperation so as to better manage irregular migration and the unjustified onward migration of asylum-seekers.

An international mobility policy would also be generally economically advantageous for all participating actors; we could therefore expect enough countries to join in such a project. Germany has the political leeway and technical competence to contribute to modernising international mobility. It also has the will, as the December 2021 coalition agreement has shown in reference to the issuing of visas.

Recommended Actions

  • The German government should launch a national strategy to digitally modernise international mobility with its partners from business, technology and civil society.

  • The government should also verify whether a trust system with a digital platform can be established as a public-private partnership and as a public body.

  • Moreover, the German government should initiate an international policy dialogue that could result in negotiations on a Global Compact for International Mobility, and one that enumerates the weaknesses of technical processes such as Trusted Traveller.

  • Industry, service providers and social partners should participate: first, in the strategy dialogue; second, in building a digital platform to support mobility policy through the provision of services (e.g. insurance, monetary transactions, travel and tourism, consumer advice); and third, in the supervisory bodies of a trust system with a digital platform.

  • Finally, it is important to include civil society, both in this dialogue and in the supervisory bodies of the trust system. There could, for example, be a digital platform that offers civil society services in information, advice and cooperation. Not least, this platform could serve as a means of exchanging with foreign civil society actors on issues of international mobility.

© Stiftung Wissenschaft und Politik, 2022


Karl Steinacker

Digital Advisor

International Civil Society Centre

Karl Steinacker is currently the Digital Advisor of the International Civil Society Centre. He studied political science at the Free University of Berlin and international law at Cambridge University. He then spent three decades working for the United Nations (UNDP, UNRWA, DPKO, UNHCR) in the fields of development, peacekeeping and refugee protection. At the UN Refugee Agency, he held positions in Africa and at its Headquarters and was responsible for Registration, Statistics, and Data and Identity Management as well as for Camp Coordination/Camp Management.

Dr Steffen Angenendt

Senior Fellow in the Global Issues Research Division

Stiftung Wissenschaft und Politik


Data Protection and Biometrics: Scanned by International Aid Organisations

20th October 2021 by Karl Steinacker and Katja Lindskov Jacobsen

NGOs and UN agencies have collected sensitive personal data of millions of people in the global South. With due prudence?

Aid and charitable organisations (hereafter referred to as NGOs) have gone digital in recent years. One aspect of this involves collecting and processing large amounts of data about the very people they assist and interact with. Such data is often very personal and may include health and educational data, family relations, and religious and political beliefs. These data sets identify a person by name, address, picture, and increasingly also by biometric features.

“Trust is the fuel of our societies,” says the contemporary historian and philosopher Yuval Harari. This remark raises an important question, namely how the digital data practices of NGOs affect this vital issue of trust. Can their policies and practices stand public scrutiny, and are they worthy of our trust? Can we be sure that personal and often sensitive data has been obtained voluntarily from each of the data subjects that NGOs gather such data from? Is it right to assume that people have a clear understanding of the purpose, benefits, and risks related to the sensitive data they allow NGOs to collect, store and process?

Data Protection and Sovereignty in the Global South

In general, it can be said that NGOs fall under the jurisdiction of the country where they operate. That might reassure the public in the countries of the North, where data protection laws have been passed and are being enforced. However, on a global scale, as more and more governments enact data protection laws, we are in the paradoxical situation that NGOs often face new obligations to disclose personal data in support of national sovereignty and security interests. Countries like Turkey and Rwanda have modelled their data protection laws on the European GDPR and demand that the personal data of their citizens and residents is kept exclusively on servers on their territory. But NGO data is sometimes also shared with donors to obtain funding or as part of specific data-sharing agreements. Each contractual arrangement with commercial service providers, be it outsourced IT services, such as storage and data analysis, or simple bank and mobile pay transfers, discloses the identity of, and information about, aid recipients.

Those who manage the personal data, the senior NGO managers and data protection officers, are aware of the issues at stake. They will mention the noble mandate and the non-profit character of their organisation. Some will point out that they have internal data protection policies and regulations. The international NGOs in particular, that is, the western institutions led and funded from the global North, will claim that they follow the GDPR, the current international gold standard in data protection, even in countries outside Europe.

It is certainly true that the data of their institutional and individual donors, including those who have been recruited on the high street to donate 10 dollars or euros or so every month, is kept and processed in a GDPR-compliant manner. The NGOs know that they have to gain and maintain the trust of that important segment of the public, since their financial and reputational survival is at stake.

Refugees in Chad have to reveal biometric data to obtain an ID card. – CC-BY-NC-ND 2.0 EU Civil Protection and Humanitarian Aid

Open questions regarding digital rights

But what about the people at the margins, those data subjects in the global South who have surrendered their personal data to NGOs because they need protection and assistance in the face of war, natural disaster, or poverty? In such cases, specific questions need to be asked and addressed, such as:

  • Are these data subjects informed in a transparent and understandable manner about (real and possible) data use as far as mandatory data fields are concerned?
  • Have people, where informed consent is asked for, been educated, even genuinely informed in relation to the specific context, so as to meaningfully make that decision? What alternatives are offered to them should they decide, for example, not to register their biometric data with an NGO?
  • Are the consent rules and practices easy and possible to fully implement? Can consent be revoked?
  • Do individuals have access to their data, can they query that data and demand corrections, and which recourse options are open to them? Is deletion of data even possible, and is there a right to be forgotten?
  • Are data subjects always informed about data breaches and data sharing arrangements, especially with state authorities?

Unfortunately, answers to these and other questions are in most cases likely to fall short of even minimum standards of data protection. Organisational culture and practices in many NGOs often fail to put the necessary focus on and resources into conducting impact and risk assessments in a particular context before decisions are made to collect, store and process personal data. Financial audits are common while data audits remain an exception. Operations security processes and strategies for protecting critical and sensitive data are rare within organisations and absent at an inter-organisational level. Self-policing is the norm, rather than submitting to independent oversight.

Co-operation with Palantir causes a stir

Special mention must be made of the large specialised UN agencies, such as IOM, WFP and UNHCR. These organisations have, like no other non-governmental bureaucracy, amassed personal data files on tens of millions of people around the globe. Their data subjects surrender their biometric imprints (commonly a fingerprint or an iris scan) for a bar of soap, a sack of rice or a cash transfer, but also for a residence permit or the opportunity to be resettled in another country. Unfortunately, their policies for safeguarding such sensitive biometric data, and importantly the implementation of data protection, remain largely opaque. Let’s take the example of the data-sharing agreement that UNHCR has concluded with the government of Bangladesh. Human Rights Watch alleges that personal data, including biometrics, collected jointly by UNHCR and the Bangladesh authorities from Rohingya refugees, who have fled genocide in Myanmar, has been transmitted to the government of Myanmar. Or take WFP, which announced that it had signed a contract with Palantir, a company specialised in data analytics which is part of the data-military-industrial complex in the United States. Palantir was recently criticised by Amnesty International for its failure to conduct human rights due diligence around its contracts. These and other data-related treaties, contracts, and practices of UN agencies escape public scrutiny and cannot be challenged in any national court because of their diplomatic immunity. Yet various UN agencies routinely roll out large-scale data collection programmes in many countries whilst failing to ensure appropriate oversight bodies and recourse procedures for their data subjects.

When, two months ago, western military and civil organisations evacuated their personnel from Afghanistan, large amounts of sensitive personal data, including biometric data, were left behind. Only time will tell whether that data has been adequately protected and cannot be abused. Closer to home, the German Red Cross received the 2018 edition of the Big Brother Award from a civil society organisation for its digital system of asylum shelter management. The Red Cross software instituted humanitarian surveillance and total control of asylum seekers and refugees: tracking movement to and within the shelter, and recording in detail medical checks, food consumption, relationships, religious and ethnic affiliations and much more. The question is not whether there are similar systems in different settings and places elsewhere in the world, but rather how many of them exist.

ICRC and Oxfam work on biometric policies

Looking ahead, we see the contours of different trends: Some civil society actors recognise the emergence of unforeseen risks relating to the use of personal data that have been collected in many different contexts. They now increasingly advocate for intensified discussions of approaches to responsible uses of personal data, in particular biometric data. Indeed, the ICRC and Oxfam have defined red lines for the use of biometrics and discuss their data policies and practices with the people they protect and assist.

On the other hand, however, the trend towards surveillance and biometric overkill continues. The most worrying example is the use of DNA by the US immigration agency ICE. Here a precedent is set where biological features are used not only to uniquely identify an individual but also to establish the biological relationship between several persons. Meanwhile, UNICEF and the Gates Foundation are looking into ways of using biometrics for infant and child healthcare in Africa. Newly set-up start-ups suggest introducing biometrics for school kids in Africa so as to monitor their school attendance.

But there is resistance too. Political organisations working among Rohingya refugees living in camps in Bangladesh are campaigning among their fellow refugees not to hand over their personal data, notably biometrics, to the United Nations and the Bangladesh authorities, out of fear that such sensitive personal data could end up in the hands of the military dictatorship in Myanmar. During the Covid-19 pandemic, the UN Agency for Palestinian Refugees in the Middle East (UNRWA) failed to introduce biometric ration and cash cards: the Palestinian refugees would have none of it and simply refused to cooperate.

Biometric Surveillance in a Civil War

And so it is not surprising that personal data can also become a weapon. A case in point is Yemen, where two competing governments fight a civil war for control of the country. Since 2019, WFP has rejected the lists of beneficiaries put together by the government based in the capital Sana’a. The agency demands that it should itself be put in charge of biometrically registering those eligible to receive food aid. The competing government, supported by the US and Saudi Arabia, allowed WFP to biometrically register food aid recipients in the areas under its control. As a result, the bulk of food aid goes to the areas where WFP was able to register the population.

Accordingly, widening current debates and including a broad range of stakeholders will be crucial as we move forward. When it comes to sensitive personal data and trust, charitable intentions, progressive mandates, and a non-profit posture are noble starting points but unfortunately count for little. What happens to the personal data of the so-called data subjects is what matters. For years NGOs have been discussing concepts like do no harm and accountability. Many claim that they are accountable to the people they assist, protect, and seek to empower, rather than to their donors. Now the opportunity has come to implement and showcase meaningful accountability: give those who are marginalised and in need of protection and assistance the digital rights they are entitled to.

This article was published first in German by Netzpolitik.org on 8 October 2021

Karl Steinacker

Digital Advisor

International Civil Society Centre

Karl Steinacker is currently the Digital Advisor of the International Civil Society Centre. He studied political science at the Free University of Berlin and international law at Cambridge University. He then spent three decades working for the United Nations (UNDP, UNRWA, DPKO, UNHCR) in the fields of development, peacekeeping and refugee protection. At the UN Refugee Agency, he held positions in Africa and at its Headquarters and was responsible for Registration, Statistics, and Data and Identity Management as well as for Camp Coordination/Camp Management.

Katja Lindskov Jacobsen

Senior Researcher

Centre for Military Studies at the Institute of Political Science at the University of Copenhagen

Katja Lindskov Jacobsen holds a PhD in International Relations from Lancaster University and works as a Senior Researcher at the Centre for Military Studies at the Institute of Political Science at the University of Copenhagen. The focus of her research is on security policy and interventions.