
From LogFrames to Logarithms – A Travel Log

5th June 2023 by Karl Steinacker and Michael Kubach

Karl Steinacker and Michael Kubach are digital experts based in Berlin, Germany. This article is a reflection on their recent teaching assignment, “Digital Transformation and Social Justice”, at the Alice Salomon University in Berlin.

During the 1990s, the world of public administrations and civil society organisations (CSOs) was awash with Change Management Teams. Humanitarian and development organisations in particular were tasked to professionalise their work, to overcome charity, and to create impact. The backbone of the process was the introduction of the Logical Framework, or LogFrame. This tool was originally developed for the US military and then adapted by NASA for its space programmes. After it was adopted by USAID and GTZ, an entire nomenclature for the humanitarian and development sectors was built around it: Results Based Management.

One of the authors proudly joined such a Change Management Team and worked, as the buzzwords of the time had it, on Modernisation and Innovation. At the top of the list of taboo concepts stood the term Welfare. In fact, the task was to implement neo-liberal policies at the level of people living in extreme poverty. In this context, the idea of unconditional support to ensure that members of a society can meet basic human needs, such as food and shelter, was simply rejected. The measure of support was not what a society could possibly afford but a revised paradigm of social justice that aimed to reward effort and measure effectiveness. Management manuals were updated with an array of re-defined terminology stretching from authority to responsibility and, most importantly, accountability, to ensure that everybody complied with these new policies.

Our journey into this realm of non-profit business administration would last for many years. The first task was to develop indicators: performance and impact indicators, at times SMART, later KPIs, but above all measurable. Thus, datafication started on Excel sheets. For organisations mandated to protect and assist individuals, group statistics were no longer deemed sufficiently precise. Registration systems had to change too: individual registration was to replace all systems that had been developed and used to provide collective support to families or other groups.

Think tanks and govtech consulting firms were always eager to help, for a fee: the digitalisation and datafication of the social sphere would replace blanket statistical categorisations and allow for a more precise documentation of the individual. This idea of fine-tuning corresponds to the sense of justice of many people working in the aid sector: a detailed analysis is much fairer than stereotypical evaluations[1].

While the early days of social statistics were characterized by the attempt to define meaningful categories, the growth of personalized data means that individuals are no longer counted as part of one big class, but increasingly as social-statistical singularities. The use of metrics and algorithms allows previously overlooked, hidden or willfully ignored differences to be identified, and hence utilized.[2]

Twenty-five years on, we find ourselves in a university classroom. Students, soon to enter the job market in the social sector, are confronted with our professional experience as we discuss digital transformation and social justice[3]. But the learning curve is steep for both students and lecturers. For the latter, having stepped back from narrow areas of responsibility in a single set of organisations, a bigger picture emerges: yes, individual registration is indeed an empowerment tool, providing an identity to somebody who is otherwise invisible, excluded, unbanked, and unable to participate in society and the (digital) economy. However, it also allows for humanitarian[4] and social surveillance – not as an abstract possibility but as an everyday reality.

Today, authorities all over the world are experimenting with predictive algorithms. That sounds technical and innocent, but as we dive deeper into the issue, we realise that the real meaning is rather specific: fraud detection systems in social welfare payment systems[5]. In the meantime, the hitherto banned terminology has made its comeback: welfare and social safety nets have, for some years now, been en vogue again. But in the centuries-old Western tradition, welfare recipients must be monitored and, if necessary, sanctioned, while those who work and contribute must be assured that there is no waste. So it comes as no surprise that even today’s algorithms focus on the prime suspect: the individual fraudster, the undeserving poor.
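To make the mechanics concrete, here is a minimal sketch, in Python, of how such a risk-scoring system typically operates: case features go in, a trained model turns them into a score, and cases above a threshold are queued for investigation. Every feature name, weight, and threshold below is our own assumption for illustration; none of it reflects the design of any actual deployed system.

```python
# Illustrative sketch of a welfare fraud risk-scoring pipeline.
# All feature names, weights, and thresholds are invented for
# illustration; they do not mirror any real deployed system.
import math
from dataclasses import dataclass

@dataclass
class Case:
    months_on_benefits: int
    num_address_changes: int
    reported_income_gap: float   # declared vs. observed income, in EUR
    missed_appointments: int

# Hypothetical model weights, e.g. from a trained logistic regression.
WEIGHTS = {
    "months_on_benefits": 0.01,
    "num_address_changes": 0.30,
    "reported_income_gap": 0.002,
    "missed_appointments": 0.25,
}
BIAS = -3.0
FLAG_THRESHOLD = 0.5  # cases scoring above this go to fraud controllers

def risk_score(case: Case) -> float:
    """Logistic score in [0, 1]: higher means 'more suspicious'."""
    z = BIAS + sum(WEIGHTS[name] * getattr(case, name) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def triage(cases: list[Case]) -> list[tuple[Case, float]]:
    """Return the cases flagged for manual fraud investigation."""
    scored = [(c, risk_score(c)) for c in cases]
    return [(c, s) for c, s in scored if s >= FLAG_THRESHOLD]

# Example: a long-term recipient who moved twice and missed one appointment.
print(round(risk_score(Case(48, 2, 150.0, 1)), 2))  # ~0.20
```

Even this toy version shows the critics’ point: every input is a proxy for a life circumstance, so a history that a caseworker might read as hardship simply pushes the score upwards.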

Fraud detection systems promise that the taxpayer will no longer fall victim to fraud and that efficiency gains can be re-directed to serve more people. Yet the true extent of welfare fraud is regularly exaggerated[6], while the costs of such systems are routinely underestimated. A comparison of the estimated losses and the investments rarely takes place; it is the principle of detecting and punishing fraudsters that prevails. Other issues don’t rank high either, for example how to distinguish between honest mistakes and deliberate fraud. And the more time caseworkers spend entering and analysing data in front of a computer screen, the less time and inclination they have to talk to real people and to understand the context of their lives at the margins of society.

Thus, hundreds of thousands of people are routinely being scored. Take Denmark: there, a system called Udbetaling Danmark was created in 2012 to streamline the payment of welfare benefits. Its fraud control algorithms can access the personal data of millions of citizens, not all of whom receive welfare payments. In contrast to the hundreds of thousands affected by this data mining, the number of cases referred to the police for further investigation is minute[7].

In the Dutch city of Rotterdam, the data of 30,000 welfare recipients is investigated every year in order to flag suspected welfare cheats. However, an analysis of its machine-learning-based scoring system showed systemic discrimination with regard to ethnicity, age, gender, and parenthood[8]. It also revealed evidence of other fundamental flaws, making the system both inaccurate and unfair. What might appear to a caseworker as a vulnerability is treated by the machine as grounds for suspicion. Despite the scale of data used to calculate risk scores, the output of the system is no better than random guesses. The consequences of being flagged by the “suspicion machine” can nevertheless be drastic, with fraud controllers empowered to turn the lives of suspects inside out[9].
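The kind of third-party review that Rotterdam permitted can itself be sketched in a few lines. The fragment below is a simplified illustration, assuming an auditor has access to the model’s risk scores, confirmed case outcomes, and one demographic attribute; it compares flag rates between groups and computes the AUC, where a value near 0.5 means the model predicts no better than random guessing. All variable names and data are hypothetical.

```python
# Sketch of a third-party fairness audit of a risk-scoring model.
# Scores, outcomes, and group labels below are hypothetical.
from itertools import product

def flag_rate(scores, groups, group, threshold=0.5):
    """Share of one demographic group flagged for investigation."""
    members = [s for s, g in zip(scores, groups) if g == group]
    return sum(s >= threshold for s in members) / len(members)

def auc(scores, outcomes):
    """Probability that a confirmed-fraud case outranks a non-fraud case.
    0.5 means the model is no better than random guessing."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))

# Hypothetical audit data: model scores, confirmed outcomes, group labels.
scores   = [0.81, 0.62, 0.55, 0.43, 0.77, 0.30, 0.58, 0.49]
outcomes = [0,    1,    0,    0,    0,    1,    0,    1   ]
groups   = ["A",  "B",  "B",  "A",  "B",  "A",  "B",  "A" ]

print("flag rate A:", flag_rate(scores, groups, "A"))  # 0.25
print("flag rate B:", flag_rate(scores, groups, "B"))  # 1.0
print("AUC:", round(auc(scores, outcomes), 2))         # ~0.27
```

A wide flag-rate gap between groups combined with an AUC hovering around 0.5 is precisely the pattern the Rotterdam investigation surfaced: a system that discriminates without predicting.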

As reported by the World Bank, the recent Covid-19 pandemic provided a great push to implement digital social welfare systems in the global South. In fact, for the World Bank the so-called Digital Public Infrastructure (DPI), enabling “Digitizing Government to Person Payments (G2Px)”, is as fundamental for social and economic development today as physical infrastructure was for previous generations[10]. Hence, the World Bank finances, around the globe, systems modelled on the Indian Aadhaar system, under which more than a billion people have been registered biometrically. Aadhaar has become, for all intents and purposes, a pre-condition for receiving subsidised food and other assistance for 800 million Indian citizens.

Important international aid organisations are not behaving differently from states. The World Food Programme alone holds the data of more than 40 million people in its SCOPE database. Unfortunately, WFP, like other UN organisations, is subject neither to data protection laws nor to the jurisdiction of courts. This makes the communities they work with particularly vulnerable.

In most places, the social sphere will become a metric one, with algorithms determining the operational conduit for delivering, controlling, and withholding assistance, especially welfare payments. In other places, the power of algorithms may go even further, as part of trust systems, creditworthiness assessments, and social credit schemes. Such social credit systems for individuals are highly controversial, as they require mass surveillance: they aim to track behaviour beyond financial solvency. The social credit score of a citizen might suffer not only from incomplete or inaccurate data, but also from the assessment of political loyalties and conformist social behaviour.

Hence, the question becomes urgent: what is social justice in a metric society, what role will CSOs play in this emerging environment, and what will be their raison d’être?

Indian CSOs have chosen a pragmatic approach and engaged to ensure that these systems work in favour of those in need. Having analysed and strategised on the basis of their principles and values, they endeavour to improve the newly emerging systems. Thus, some are engaged in advocacy and political and even legal battles to ensure that data protection measures are enacted and implemented. Others assist vulnerable individuals to register and claim their entitlements, turning low-resolution citizens into high-resolution ones[11]. Yet others run educational programmes to teach users about the new digital environment and their rights within it.

Reflection and discourse might need to go even further and look at transparency and other societal issues, in particular with regard to algorithms. The city of Rotterdam was the only one of several cities that agreed to a third-party review of the algorithms it deploys for fraud prevention purposes. Hence, there is a vast area still to be charted: from the preservation of confidentiality, copyright, and intellectual property rights to the demand for transparency where appropriate.

Lately it has been suggested that anti-racism or decolonisation dimensions will fundamentally change the long-term strategic thinking of important civil society organisations and that this would require developing concrete performance metrics and progress indicators[12]. This shows that the issue will not go away: should those advocating and working on counterpower[13] use the same methodologies and tools as those currently holding power?

Our own personal story, from LogFrames to Logarithms, provides a number of lessons. The most important one is to understand the basic concepts underpinning the digital transformation of our societies: power, social welfare in a market economy, individuals and their identity, and networks and platforms. Accepting that there is no escape from the megatrend of digital transformation, neither for the individual nor for the societies we live in, means that engaging with these new technologies remains imperative. For the organisations that constitute civil society, even more is at stake: it is a race for relevance. Wherever human rights and social justice issues are at stake, whether an organisation focuses on advocacy or service delivery, what is needed are concepts and strategies to actively shape the ongoing transformation, based on a clear idea of values and rights. Failing to do so will leave it to the govtech industry to devise and implement its solutions unchecked.

[1]  Steffen Mau, The Metric Society, 2019, p. 167.

[2] Ibid.

[3] Alice Salomon Hochschule, Berlin, Digital Transformation and Social Justice as part of the ICM programme.

[4] Mark Latonero, Stop Surveillance Humanitarianism, The New York Times, July 11, 2019.

[5] Suspicion Machines – Unprecedented experiment on welfare surveillance algorithm reveals discrimination, in: Lighthouse Reports, 6 March 2023.

[6] Consulting firms put welfare fraud at up to nearly 5 per cent of benefits spending, while some national auditors’ offices estimate it at between 0.2 and 0.4 per cent. Ibid.

[7] AlgorithmWatch, Automating Society Report 2020, October 2020, pp. 46 ff.

[8] Suspicion Machines, see footnote 5 above.

[9] Ibid.

[10] World Bank, Identification for Development (ID4D) and Digitalizing G2P Payments (G2Px) 2022 Annual Report (English), p. 2.

[11] Ranjit Singh, Steven Jackson, Seeing Like an Infrastructure: Low-resolution Citizens and the Aadhaar Identification Project, in: Proceedings of the ACM on Human-Computer Interaction, Vol. 5, No. CSCW2, Article 315, October 2021.

[12] International Civil Society Centre, Sector Guide #2: Strategic Decision-Making in a Whirly World, July 2021, p. 26.

[13] Democratic Data: Developing Digital Counterpower, a discussion with Salomé Viljoen.

Karl Steinacker

Digital Advisor

International Civil Society Centre

Karl Steinacker is currently the Digital Advisor of the International Civil Society Centre. He studied political science at the Free University of Berlin and international law at Cambridge University. He then spent three decades working for the United Nations (UNDP, UNRWA, DPKO, UNHCR) in the fields of development, peacekeeping and refugee protection. At the UN Refugee Agency, he held positions in Africa and at its Headquarters and was responsible for Registration, Statistics, and Data and Identity Management as well as for Camp Coordination/Camp Management.

Michael Kubach

Digital Expert

Fraunhofer IAO

Since 2013, Michael Kubach has been researching issues around digital identity and trust, taking a socioeconomic, user-oriented perspective at the Fraunhofer IAO - Team Identity Management. Michael has worked in several European and national collaborative research projects, such as the EC-funded projects ESSIF-TRAIN and LIGHTest (on trust infrastructures) and FutureID (federated identity management). Moreover, he consults international corporations and NGOs on identity and trust infrastructures as well as blockchain/DLT topics. Michael holds a PhD in economics from Georg-August-University Göttingen. He studied politics and administrative science as well as management in Konstanz, Göttingen and Lille.