Disrupt & Innovate

Technical Intuition: Instincts in a Digital World

14th August 2018 by Alix Dunn

This blog is republished with thanks to the author; it first appeared on Alix’s Medium blog.

When the pieces continue to multiply and don’t fit together.

The digital world is disorienting. It permeates every aspect of our lives, but few of us understand how it works. Worse yet, few of us know where to begin if we want to make it work for us. We are discouraged from asking questions when something feels icky or confusing. If you don’t get it, you’re the problem. The technology is magic, and so are the people that build it. Because — in a slight tweak to Clarke’s third law — any technology sufficiently distanced from our own conceptual understanding is indistinguishable from magic.

What would it take to build the right types of knowledge so everyone can demystify, navigate and leverage the digital world for their purposes? For the past decade or so, there have been two dominant answers to that question:

  • Create accessible opportunities for anyone interested to learn how to code
  • Outsource all of that complexity to a small number of people and exoticise the skill set of that dominant class

But we know that neither of these answers gets to the heart of the question.

Enter Technical Intuition

Over the past decade, I’ve worked with activists interested in strategically adopting technology. And even groups with the most resources, clearest political beliefs and noblest aims struggle to build the right types of knowledge to do it well. It is very difficult — sometimes impossible — to act in your own best interests and in line with your politics and preferences when choosing, managing, and using technology. And while I have long been convinced that one-off trainings on hard skills are a dead end, it was only in the past few years that I have worked to explicitly target a type of knowledge I call technical intuition.

I didn’t want our organisational partners to learn to code, I wanted them to learn how to talk to developers. I didn’t want them to outsource complexity, I wanted them to learn what skills they need in-house, which they could contract out, and how that might change over time. I wanted them to become inventive with technologies that they couldn’t necessarily deploy themselves, but could understand enough and in the right ways to articulate exciting new possibilities. I wanted them to ask incisive questions about underlying data that technologies cast off. I wanted them to actively and regularly wrestle with the challenges data use might present and the decisions they would need to make to use it responsibly.

We have looked for ways of unlocking technology careers for those interested in seeking more sustainable financial futures (throw a rock and you’ll hit a boot camp). But what about those who do not want to learn to code, but still want to make meaningful choices about technology? We are faced not just with a pipeline problem but also with a pedagogical one.

And it’s not just for activists, it’s for everybody

The more I have seen technical intuition in action, the more I am convinced that it is a critical form of knowledge (not simply a skill set), and one that we as a society have not sufficiently explored or optimised for.

Technical intuition is a foundation for agency in and about the digital world and a missing cornerstone of the solution to many of our techno-political challenges.

Currently, broad public participation in the decisions we make about our digital future is impossible. Asymmetric access to knowledge, power, infrastructure, and resources that drive the creation of digital spaces is driving inequality. Political, economic, and social inequality. And — as we do with most retraining efforts when economies undergo major transformation — we oversimplify the skills needed to broaden participation and access. We focus on narrow hard skills. But the acquisition of narrow hard skills like coding will re-entrench existing economic relationships rather than reshape them.

Before that happens, I suggest that we rethink what capacities we should be working towards for proactive participation and engagement in politics and new economies.

  • It’s not literacy. That is too passive and submissive to systems as they are. We don’t want systems to be legible, we want them to be pliable.
  • It’s not sub-speciality. In-depth understanding of one programming language picked up in a night class may unlock job opportunities in tech, but it doesn’t help those who want to understand technology without making it their primary skill set.
  • It’s not the ability to follow discourse or news cycles about what technical systems are doing to us. That relegates non-technical people to the role of observer.

So what is it?

Technical intuition is a conceptual frame that we know and see but have never worked towards. It is a key to broad-based access to personalised decision-making within and about technical systems.

There are four dimensions of technical intuition.

To Imagine

An imagination equipped with the information and instincts to conceptualise (good and bad) and suggest (good) technical systems, even without the skills to implement the ideas

To Inquire

An ability to formulate questions that can drive understanding and decision-making, and clarity about how and where (to which experts) to direct those questions

To Decide

Clarity about how your politics and preferences (both personal and professional) connect to the decisions you can and should make about — and within — digital systems

To Demand

An animated sense of when to be opinionated, active, and targeted if a system is designed in ways that do not align with our politics and morality

What does it look like in action?

There are many situations we experience daily in which technical intuition can support more agency and better decision-making: when we are at work considering innovative ways technology could help us accomplish our goals, or when we are making decisions about how to engage online. But technical intuition comes into play even when we are out shopping for groceries.

I’ll use a consumer example that is nearly universal: my grocery store suggests that I sign up for a discount card that I scan at checkout in exchange for a reduction in the cost of my groceries.

How does technical intuition function in this situation?

First, I imagine

  • What data does this initiative generate? Items my family and I have purchased. When I’ve purchased them. Possibly over a long period of time. In aggregate, this data could say many things about my habits, and it is granular enough to be unique to me. It could likely be used to predict my movements and my lifestyle (vices and virtues), which could serve all kinds of purposes.
  • How might that data be used by the store or by those that might purchase the data? The store may use it to micro-target ads to me. Maybe one day they will strategically charge me more for goods they can tell I need urgently. An insurance company may want to know how much alcohol or over-the-counter medicine I purchase. Credit agencies may use the data to develop profiles or predictions about my spending habits. In some countries it may influence a social credit score.
  • How might the data connect to other purchases? If it is shared with other companies, what might multiple sets of data about different purchases say about me? Take Nectar cards in the UK as an example: they are used across multiple chains, including service stations, railways, grocery stores, and big-box stores.
  • I might imagine different outcomes. What if all of the data is one day made public? Could I be re-identified in a data set that included my weekly shop details over a five-year period? Would it matter to me? Would it matter to others?
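That re-identification worry is not hypothetical hand-wringing: a handful of purchases can act like a fingerprint. The sketch below is my own illustration, not from the original post — it simulates made-up shoppers (10,000 people, each buying 10 distinct items from a catalogue of 500) and checks how many could be singled out from just 3 of their items.

```python
import random

# Hypothetical simulation: shoppers as sets of purchased item IDs.
random.seed(0)
CATALOGUE = list(range(500))
shoppers = [frozenset(random.sample(CATALOGUE, 10)) for _ in range(10_000)]

def is_identifiable(basket, probe_size=3):
    """True if a small sample of this basket matches no other shopper."""
    probe = set(random.sample(sorted(basket), probe_size))
    matches = sum(1 for other in shoppers if probe <= other)
    return matches == 1  # only the shopper's own basket contains the probe

checked = shoppers[:100]  # spot-check 100 shoppers to keep the demo quick
identified = sum(is_identifiable(b) for b in checked)
print(f"{identified} of {len(checked)} shoppers pinpointed by 3 items")
```

In this toy setup the large majority of shoppers are pinned down by three items alone, because very few people share any particular three-item combination; real loyalty-card data, with timestamps and years of history, is far more distinctive still.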

Then, I inquire

Does the initiative offer any detail on how the data might be used? Does it connect with schemes at other companies? How much in savings do I get? Do I want to participate and therefore incentivise this company to carry out this initiative? Are those savings worth the exchange to me personally? Is it worth it to my family? Will this one day be required? What effect would that have?

Then, I decide

I won’t sign up for a card because I don’t think the cost savings justify the data I would be exchanging. Or I will sign up because I think the 10% reduction in cost is worth the likely surveillance, and I may struggle to pay for my groceries otherwise. If so, maybe I will include fake contact details and swap cards with friends occasionally to muck up the data being collected about me.

Then, I demand

After reflecting on this initiative, I am surprised to learn that there aren’t regulations in place about the sale of data generated through it and that the store didn’t attempt to clearly explain to me what the trade-offs were for signing up. If it’s an issue that really makes me mad, I follow up in a feedback form, raise it with friends and family, and raise it with staff at my local branch. I recognise I may have less leverage with companies than I do with, say, government initiatives. But I know that most of the gaps in consumer protection exist because customers don’t have sufficient interest and technical intuition to pressure companies to be better, and because politicians think we don’t care enough about these issues to warrant or incentivise regulatory action.

What’s next?

There are many people working to develop new forms of communicating complexity, but often they are designed for people either already working in technology fields, or stumbling into something new. New publications are working to increase explainability of complex content; visual designers are leveraging user interfaces as teachers of technical interplay; animators are breaking down complex technical concepts that underpin probabilistic systems; companies are hiring science communicators; researchers are studying explainability…in machine learning research papers; and people are building entire dictionaries of metaphors that can be used to explain technical concepts.

This work is exciting — we should support and encourage it. But we also need to develop more accessible conceptual scaffolding, connect concepts more clearly, and build a path for those interested in understanding how it all fits together. Our aim is not a world in which everyone is a coder, or statistician, or designer, or engineer. Or a world where everyone wants to be a technologist.

We want a world where it is possible for all of us to build technical intuition and reclaim our individuality and agency within and about digital systems.

If you are working on ways to support non-techie communities to develop technical intuition, I would love to hear from you. What are you doing to create insight and understanding? What types of insights are leading to stronger technical intuition? What effects is that having on those you are working with?

Thanks to Janet Haven, Ali Gharavi, Zara Rahman, Lucy Bernholz, Elizabeth Eagen, and Nicole Anand for helping me shape these ideas.

Alix Dunn

Executive Director and Co-Founder

The Engine Room

Alix is a recovering researcher with a passion for applying creative solutions to difficult problems. She is a hunter and gatherer, identifying data and technology strategies that can empower social change initiatives around the world to maximise their impact and make the most of their resources. She co-founded The Engine Room and leads it to be a nimble organisation that provides direct support where, when, and how initiatives need it. She sits on the Advisory Council of Open Technology Fund, and the Technology Advisory Council of Amnesty International. She plays a mean game of chess.