Networked communities for turbulent times

Promising Trouble is a social enterprise whose work puts community power at the heart of technology and innovation.

Latest Updates


The courage not to act

Rachel Coldicutt, our Executive Director, in conversation with Jo Barratt at New Constellations, on careful innovation, ethics, and deploying technology in ways that can shape better futures and nurture us all.

  • If we wander down there, we can maybe even begin. You see, there’s the nice lake, we could even begin down there.

    And if the rain comes, well… I mean, I don’t think it’s forecast. I mean I’m a great optimist about the weather. I mean my partner finds it extraordinary that I kind of believe the weather broadcast as if it’s the truth. I’m sort of obsessed with the weather forecast.

    Jo: Do you have special, like, sources?

    Yes. Oh yeah, yeah, yeah. And so I have like my special rain app where you can look at the satellite weather coming in.

    I think weather forecasting is really interesting because it’s one of the first historical approaches to using data and learning from what’s happened previously to forecast what’s likely to happen next. The whole history of it is really interesting and not what you’d expect.

    Isn’t it nice to see kids on whatever they are, what are they doing?

    Hoverboard? Probably?

    Yes. Yeah. It’s not your average South London. It’s a lake. And I think, I don’t really know. I’m not one of those people who’s into getting extremely cold in the water outside, but I think maybe they’ve cleaned it up.

    I mean there’s people swimming.

    No, I mean it’s big.

    We are in Beckenham Place Park in South London and we’ve just walked past the lake and we’re walking to a secret hill.

    I find it very difficult to say my name so bear with me, I’m Rachel Coldicutt, I’m a technology researcher.

    I started working on the web in the nineties when it like wasn’t really a job, when nobody sort of really knew what it was and it was a bit of a weird novel thing.

    I suppose I’ve had like the sort of meandering career that you can only have when you work in an emerging area as it’s emerging. Because lots of the jobs I did wouldn’t have existed a year previously, and sometimes they existed cuz I turned up and I was interested and I turned it into a job.

    What I do now is I run a little research studio called Careful Industries and we work with clients to help them sort of take better digital care. There’s a lot of stinging nettles here, maybe this was a mistake (laughs). So we help them develop really practical ways of working that will anticipate and understand some of the impacts of the technologies they’re creating or the policies they’re making. How do you deploy technology with care? And how do you deploy technology in a way that leads to really good outcomes for the people who have the least power? That doesn’t just prop up traditional power and is not extractive, that is more kind of nurturing, more sympathetic.

    And we make a little bit of money out of that, and the money we make out of that we put into our social enterprise, which is called Promising Trouble. Very appropriately, we’re ducking under a tree. We do work around community-powered technologies: how do we help people have their own understanding and sort of governance and control over the technology they use? And so like one of the practical things we’re doing is we’re trying to work out how to roll out a free internet in social housing in South London.

    So it’s like, on the one hand, in Careful Industries we do like big thinky, long reports, lots of research. We do ethnography and field work and we do desk research and, you know, we’re a mix of like technologists and designers. And then in Promising Trouble we do things that are much more about, well, actually, how can we take that knowledge and those things we’ve learned and turn them into better outcomes that are maybe not gonna make anybody any money, but that will kind of make a better infrastructure for everyone.

    I think technology is any tool that can be automated in any way. There’s an extent to which novelty is the thing that takes something from being a tool into being a technology. And when things stop being novel, they stop being thought of as being technology.

    Watching a movie now is an incredibly technologica… technologicalised, if that’s even a word, experience. Right? But we don’t think of it like that. We think of it in the same way, I think, that people thought of it a hundred years ago, it’s like a communal experience of coming together. But that communal experience of coming together is made possible by the like convergence of many people and many tools. I did a talk the other day, it was called ‘we are all technologists now’, because I think we can all adapt those tools to do the things that work for us.

    You know, the first time somebody made a knife, that was a technology. The first time someone turned a thing that was made for one thing into something that could help them complete a task more quickly and more efficiently.

    Whereas I think like the tech industry is like, just this, like, it’s just lots of people who want to be rich, which is I think really boring. It’s like, I think technologies are very often created for people to do things at scale. And the tech industry, I think, is about individuals entrenching themselves in power and getting more money, and is like a sort of, you know, fundamentally capitalist undertaking. But I don’t think technologies in and of themselves are.

    When you lift up the bonnet of a car and you look at the engine, it’s very different to when you are sitting in the car and you look at the dashboard, but they’re two interfaces onto the same information. And I think what very often happens now is we don’t get the bonnet view. We get the dashboard view, and the dashboard is kind of like allegedly meant to be more helpful, but it’s kind of like the theatre of helpfulness, because you can’t reach into it and move a thing, right, in the way that you can with the engine.

    This desire to obfuscate and simplify and take away the workings, I don’t think that’s even a technology issue. It’s about a sort of desire for ease, and I think, as the world around us becomes increasingly more complex, more automated, as we’re expected to be more productive, be more places, do more things, be busier, we’re expected to take advantage of all of that ease, to kind of let other things work hard in order to allow us to work harder.

    There’s a certain sort of technologist who I think is a complexity merchant and who kind of thrives on making things that they don’t understand and that no one can understand.

    It’s like, in a way, where we are now is this odd convergence of 150 years of people pressing buttons and the sort of power dynamic of people wanting to create complexity, which is really, to just take us back to where we were a moment ago, the very opposite of like library science, which is all about making things intelligible, about putting an interface on the world that is helpful.

    One of the things that we do in our work at Careful Industries is we try and we kind of take care by looking ahead. And I think, in the technology industry, in the tech industry, there’s this big idea that disrupting is better, right? That what we need to do is we need to put a rocket under things, explode them, build them again, you know, and there’s all these terrible terms like blitzscaling, which are sort of, you know, words of dominating and destroying. This idea that there are certain things that represent progress, and some of those things that represent progress are like very tied to things people saw on the TV in 1975. They’re kind of very linear outputs from where we were then and where we are now. They’re kind of incremental and instrumental change in a world in which everything is basically the same, but there are like cooler gadgets. And I think what we try and do in a lot of our work is like look at the second- and third-order consequences of making a change now, and then try and calibrate that change to make the most positive outcomes.

    The academic Shannon Madden has like this lovely essay about rootstock, about, you know, like attending to the beginnings of things. You know, if you attend to the beginnings of things, you look after the roots, you’re gonna get better apples off the tree. And I think a lot of the time technology doesn’t really do that. And the other person I think of is Donna Haraway, you know, all the stuff about compost and making, you know, good ground for things to grow out of. Whereas I think like a lot of short-termism is very much about like, can we run into the orchard, nick all the apples, turn them into something and sell them? Not, can we have trees here in a hundred years? And I think, you know, that’s the stuff I’m probably more interested in.

    I’m not very concerned about what technological progress looks like. I’m much more concerned with what like progress towards equity looks like. It’s a world that I think you know women and people who are not in conventional positions of power have been imagining for a long time. But those imaginings don’t maybe become mainstream in the same way.

    Like one of the reasons I really love the internet and one of the reasons, I suppose there were many reasons, I was so lucky about kind of growing up at the age that I was, is that, so I have a stutter, and when I was younger it was really, really bad. And when I started work, well, bad is a judgement, but I stuttered a lot, you know, and I still do occasionally, but not as much as I used to. Back then, when I started work, you did lots of business by ringing people up. Now, as I said earlier, I can’t really introduce myself. I can’t, I find it very difficult to pick up the phone and say my name. Which meant that all these things that were very easy for lots of other people were very difficult for me. And then it meant it was incredible that I could do things by emailing people. I could message people. I could write things and publish them and people would read them without having to go through like an approvals process. There was a way in which this thing that had, you know, definitely, concretely held me back. Like when I went for jobs, when I left university, many people told me they couldn’t employ me because it would be too embarrassing to give me a job. What? Yeah. The thing that I loved and that made it amazing for me was exactly that thing, like, on the internet, nobody knows you’re a dog. Right. You know? And that’s only one component, but I think, you know, the stuff that makes technology not equal is not necessarily technology. It’s like people, it’s people being irresponsible and selfish and thoughtless, and then being really lauded for being irresponsible and selfish and greedy and thoughtless, because it’s made them rich.

    You know, I think nothing is inherently democratising; I think many sorts of technologies are, you know, the very act of automating entrenched power. There’s no such thing as the general public, right, this idea that there is one collective opinion, a mean and a median that we can all arrive at. I don’t think that was ever true. And that’s a device for sort of passing things off as acceptable. And actually what the internet makes possible is many publics, and for many publics to have a voice. And I think a lot of the time that gets interpreted as being polarisation. Certainly that is one outcome of trying to impose a very traditional narrative on the fact that there are now many, many voices, you know, trying to simplify it and turn it into binaries. You can always find the most extreme person on either end. Actually what we get now is a multiplicity, a plurality. And I think what hasn’t happened is, I don’t think our institutions and our organisations have caught up with how to deal with that. And so the institutions through which we run democracies are used to being able to tell binary stories, you know, like the BBC thing of, we’ll have a person with one opinion and then we’ll counter it with the opposite. Actually, for many things, it’s possible to have many, many opinions and that’s kind of amazing.

    We’re not innovating for that in the right way. What we’re trying to do is go back and put all these voices back in the box. And I don’t think that’s really very helpful.

    A really nice tangible thing to talk about is actually this thing where we’re trying to roll out the free internet in some housing estates in South London. So this is like a classic problem of the 2020s. I think it is technically not difficult; the thing that is technically difficult about it is choosing which one of the possibly 50 different ways we could solve the problem. But what makes it difficult is the regulatory environment and the policy environment, and the regulatory environment works in the favour of business. Certainly in the last 15, 20 years, we’ve not really had policy people who can really get to grips with technology infrastructure. There’s not very much vision in government for possibly sort of long-term, difficult infrastructure, because what the vision in government is for, very often, is what can we achieve within the next three years?

    What that means is we’re operating in a world in which the broadband companies create the paradigm in which broadband works, but that’s not the only paradigm there could be, there could be loads of them, right, and the difficult thing is not putting the cable between one place and another, it’s making it possible for businesses to acknowledge that it could work another way. It’s making it possible for policy makers to acknowledge that this thing they don’t really understand, that seems to not be a problem because the market is taking care of it, is actually a bit more fundamental. And so on the one hand, what can we pilot and what can we demonstrate? Because I think there’s a huge extent to which once somebody sees a thing working, they become a lot less scared of it and it becomes a lot harder to close down. You know, they can’t tell you it can’t happen if it’s happening.

    How can you make some kind of public service that is not within the context of the hostile environment and surveillance in government? So already there are quite a lot of different things happening there, but basically, can we get internet from one person to another person, and can that work?

    But then in the meantime, there’s another set of things to do, which is coming up with the big-picture policy ideas that will be attractive and understandable to people who want to be politically famous. You know, there’s certainly a huge extent as well to which doing something on the ground means you can get someone to come and have their photograph taken in a place where a thing is happening, and they become a part of that thing that is happening. But actually it’s like, there are some big infrastructure ideas that, if you were the big government thinker who pushed them through, would change a lot for a lot of people. And so some of the work that we are doing there is trying to make those policy changes possible. So there’s a little bit of, like, telling good stories in ways that people can understand, making all these things that don’t seem very sexy, about digging holes in the ground and cables and stuff, into stories about people whose lives are just better, but equally kind of like snappy lines. And I kind of hate this. But ultimately I think sometimes, if you wanna make change, it’s like, for someone like me, my job is to hold and understand the complexity and come up with like the snappy line that I can give to somebody else. You know, a lot of stuff that we do is like, be the conduit for the change, and I can understand that that work is not attractive to a lot of people. It’s about making it easy for other people to come in and make a change very quickly as well, I think.

    There’s this weird hierarchy where people think technical expertise is somehow more special and better than other kinds of expertise. But if that was the case like we’d all be dead. You know, it’s like we really need that, that more sort of equitable exchange of knowledge and information and enabling others to sort of pick things up and do things with them rather than kind of like hiding stuff, I guess.

    Medical ethics have been formed over centuries through social negotiation and experience, you know, they didn’t like spring into being, right, but in this kind of amazing thing where everyone disagreed about what the right thing was to do. We’re in this interesting period where you could maybe Google and come up with like a hundred different ethical codes, whether they are data ethics, AI ethics, research ethics, services ethics; there are now hundreds of communities all over the world making their own digital ethics codes. I often think, particularly with AI ethics, it’s like this whole industry of people who take, like, two of the most complex and difficult-to-understand things that there are, AI and ethics, and put them together into academic institutions that get millions and millions of pounds to ponder the meaning of life, right. And so that’s all okay. But I don’t think it’s very useful.

    How we start talking to people is they come to us to talk about ethics, but where we end up with them is in a different place. And I think very often people believe the thing they are doing is the first time it’s ever happened. And maybe that specific technology in that specific context, it would be the first time it’s ever happened. But it won’t be the first time someone has tried to categorise people based on their affinities or the colour of their skin or what they like to do in the evening, watching Netflix. Like basically, it’s like we can learn a lot from history, and I think it’s very tempting to believe that technology is just doing newness. Very often people will come to us and they say, oh, we want to do this thing that is a bit risky and a bit difficult and we want to do it ethically. And very often what they mean is, we want to carry on doing this thing that is probably bad and we need an excuse. And what you then do is you go through a process of negotiation with them to help them understand some of the things that would never have occurred to them, that would be the logical consequence of the thing they are doing, but just from a perspective they’ve never looked at.

    A lot of the people who get the money and the opportunity and the power and the influence to make big, important technology projects and products maybe they’ve never really had to develop very much empathy.

    I think ethics is not a cure. I think in any ethical discussion where there is not a pre-agreed ethical code, one of the things you have to do is you have to work out who is the winner. If there’s a trade off, who is disadvantaged by the trade off, who gets the preferable outcome.

    Encouraging people to understand it’s okay to, like, change what they’re doing. That it’s not a weakness to say we were doing this wrong. I think it’s a much bigger weakness to do it wrong, take it to market, fuck everything up and tidy up afterwards. You know, so it’s about kind of building anticipatory care into creating products and strategies, not acting first, apologising later.

    And there are many people for whom that is actually quite a compelling business model, you know, maybe not when you are scaling and you are having to give, I mean, lots of promises about what you’re achieving. But you know, if you are, if you’re running a business, you want that business to actually work for people and deliver good outcomes, which I think is often underrated.

    If there’s anything, it’s like moving away from the very blunt instrument that scale and money are good. And the other thing is like, particularly when you talk to really large technology companies, they quite often say, yes, but if we don’t do this, who will? And, like, maybe no one, and like, that would be fine. So I think, like, there’s another thing about the courage not to act: sometimes not to fill the space with your technology and your will is actually possibly harder than continuing to deploy.

    Well, let’s go and have a look. The mounded garden, from the soils that were used to make the lake. So when my son was littler, he would scramble up to the top and then sort of bull around in a way that looked really scary. And now I look at it, having maybe not been here for about a year, and it doesn’t look as big as I feel it looks when there’s like a tiny child on there. It’s a nice little, random… I think it’s got some lavender on it. I think it’s nice because it is a thing of natural beauty that was made by creating a lake, you know, and they didn’t just take the earth out. They took the earth out and turned it into another thing. I’m a big fan of using something for a different reason than the one it was invented for.

    It’s a pile.

    Yeah. It’s just a pile of things with some steps, but a nice climb up. Very exciting if you’re on a scooter.

    Dangerous.

    Yeah, I know, but like the five year olds love that.

    I think we’re in like the long end of something, you know. The long end of one thing and the beginning of another. I didn’t really appreciate how long and horrifying and terrible it would be to live through the end of a thing. As very sort of traditional Western white male power, I think, is kind of having its last hurrah, it’s becoming more frightened and more intensified and sharper, like it’s not ambient and ubiquitous in the way it was. It’s kind of coming out in quite aggressive and dominant ways that I think certainly, you know, people like me who got to become adults in the nineties had the luxury of pretending didn’t happen anymore. I mean, it was obviously happening, but as a white woman in the UK, it wasn’t something I experienced. I think, you know, the thing that I am really hopeful about is sort of what comes after my generation and people like me, and the increased urgency and, you know, the need to change will just become so much. You know, it’s like, it’s inescapable. And it’s a very weird time because it’s both absolutely horrifying in many ways. I think if there is a job for people with a little power and a little influence now, it is to sort of help midwife the next thing. Which we don’t know what it is, but it’s definitely coming and it’s definitely emerging. I’m kind of hopeful, cuz I can see things that need to change and growing awareness of those things. Maybe we’re like in the swamp, in like the soup of whatever is gonna come afterwards.

    This lake is actually just absolutely brilliant.

    Isn’t it? And as well, like, one of the things that is brilliant about it is, you know, Beckenham is like, you know, alright, but it’s a very sort of, you know, it’s a non-exciting suburban London neighbourhood. Like, possibly the best thing you could say about it is there’s a Marks and Spencers, sort of thing, so you know, this is lovely.

    This little pocket of something completely different.

Why do we do it?

Data and digital technologies are arbiters of power.

Promising Trouble is an experiment in redistributing that power: sharing knowledge, capabilities, and connectivity to build community-driven alternatives to Big Tech and platform power. 

Rather than relying on innovation to trickle down from governments or big business, we believe it’s possible for communities to shape and change technology so it works better for everyone.

And outside of formal organisations, radical action is possible – we can all embrace our position as co-creators, not simply “users”, and start occupying technology with love.

Promising Trouble is the non-profit sister arm of Careful Industries. We’re a social enterprise, committed to growing awareness of the social impacts of technologies and building alternative systems, technologies and communities of practice. The name Promising Trouble is a quote from “Staying with the Trouble” by Donna Haraway.