From Impact to Input: Why I'm Still Hopeful About the Future of Tech

A lightbulb moment (Vertesi 2023)

Q: How do you not get depressed about the state of information and how much it is used in terrible terrible ways?

Someone asked me this question on Mastodon recently. It's a question I get quite frequently, often from my students, or when discussing my Opt Out project with friends and colleagues. I've thought a lot about it, and it deserves a full reply in a post.

Because as much as today's algorithmically-amplified-news-hype cycle profits from doom-and-gloom and overwhelming negativity, techno-doom isn't my message.

Don't get me wrong. I'm not a "techno-utopian" or a technosolutionist, believing that technology will solve all our (social) problems and we are all better off living in a world populated by machines plugged into the Matrix or something. If I ever start spouting Ray Kurzweil then you will know I have officially been kidnapped by aliens, so please launch a rescue mission.

Certainly I don't think today's situation is optimal. Far from it. It's totally dismal. The systems we've uncritically and unthinkingly adopted into our lives are inevitably some combination of racist, sexist, self-destructive, society-destroying, you name it. They're just awful, no matter how gorgeous their interfaces or how sci-fi you feel talking to some new chatbot or your Alexa.

But that's the thing. Uncritically, and unthinkingly.

Because if you know anything about the sociology of technology, you know we can do it differently. That's what I'm all about.

Opting Out Fights Techno-Depression

It is true that the state of play today is absolutely miserable. When big tech companies are busy building chaos-machines and turning billion-dollar profits, it feels like reality has taken a dark turn. Like we're just living in the Matrix now, and there is nothing we can do.

Honestly, I was more depressed about the state of things over a decade ago, when the personal data economy got started. It's not like we had some halcyon days back then, and then bad intentions or bad actors took over and made things go wrong.

Right from the start, so many of us in the critical design, computer science, and sociology of technology communities were screaming at the tech companies from the rooftops. Not even the rooftops--we were at the same meetings, the same conferences, collaborating on grants, even! We had the data, the theories, the evidence to show what would go wrong. But they paid us no mind and went ahead anyway, despite the dangers we warned them about. Dangers that, inevitably, came to pass.

So I could see the writing on the wall. I knew what was coming.

Not because I can see the future or anything, but because the sociology of technology is robust enough by now to be predictive.

We know about mechanisms like "reinforcement politics" -- Rob Kling's 1990 formulation of the idea that new tools will come into existing social systems and reinforce the inequities and political structures that are already there. We know that in places where people are struggling for authority, and without any critical thought or intervention, new tools will be used in ways concordant with that local social context, to maintain the authority of individuals on top. We know that technological systems have political qualities in how they write people into systems or write them out, and whose work is made visible or invisible.

Technology is not a great disruptor--it's a great reinforcer of the status quo.

You can look up scholars like Susan Leigh Star, Geof Bowker, Steve Barley, Langdon Winner, and the myriad people who cite them, to know this is true and will repeat ad nauseam until we learn our lesson.

So I knew what was coming and I knew I had to get out.

And it's amazing how much better I felt when I left. Not only because I found new tools and new friends, and reconnected with what really matters in my life.

When I left, I discovered a sense of agency I didn’t know I had. It was true technical freedom. Freedom to choose. Freedom to associate with others who are working on the same issues. Freedom to learn from others, to experiment, to take a different path.

Each system I gave up (Google, Facebook, iPhone, etc.) made me feel the following: Happy. Powerful. Relieved. Grateful. Calm. Skillful. Clever. Connected.

I met others who care as I do. We work on tools together. Everyone contributes what they can. I feel happier and more connected to the right people, ideas, and communities than I ever did on a platform.

I hold these truths to be self-evident

All this does require a change in perspective.

A change from thinking about impacts to inputs. Away from thinking that technologies just swoop in and change everything, to thinking about how we imbue those technologies with certain capabilities, arrangements, and powers and then let them loose to act on our behalf in the world.

If we start from different premises, and curate and preserve relationships along the way, we end up with very, very different kinds of technologies.

Based on my scholarly training, I come to technology with these premises, instead.

1. It could always be otherwise--and it still can.

Most of the technological systems we live with today are not there because they are "the best." They're there because of market capture, economic tricks, boondoggles, and rhetorical sleights of hand that make people feel like their problems are solved, while ushering other problems in through the back door.

In other words, what we have now isn't what had to be, or what was coming to us anyway such that we'd better just hold on, grit our teeth, and adjust. What we have now is here because powerful economic interests have shoved certain tools down our throats under the banner of progress.

The problem with today's "progress" narrative is that it assumes all change is unidirectional, that we can look back and see a straight line from the tools of the past to the tools of today. But this is only in retrospect. The possibilities in the moment for how a new tool will go are always plentiful and many pathways are unexplored. There are other options and always have been.

2. We can build viable, real alternatives.

Based on the above, designers in the early 21st century started trying to design around alternatives. This is what critical design is all about. Instead of starting with baseline assumptions about efficiency, the harnessing of human labor, or the least input for the most profit, we can start with other assumptions. Like centering community, conversation, the complications of decision-making, radical data privacy and ownership, or even love.

My colleagues Phoebe Sengers, Carl DiSalvo, Shaowen Bardzell, Lilly Irani, Paul Dourish, and others are great examples of people who start from alternative premises and build more responsible, thoughtful systems. There are many possibilities to optimize for. We don't have to accept the bland versions presented to us.

3. Groups of people working together can and do change the path technologies take.

There are so many technological paths not taken. And all it takes are interested, united groups of people to explore those paths and bring those ideas forward. Working together we can identify the problems that face our communities, and organize and get creative and innovative about how to build collectively to face those problems. This is where innovation really comes from, and it supports meaningful group work besides.

Building our own systems, rallying together to get creative and inventive, is where the fun begins. Despite all those "great inventor" stories, we know by now that individuals don't make change all by themselves. Groups do.

4. We can and should choose better human and technical bedfellows.

When we bring new tools and technologies into our world, we build our social lives and connections around those new systems. So we'd better be selective about what we choose. Like choosing a partner, we can be intentional about which systems we allow into our lives and which we say no to. It's not Luddism. It's within our rights to make those choices. Especially when so many of the systems out there aim to take advantage of us, mine us for information, and build products and services on our backs.

Theorist Donna Haraway talks about how we get to choose our "kin," to knit intentional networks together of people, animals, and machines. Personally, I think hard about which kinds of systems I want to welcome into the intimate space of my home, my pocket, my bedside table. I want to be sure that the devices that live with me respect my data privacy, and enter into a local ecology of systems that aren't sucking my data out of me and demanding my attention--they've got my back instead.

Do you choose as intentionally? Or are you simply accepting the Big Tech companies' bedfellows? Because their investors don't actually care about you--they care about how quickly they'll see a return on investment. Unfortunately those sorts of premises never have your best interests at heart, and don't result in responsible machinery.

5. Well-thought-through tech brings us together; it doesn't tear us apart.

Working with NASA teams or open source projects has taught me that it's possible to unite around technologies. When we build or maintain technologies, we care for them, and for each other in the process. Like tending to a garden, or teaching children, when we invest our time and energy in something that grows and sustains us, that care is important. It invests what we do with meaning. And it makes us feel connected, to the things we are caring for and to each other.

It's hard to imagine a world of technological care when we live in a world of devices we upgrade and toss in the trash every two years, or algorithmic systems that prey on our preferences to divide us from our neighbors, or machinery that's always threatening to put us out of a job.

But if you look around, and look outside of Big Tech, you'll see people investing their time and energy in technical work that brings us together. Linux. Wikipedia. Space exploration. Mesh networking. Community STEAM programs or solar projects. And so on.

That's also what the opt out project is about. Working together, in community, to adopt better systems into our social worlds and to make change.

Don't stop believing

To have better impacts of technology on society, then, these are some of the inputs we need: imagining different possibilities, building around different premises, working together to make change, choosing our bedfellows wisely, and embracing care.

All this leaves me--not depressed. More like embracing a world of possibilities. I'm hopeful--maybe not optimistic, but energized. Is there still a fight ahead? You bet. And we will have to fight on many fronts to make change. Fortunately, privacy isn't just for geeks anymore. The movement is growing. I haven't changed my tune in a decade, but many, many more are finding the path now.

I find inspiration and energy in the work of my colleagues, like the people at Data & Society, AI Now, DAIR, the Knight Centers, critical designers, and the rising FAccT (Fairness, Accountability, and Transparency in algorithms) community. I'm inspired by the anti-racist work of my friends like Ruha Benjamin in sociology, or my students in HCI. The innovative tools coming out of European organizations under the GDPR, open source communities, projects like SailfishOS or Tor, the EFF, companies like DuckDuckGo, Mozilla, ProtonMail… the list goes on.

There is a lot left to do. Is this accessible for everyone? Not yet. Should it be? Hell yes. The more people who join in and contribute what they can, the better it gets for everyone.

Certainly when it comes to surveillance capitalism I am mad as hell and I’m not gonna take it anymore.

But that’s just it —we don’t have to take it anymore. There are myriad ways to push back, to resist, to reclaim, to build and live differently.

My zany little technical world, in which my data is MINE, makes me happy. As does the dream that by working together we can sweep all this BS away and make it better for so many more people.

That's why I've embraced opting out as one option--of many--that upholds my technical virtues, and builds a better future for us all at the same time.
