A glitchy wind turbine and a flowering plant in a pot

Perspectives: Curlews

What follows is a synthesis of thoughts and discussion related to Curlews by Cecilia Ananías Soto.

These largely come from a series of discussion sessions at RightsCon 2025 in Taipei, Taiwan. They are intended as fodder, inspiration, and incitement for creators to dig deeper into surveillance technologies and consider these intersections in their own forthcoming works.

The individual complicity of those who build harmful tech or perpetuate its propaganda

It was cathartic to read the clear dissonance between Ava’s perception of her work and the confrontational reality of its impact on Amaia. In particular, the hollowness of the propaganda Ava recites makes it abundantly clear that the popular narratives around the app she builds are misaligned with reality.

“Well, I’m just doing my job, going to work every day; it’s not my problem.” - RightsCon participant who works in Silicon Valley, on how most programmers there think about the impacts of their code

By removing the ambivalence around programmers creating these technologies, Soto rightly positions the creators of the technology as responsible alongside the cops and other bad actors who surveil and harm. But at any moment, a person can break free of their complicities, change their mind, and take accountable action. An increasing number of tech workers do recognize their complicity and organize within their workplaces or become whistleblowers, often with disastrous personal consequences, as mirrored by the programmer who kills a cop and goes on the run. We welcome this reflection of reality, as well as stories that cast whistleblowing and labor organizing in a new and more honored light.

“We are all webs of complicities—the absolutism of good and evil does disservice to many of us, in service to some of us.” - RightsCon Participant


Community complicity, isolationism, and the theft of time

The first thing this story prompted was a brainstorm of “neighborhood watch” type apps and systems that encourage people to surveil and report on their neighbors instead of getting to know them: Amazon’s Neighbors app, Nextdoor, Crimestoppers, Vecino Vigilante, VenApp. We also discussed the extraordinary prevalence of people informing on their neighbors in communist East Germany. The story sparked all of this because Soto does a lot of skilled work very quickly in highlighting Ava’s individualism and isolationism: it’s clear she has been told a bunch of stories that make her afraid to go outside or speak to anyone.

“Safety is how much you trust your neighbour. Surveillance isn’t great for trust.” - RightsCon participant

Ava’s isolationism felt familiar. It seemed symptomatic not only of fear-mongering propaganda, but also of the theft of time. Many of us who work against surveillance see how the theft of the time needed to make human connection in a hypercapitalist, low-wage society is a direct pipeline to surveillance and fear. People feel that they no longer have the time to get to know their neighbors, and so are more easily seduced into fearing them via maliciously convenient apps that give them a hollow illusion of connection and belonging.

There is also an element here of “evil as entertainment” and the glamorization of surveillance that, while this story is set in the future, has a lot of resonance with the world today. In this story, an app with ostensibly fun Tinder-like features is used to normalize and achieve mass rape. It reminded us of how doorbell camera footage today is shared not only to encourage fear, but also positioned as cute entertainment. It is actually quite dark that recordings of delivery people, a whole class of low-wage workers, go viral without their consent when they, for example, express joy at being given food or water. And whether for fear or for fun, the ultimate result of sharing doorbell camera footage is to normalize mass surveillance and replace human connection with online content.


Convenience tropes and diffusing blame

One of surveillance technology’s main lures is convenience and security: AI will detect when a person who could be dangerous walks by, the computer relieves you of the labor of making an informed decision, your phone will “just know” when it’s time for you to leave for yoga class or when you’re out of face wash. The proposition is that you will save time and be able to hustle harder if you just put cameras and AI in every corner of your life and let them handle the stuff of daily living for you.

But who is this promised convenience for? Certainly not the people a biased AI predicts to be more criminal, thanks to the decades of racist policing data it was trained on. Certainly not the activists who have government profiles built on them from app or CCTV surveillance. Certainly not abuse victims trying to stay anonymous and survive. Our shared frustration with Ava came down to something like this: the convenience of surveillance tech for the most privileged results not in the elimination of inconvenience, but in the export of inconvenience to marginalized communities, an inconvenience that increasingly escalates into harm.

That pipeline from privileged inconvenience to marginalized harm is buried underneath miles of marketing language. In turn, such language obscures the interconnectedness of these systems and diffuses the blame for those who choose to use them.


Bad, biased AI systems are becoming normal AI systems—but you can’t blame the computer

It seems like every day there’s another headline about an AI model that has said or suggested or done something terrible, to the point that these occurrences are becoming normalized and are no longer headline-worthy. In reality, AI is being relied on too heavily without a proven track record of safety or reliability. So it is in this story, with the forced-birth app’s automated monitoring and enforcement of rape during ovulation. While dubiously effective at its purported goal of making more babies, the app is a powerful surveillance and compliance machine for making more rapes.

It’s easy to see in this story how AI-powered surveillance myopically enforces the rules it is told to follow, to the exclusion of all nuance, even as it is championed as top-of-the-line technology with its Tinder-like swipe function. 

The narrative Ava believes about her work on the app, that it is patriotic work that is saving her country, is a classic example of how authorities and corporations respond when confronted with the harms of surveillance technologies: first they deny. Then, when they can no longer deny, they point to the computer and blame the computer.

That’s like pointing to a building that fell on people and saying it was the building’s fault, not the fault of the people who built it. It can be similarly harmful to publish stories on surveillance technologies that perpetuate and normalize the narratives that programmers, corporations, and authoritarian regimes hide behind in order to evade accountability.


Can we get something better than musty old USB drives and crumbling tomes to dodge speech and knowledge suppression, please?

“I want to see how technology can help us and reinspire free expression for good, how technologists can look to the future and continue in it.” - RightsCon Participant

This is a story where basic healthcare information is antithetical to the goals of the state and is passed only in secret, through old technologies like USB drives or by word of mouth. Person-to-person transmission is a powerful thread of connection, trust, and hope. Yet one thing that would be inspiring to technologists is more visions of new technologies and ways of operating and living that reinforce access to knowledge, even in the face of fascist oppression.

Star Trek didn’t have all the answers for how the communicators that inspired the iPhone worked back in the day, and you don’t have to have all the answers for how a technology works in order to create an inspiring vision of culture, sharing, and connection around it. Stories of future technologies can be as much about what they give to people as about how they function. Sometimes what’s better than a technical breakdown is a shared dream, something we can point to and say, “we want that.”

In our discussion at RightsCon, we asked for more anti-Black Mirror Black Mirror stories that flip the disaster porn of fear-mongering and hopelessness around technology. We need to evolve our thinking past the point of “Thanks, I know, this is my terrible world and there’s nothing I can do!” That is one of the most powerful gifts storytellers can give to our movement: wildly popular examples of hope and shared visions of what we can demand instead.


We loved that this story chooses…

  • TO dig right into how networked technologies, including surveillance, AI, and period-tracking apps, can supercharge the misogyny of the world, and to show that complicity with these technologies is a choice we make.

  • TO NOT pit women or oppressed people against each other. Despite the radical differences among the three characters in this story, it’s a tale of conspiracy between them. Iraida makes room for the bodily autonomy of Amaia, who in turn makes room for the awakening and action of Ava.

  • TO include climate change in the background, to recognize it as a reality, but not use it as an excuse for oppression.

  • TO take cheap shots at trusting AI with our lives, as with the mushroom poisoning story.

  • TO connect a science fiction narrative to the reality of today. This story carries the Chilean dictatorship and other historical context forward into the future. Tracing these trends is a powerful warning.

Content License: Creative Commons Attribution-ShareAlike 4.0 International license

