
After Doomscroll: A Conversation with Chelsea Manning

Janus Rose

Trevor Paglen, A War Without Soldiers (Corpus: Eye Machine), Adversarially Evolved Hallucination, 2017. Courtesy of the artist, Altman Siegel, San Francisco, and Pace Gallery, New York.

Issue #146
June 2024

“Doomscrolling” is easily the single most poignant example of internet vernacular to emerge in the past decade. The term is both verb and vibe, and to most people it requires no explanation—a cheeky shorthand for the devolution of digital communications into repetitive patterns of consumption and disappointment.

While it first appeared online in 2018, it’s not particularly shocking that “doomscrolling” really took off in 2020, in the early stages of the Covid-19 pandemic.1 It was a mindset born of the lockdown, Silicon Valley excess, and the exacerbated alienation of late capitalism. In the intervening years, it has only become more evident that the internet as a whole is approaching a crisis point.

Now it’s 2024. Digital news outlets that once provided a counter to mainstream media narratives are collapsing, and the much-hyped proliferation of image- and text-synthesizing “AI” systems has made the creation of viral hoaxes and propaganda trivial.2 Even the tools we’ve come to rely on as essential for finding information, like search engines, seem to be collapsing under the weight of mass-generated “content”—the dystopian Silicon Valley term for anything on the internet that can be used to capture human attention for the purpose of generating ad revenue.3

As a journalist who has been writing about these topics online for more than a decade, I’ve long felt like a passenger in a slow-motion plane crash, bearing witness to my profession’s inevitable destruction. Along with hundreds of others, I was recently laid off from my job at Vice, the Canadian counterculture mag whose cartoonish mismanagement resulted in the company abandoning its award-winning newsroom and pivoting to … well, nothing.4 The consensus amongst my colleagues is that we are all facing a kind of vocational extinction: the owners of the platforms we rely on have pulled out of digital journalism entirely, seemingly deciding that they would rather have a glorified Excel spreadsheet fart out “content” than pay human beings to synthesize knowledge, music, or art.

This creates new twists on some extremely annoying existential questions. Prime among them: How will we find and share knowledge in an online information environment that is fundamentally antithetical to anything that won’t inflate shareholder value?

Joy Buolamwini, Coded Bias, 2020. Joy Buolamwini places a white mask over her face so that a facial recognition program can “see” her. Courtesy of 7th Empire Media.

With this in mind I turn to Chelsea Manning, my friend and sometimes collaborator who is probably best known for turning information politics on its head in 2010, with her release of classified documents showing evidence of US war crimes in Iraq and Afghanistan.5

Chelsea is a machine-learning expert and former US Army intelligence analyst, so she’s one of the few people I trust who has firsthand experience dealing with such dilemmas; after all, she literally went to prison for it. We talked about how the internet got into this mess, Silicon Valley’s current obsession with “AI,” and how looking at the early internet can help us build new ways to publish, share, and verify knowledge without the algorithmically filtered feeds of captive social media platforms.


Janus Rose: The internet really kind of sucks now. I don’t think anyone still expects it to be some kind of utopia like they did in the nineties, but it feels like even its basic functions of information search and discovery are now failing on a massive scale. What happened?

Chelsea Manning: I think it first really accelerated in the 2000s with venture capital and the advent of Google, Facebook/Meta, and smartphones. I once had a more utopian impression of what the internet could do. Being connected with friends and family, not having to depend on a mainstream publication or TV network—it felt like anyone could be a blogger or an independent journalist and you could just post the info you wanted to have out there. And there was this idea that corporations and states couldn’t do anything about it.

But once up-and-coming platforms like Google and Facebook became the incumbent powers, they were like: okay, we accomplished this and now we need to hold on to this power. So now I think there’s been a concerted effort to make the internet as unusable as possible. Now that they’re in a position where they have shareholders, they have to constantly prove that they have a new way of extracting what little wealth is left among the middle class. They have to make sure there’s always another thing they can make as monetizable as possible. That’s really the start of it.

JR: It feels like the elephant in the room is so-called “generative AI.” I knew things like ChatGPT would eventually start flooding the internet with crap, but it’s shocking how fast they’re being adopted anyway. Researchers point out the flaws and they’re ignored,6 and now there are companies devoted to nothing but milking ad revenue with fake articles and mass-generated garbage. And the worst part is this cult of AI-pilled tech-hype bros who are trying to convince everyone this is “The Future.”

CM: Cory Doctorow calls this the “enshittification” of the internet.7 Platforms are becoming increasingly unusable because of the profit motive and the incentive models of engagement.

I was working with natural language processing systems before this, but the kind we interact with now [Large Language Models, or LLMs] only started to come into being in 2020. Then they slapped on this user interface that allowed an average user to interact with LLMs for the first time. It quickly became apparent that normal people were going to be wowed by gobbledygook and rapid regurgitations of information that was already readily available. Now that you have these pre-trained [AI] models, you’re able to do this at a much lower computing cost than what was required to train them in the first place.
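Her point about cost is easier to see with a back-of-envelope calculation. The Python sketch below uses the common rule-of-thumb estimates that training a transformer costs roughly 6 × parameters × training tokens in floating-point operations, while generating a single token from an already-trained model costs roughly 2 × parameters; the model size and token counts are illustrative assumptions, not figures from the conversation.

    # Rough comparison of training vs. inference compute for a pre-trained
    # language model, using the common approximations:
    #   training FLOPs  ~= 6 * parameters * training_tokens
    #   inference FLOPs ~= 2 * parameters * generated_tokens
    # All numbers below are illustrative assumptions, not real model figures.

    params = 70e9             # hypothetical 70-billion-parameter model
    training_tokens = 1.4e12  # hypothetical 1.4-trillion-token training corpus
    reply_tokens = 500        # one chatbot reply of roughly 500 tokens

    training_flops = 6 * params * training_tokens
    reply_flops = 2 * params * reply_tokens

    print(f"training the model:  {training_flops:.2e} FLOPs")
    print(f"one generated reply: {reply_flops:.2e} FLOPs")
    print(f"training / reply:    {training_flops / reply_flops:.1e}x")

The exact constants matter less than the gap of many orders of magnitude: once the expensive training run exists, churning out “content” from it is comparatively cheap, which is the economics that makes mass-generated garbage viable.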

JR: What scares me is that while all this is happening, all the people whose job it is to sort signal from noise are losing their jobs. Digital journalism is rapidly collapsing, and I feel like most people aren’t really fully grasping the repercussions of that.

CM: I don’t think digital journalism is collapsing, I think it’s being killed. There’s been a deliberate and concerted effort to do exactly that since the George Floyd protests in 2020. There was a realization [by those in power] that the flow of information was very different in this environment. It wasn’t about where you were geographically, but the communities online you interacted with, and what profiles these online platforms had identified you as having.

JR: All these sites like Buzzfeed News and Vice weren’t perfect, but they at least provided some alternative to legacy media. As someone who’s worked in journalism for years, I’m terrified of what happens when the only thing left is places like the Wall Street Journal and the New York Times, where journalism is just this class of obedient stenographers gathering little info nuggets from out-of-touch elites. You only get “official sources,” and propaganda becomes trivially easy.

CM: Well, that’s the way it was before. These Silicon Valley tech billionaires have started to realize that the mass-mediated propaganda model of the twentieth century was pretty effective at keeping things in check. In the nineties, people were usually talking about one or two things around the water cooler, and pop culture was about the same. Now we’re so separated and split up that we don’t have this single mass-mediated culture anymore. We have these commodified, cellular microcultures which are turned into products and mechanisms of online engagement. I think big companies have recognized that there’s a value in controlling that. Having up-and-coming independent journalists who aren’t backed by venture capital is a threat to the established institutions.

Trevor Paglen, Large Hangars and Fuel Storage; Tonopah Test Range, NV; Distance approx. 18 miles; 10:44 am, 2005. Courtesy of the artist.

JR: I just realized the institutions that want to return to this old media model you’re talking about are the same institutions that ignored you back in 2010, when you were trying to share evidence of US war crimes. There are more ways to publish information now, but they all have to be filtered through big tech and social media and algorithms. I think about what’s happening now in Gaza, where you have essentially this 24/7 livestream of genocide and human suffering that everyone can see, but the people in power just dismiss and gaslight, and the legacy media outlets help them do it.

CM: This is why the online media platforms started backing certain creators. It’s one of the reasons “content creator” has emerged as a class in the past decade or so. As a content creator you gain sponsorship deals and benefits, but you have to stay within certain lines. Platforms have recognized that having control and gatekeeping authority enables them to have a wide variety of content, but within certain parameters.

JR: Right, but we still see examples of counternarratives breaking through. All these college kids starting encampments for Palestine clearly have a degree of media awareness where they understand how to navigate online censorship and gatekeeping. I’m wondering how the powers that be will respond to try and regain control.

CM: We’ve already seen what their response is going to be. They try to dismiss and discredit, but they also channel the information that they want to proliferate. After the Russian invasion of Ukraine, there was a lot of misinformation and disinformation flowing, but a lot of that was allowed to flow because it was on the side of the incumbent [Western] powers.

It’s the channeling of propaganda as opposed to the creation of it. In the hierarchical era of mass media, you would create a narrative and it would trickle out and it was centrally controlled. Now you have a bottom-up flow of information, but you’re able to pick and choose and channel the information that you want and don’t want to spread. So the incumbent powers and institutions are learning how to make this work in ways that benefit them, and in the process they commodify that and turn it into a product.

JR: All the journalists I know want to keep reporting and writing, but there’s no place left to do it. Publishing doesn’t pay well anymore unless you become one of these cult-celebrity Substackers and podcasters who play to their audience with reactionary clickbait—the Joe Rogans and Ben Shapiros of the world. I feel like we need a return to local, small-scale journalism, but what does that even look like?

CM: I think it’s important to remember that the mass media didn’t come from nothing. It came from the industrialization of information. The mass production of information is how the idea of the “journalist” formed as a profession. Right now I think we’re seeing the reverse of that, a return to the moment when a pamphlet was the most effective means of advancing an idea. I think we’re largely returning to pamphleteering as a mode of information distribution.

Jonathan Harris, We Feel Fine, an exploration of human emotion through large-scale blog analysis, 2006. Courtesy of the artist.

JR: I like the idea of small online publishers as pamphleteers, but the ground-truth reporting still needs to happen somewhere. It drives me crazy when people say, “Oh it’s okay, I get all my news from TikTok anyway.” Like, where do you think that information comes from? Usually when you see a video of a Gen-Z’er talking about news events they’re literally pointing behind them at a green-screened article from a news outlet that is on the verge of not existing. Does it make sense to build tech that can fix this pipeline?

CM: I think tech is part of the problem here. Verification of information is really important to the kinds of work that I do, and one of the problems I’ve seen is that the original source of this information goes through so many iterations that it doesn’t resemble the original info anymore. A TikTok, citing a tweet thread, citing an article, citing an academic paper, etc. The number of layers between the producer and the recipient allows a lot of error to occur. In information theory there is a focus on avoiding as much noise as possible. As information passes through more and more nodes in the network, the noise builds. That noise-to-signal ratio is becoming untenable.
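Her information-theory point can be made concrete with a toy simulation. The sketch below, my own illustration rather than anything discussed in the conversation, treats each re-sharing step as a noisy channel that independently corrupts each bit of a message with a small probability, then measures how much of the original survives after several hops.

    import random

    def relay(bits, error_rate):
        # Pass the message through one noisy hop: each bit flips with probability error_rate.
        return [b ^ 1 if random.random() < error_rate else b for b in bits]

    random.seed(0)
    original = [random.randint(0, 1) for _ in range(10_000)]  # the "source" message
    error_rate = 0.05  # assumed 5 percent distortion per re-telling

    message = original
    for hop in range(1, 7):  # e.g., paper -> article -> tweet thread -> TikTok -> ...
        message = relay(message, error_rate)
        intact = sum(o == m for o, m in zip(original, message)) / len(original)
        print(f"after hop {hop}: {intact:.1%} of the original message intact")

Even with a modest per-hop error rate, fidelity decays with every relay, which is the sense in which the noise “builds” as the chain of citation gets longer.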

JR: Lately I’ve been thinking a lot about all the different things the web could have been. We had homepages and BBS and all these different ideas about how to publish and discover information. And then search engines and Google came along and dominated the entire ecosystem. That’s a big part of how we got here, I think, where knowledge conforms to some central algorithm instead of the other way around. We badly need a new way of organizing knowledge, and I wonder if it makes sense to go back and revisit some of these ideas that were thrown by the wayside?

CM: Yes. I have very specifically looked at the pre–search engine internet as a potential model. There’s no reason you can’t just put up a website anymore. The thing is that we need to be able to verify that information at the source, and make sure that the noise is tuned down as much as possible. I think there are technical ways you can accomplish that, but there hasn’t been investment because it’s threatening the incumbent actors who are heavily invested in the extraction that’s ongoing.
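One technical approach to “verifying at the source” that fits what she describes is cryptographic signing at the point of publication, so that anyone downstream can check that a copy of a document still matches what the original source released. The following is a minimal sketch of that general idea using the widely used Python cryptography library; it is my own illustration, not a description of any specific system Manning has in mind.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The publisher signs the exact bytes of a document at the source...
    publisher_key = Ed25519PrivateKey.generate()
    article = b"Text of the report, exactly as originally published."
    signature = publisher_key.sign(article)

    # ...and distributes the public key so anyone can check a copy later,
    # no matter how many platforms it has passed through.
    public_key = publisher_key.public_key()

    def is_authentic(copy: bytes) -> bool:
        # True only if the copy is byte-for-byte what the publisher signed.
        try:
            public_key.verify(signature, copy)
            return True
        except InvalidSignature:
            return False

    print(is_authentic(article))                                 # True
    print(is_authentic(b"Text of the report, subtly altered."))  # False

Because the signature travels with the document rather than with a platform, verification does not depend on trusting whichever feed the copy happened to arrive through.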

JR: Decentralization feels like an essential element of whatever comes next. These monopolistic big-tech models that revolve around ad revenue are just not sustainable. Subscription models and newsletters seem to be gaining some traction, and I do think a lot of people are willing to pay for quality alternative news. But they all live on these little islands that aren’t connected, and people always have to come back to these corporate-owned social media platforms to share content and build their audience.

CM: We have been deliberately and methodically trained to interact with these social platforms.

JR: By “trained” you’re talking about how we’ve been conditioned by these platforms that quantify social interactions and make people associate quality information with what “gets numbers”?

CM: Yes. Somehow we have to counteract that. I think a counterculture to this culture will develop, but it will take time and we’re only in the early stages of this neo-Luddite rejection of social media platforms as being the way of sharing verifiable, important information.

The other aspect is that there are technical means that haven’t been tried that might address this. There are tools and mechanisms and well-written papers on these concepts, but nobody has invested the time and resources to put them together and try to make them work.

JR: Yeah, people have been talking about building some kind of alternative internet for a long time, but it always comes back to “who is going to build this and how are they going to get funding.”

CM: Right, but the problem I keep seeing is that whenever people talk about an alternative internet, they just come up with more platforms. The alternative to Twitter after it changed and became X was Threads or Bluesky, but those are just other platforms. They look almost identical, and they have many of the same problems, even though they might be managed differently.

I think the paradigm that we’re stuck in is that we’ve only conceptualized the internet through platforms that already exist. We aren’t trying something that might not look like Instagram, that might not look like TikTok or Twitter.

JR: I feel like a lot of the so-called “decentralization” movement has also been co-opted by NFT hustlers and cryptocoin scammers. And often these are the same people who are now pushing the Generative AI trend, which is the new flavor of the week for making a quick buck. How do we get beyond that and build common infrastructure that benefits everyone instead of creating more tech grift?

CM: The so-called Web3 movement is what we’re talking about here, right?

JR: Yeah, exactly.

CM: The underlying technology is fine, and in some instances there are some unproven use cases for the tech. The problem you’ve identified is cultural, and it’s because the incentive structure of Silicon Valley and cryptocurrency is one of acceleration and extraction. You have to be constantly scaling up, because that’s what’s expected by venture capital and shareholders. And that’s why you won’t be able to change anything with that. The decentralization movement actually had some really good people, but they left without getting a chance to do anything because it became grift so fast.

JR: I remember how a lot of people made these revolutionary claims about Bitcoin and cryptocurrency …

CM: Yeah, but cryptocurrency is still a capital asset. It uses really good math, but only to recreate the existing system. And one of the reasons for that is being able to avoid regulation and the SEC.

JR: If journalism survives, it’s not going to be on a blockchain. I’m extremely confident of this.

CM: Whatever we build, it can’t just be decentralized. Things can be decentralized and still be co-opted by incumbent powers. It has to be built from distributed tools that are low cost, accessible, verifiable, and require very little power.

JR: I feel like this is a return to the underground. You mentioned pamphlets before. Sometimes I see these political education Instagram posts that feel like the digital equivalent of pamphlets or zines. The question is where do you place them so people will actually find them, if not on social media? Like if we built a decentralized digital zine distribution network, is that something people would adopt?

CM: I don’t think that’s the right question. I think we should focus on what works. People didn’t flock to the internet immediately. You need to build something that people will want to use. It’s lame to say, “If you build it they will come,” but I think you have to experiment and let it play out. You can’t keep doing what’s already been done, and it can’t keep looking the same.

Notes
1. See .

2. Chloe Xiang, “Verified Twitter Accounts Spread AI-Generated Hoax of Pentagon Explosion,” Vice, May 22, 2023.

3. Jason Koebler, “Google Search Really Has Gotten Worse, Researchers Find,” 404 Media, January 16, 2024.

4. Katie Way, “VICE Ran Out of Money (For Everyone Except Its Executives),” Hell Gate, July 11, 2023; Max Rivlin-Nadler, “VICE Management Kills Vice.Com, Set to Lay Off Hundreds of Employees,” Hell Gate, February 22, 2024.

5. A story she tells in her own words in Chelsea Manning, README.txt (Farrar, Straus and Giroux, 2022).

6. Chloe Xiang, “AI Isn’t Artificial or Intelligent,” Motherboard, December 2, 2022.

7. Cory Doctorow, “The ‘Enshittification’ of TikTok,” Wired, January 23, 2023.


Janus Rose is a writer, editor, and journalist whose work explores the impacts of information technology on resistance movements and marginalized communities. She was previously a senior editor at VICE Motherboard, and her work has appeared in the New Yorker, Dazed Magazine, The Intercept, and the Village Voice.
