“Attention,” a voice began to call, and it was as though an oboe had suddenly become articulate. “Attention,” it repeated in the same high, nasal monotone. “Attention.”
So begins Aldous Huxley’s Island, first published in 1962, as the shipwrecked journalist Will Farnaby wakes beneath the spreading trees of Pala, a closed and mysterious tropical island. He has come to this place—his wrecking is later revealed to have been deliberate—in order to persuade the ruling Rani to grant the local mineral rights to an unpleasant partnership consisting of a regional strongman and an international oil company. But the troubled and cynical Farnaby is gradually won over by the Palanese combination of kindness, thoughtfulness, and pacifism, which is embodied in the sounds he hears at the moment of his arrival.
“Attention, Attention. Here and now, boys. Attention.” These words, which echo constantly around the island and through the text, are spoken not by human voices, but by flocks of local mynah birds, trained and set free by the islanders. Because “that’s what you always forget, isn’t it?” says little Mary Sarojini MacPhail. “You forget to pay attention to what’s happening. And that’s the same as not being here and now.”1
On December 14, 1735, the English landowner and naturalist Robert Marsham heard a thrush sing. Song thrushes sing to establish territory, and begin doing so any time from late autumn through into the new year. Marsham wondered whether there was a connection between when the first thrush sang and other conditions of the natural world. And so, like the Palanese, he paid attention.
The following year Marsham began keeping records at his estate in east Norfolk of the first signs of spring. He began with the first swallow, which arrived that year on the 101st day, April 10 (1736 was a leap year). Two years later he was travelling around Europe, and made note of the first swallow in Piacenza, Italy (79, or March 20) as well as the first hawthorn blossom in Nîmes, France (104, April 14). In 1739, back home in England, he expanded his interest to the appearance of the cuckoo (120) and nightingale (126), the first leaves of the sycamore (65), and the sprouting of the plant most crucial to the Norfolk farmer, the turnip (63). Twenty years later he was still at it, observing and carefully noting down snowdrop flowers; oak, birch, chestnut and hornbeam leaves; the arrival of the migrating nightjars and the first young rooks.2
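Marsham’s convention of recording each event as a count of days from the start of the year is what makes his record comparable across seasons and centuries. A minimal sketch in Python shows the conversion, using the dates quoted above (the labels and the small dictionary are mine, for illustration only):

```python
from datetime import date

def day_of_year(d: date) -> int:
    """Return the ordinal day of the year (January 1 = 1)."""
    return d.timetuple().tm_yday

# A few of the "indications of spring" mentioned above, expressed both as
# calendar dates and as days elapsed since the start of the year.
observations = {
    "first swallow, Norfolk":        date(1736, 4, 10),
    "first swallow, Piacenza":       date(1738, 3, 20),
    "first hawthorn blossom, Nîmes": date(1738, 4, 14),
}

for event, d in observations.items():
    print(f"{event}: {d.isoformat()} = day {day_of_year(d)}")

# 1736 was a leap year, so April 10 falls on day 101; in the common year 1738,
# March 20 is day 79 and April 14 is day 104.
```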
Marsham died in 1797, but the following year’s records were inscribed in the hand of his son—also Robert—who continued keeping notes until 1810. After a hiatus of some fifteen years, the next Robert in line picked up the baton, and the Marsham record continued, nearly unbroken, until well into the twentieth century, and the death of Mary Marsham, the first Robert’s great-great-great-granddaughter, in 1958.
Today, Marsham is recognized as the founder of a discipline now known as “phenology,” from the Greek φαίνω, meaning to show, appear, or bring to light. It is concerned with the appearance of things not in the visual, aesthetic sense but in the temporal one: not how things appear, but when they do. Phenology thus depends on paying attention, over time, to the here and now.
Between 1850 and 1950, the Marsham records show a pair of connected trends. One is a slow but observable increase in mean temperatures over the course of the century, particularly in the winter months. The other shows in the behavior of plants and animals: oak leaves, for example, appear a little earlier every year. But as the authors of one paper examining the records note, “How the earlier leafing, flowering, and arrival of animals will affect our perception of spring is difficult to gauge. The slow rate of change and annual variation will probably mean that the changes will go unnoticed by a single human generation.”3 This is not a problem limited to humans, or even to generations.
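How annual variation hides a slow drift is easy to demonstrate with a toy calculation. The sketch below is illustrative only: the figures are invented, not the Marshams’ data, but a drift of a tenth of a day per year, buried under ten days or so of year-to-year noise, only becomes legible across a century of records.

```python
import random

# Illustrative only: synthetic "first oak leafing" dates (as day of year),
# drifting earlier by 0.1 days per year beneath roughly +/- 10 days of annual noise.
random.seed(1736)
years = list(range(1850, 1951))
leafing = [110 - 0.1 * (y - 1850) + random.gauss(0, 10) for y in years]

# Ordinary least-squares slope: days of advance per year across the whole record.
n = len(years)
mean_y, mean_l = sum(years) / n, sum(leafing) / n
slope = (sum((y - mean_y) * (l - mean_l) for y, l in zip(years, leafing))
         / sum((y - mean_y) ** 2 for y in years))

print(f"century-long trend: {slope:.2f} days per year (~{slope * 100:.0f} days per century)")
print("typical year-to-year swing: around 10 days")
# The trend over a lifetime is smaller than a single year's variation,
# which is why an individual observer never quite sees it happening.
```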
Throughout the long Arctic winter, caribou around Kangerlussuaq in western Greenland stick close to the sea’s edge, grazing on lichen scraped from the rocks beneath the snow. In late May and early June, when spring arrives and the ice begins to melt, they begin to move east, heading inland to calving grounds they’ve visited annually for over 3,000 years. Studies first undertaken in the 1970s by Danish researchers revealed what scientists call a high degree of trophic match: the caribou’s migration was timed to coincide with the springtime flush of vegetation, and there was plenty to eat for both new parents and their hungry offspring. The caribou flourished.
Since the 1990s, another set of researchers has been observing the caribou of Kangerlussuaq, and they have seen this population start to fall, hampered by a combination of falling birth rates and rising calf mortality, both symptomatic of problems in the food chain. Following the herd inland, the researchers found that the plants the caribou typically grazed on were budding earlier in the year, just like Marsham’s oaks, and had already started to lose much of their nutritional value by the time the caribou arrived. By comparing the creeping schedule of plant growth with satellite observations of Arctic sea ice, the researchers connect the warming of the oceans with the earlier spring greening, and demonstrate not merely a growing trophic mismatch, but a phenological one.4
Phenological mismatches occur when periodic plant and animal life cycle events fall out of sync with one another. Previously synchronous events, such as the caribou’s calving period and the most vigorous greening of the Greenlandic tundra, are slipping apart, because the participants are adapting to changing environmental conditions at different rates. While the plants respond to changes in temperature, and are thus budding earlier in the year—as much as 25 days earlier than a decade ago—the caribou time their migration by the length of days, which are getting warmer, but not longer.5 As a result, even as the indications of spring have shifted ever earlier, the timing of the caribou’s spring movement has not changed, and by the time they arrive inland there is less, and lower-quality, food to go around.
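The asymmetry of cues is the whole mechanism, and it can be stated almost as a formula. A toy sketch, with invented numbers rather than the Kangerlussuaq measurements, makes the logic explicit: one event tracks temperature, the other tracks day length, and the gap between them grows with every degree of warming.

```python
# Toy model of the mismatch. All figures are illustrative assumptions,
# not measurements from the Kangerlussuaq studies.

def green_up_day(warming_c: float, baseline_day: int = 160,
                 days_per_degree: float = 8.0) -> float:
    """Day of peak forage quality: temperature-cued, so it advances as the Arctic warms."""
    return baseline_day - days_per_degree * warming_c

CALVING_DAY = 160  # photoperiod-cued: the herd arrives on the same day regardless of warming

for warming in (0.0, 1.0, 2.0, 3.0):
    mismatch = CALVING_DAY - green_up_day(warming)
    print(f"+{warming:.0f} °C: calves arrive {mismatch:.0f} days after peak forage")
```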
Other species are subject to phenological shifts, but because they respond to different cues than the caribou, they are more capable of weathering the change. Arthropod populations, for example, are responsive to the timing and duration of temperatures warm enough to allow for foraging and growth: the warming Arctic and longer summers mean that northerly spider species are doing very well indeed. The same goes for yellow-bellied marmots in the Colorado Rockies: because they hibernate rather than migrate, longer growing seasons mean they’re growing fatter, reproducing faster, and surviving winter in greater numbers.6
Humans respond to such changing conditions too. Already, millions are being driven off the growing and grazing lands they’ve used for thousands of years. Climate change has been repeatedly asserted as one of the key drivers of conflict in the Middle East—a claim that remains contentious, perhaps only because, like the Marshams, we are mostly capable of seeing only present indications, rather than historical patterns.7 According to the World Food Program, half of Central American migrants ascribe their movement to food insecurity, the result of increasing drought and warmer temperatures.8 And the phenological mismatch that climate migrants encounter is not limited to shifts in the weather: moving northwards and westwards, it is political shifts, in the form of anti-migrant rhetoric and hardening border walls, that they run up against.
Over the last fifty years, we have come to recognize that the fuel of our civilizational expansion has become the main driver of our extinction, and that of many of the species we share the planet with. We are now coming to realize that this is as true of our cognitive infrastructure. Something is out of sync, and it is felt everywhere: something amiss in the temporal order, as related to political and technological shifts, and to shifts in our own cognition and attention, as it is to climatic ones. To think clearly in such times requires an intersectional understanding of time itself, a way of thinking that escapes the cognitive traps, ancient and modern, into which we too easily fall. Because our technologies, the infrastructures we have built to escape our past, have turned instead to cancelling our future.
At a practical level, the global traffic in information has already become a significant contributor to climate change. Capturing, transmitting, processing, storing, and redistributing data, while practically invisible behind the glass of our computer screens, is an energy-intensive process. As of 2015, data centers consumed about three percent of the world’s electricity and accounted for two percent of total global emissions: approximately the same carbon footprint as the airline industry. Much attention has been lavished on the disturbing idea that Bitcoin, if it continues at its present rate of adoption and growth, could alone produce enough emissions to push global warming past 2°C within the next thirty years.9 But individual attentional behaviors act in insidious ways too: charging a single tablet or smartphone uses a negligible amount of electricity at home, but using either to watch an hour of video a week consumes more electricity in a year, across the networks and data centers that deliver it, than two new refrigerators.10
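The refrigerator comparison, drawn from a 2013 industry report, is really a piece of back-of-envelope arithmetic whose conclusion hinges on the assumed energy intensity of the network. The sketch below lays that arithmetic out; every figure in it is an explicit, contestable assumption (more recent estimates of network energy per gigabyte are far lower than the 2013-era ones), so it should be read as the shape of the calculation rather than as its result.

```python
# Back-of-envelope version of the video-versus-refrigerator comparison.
# Every constant below is an illustrative assumption, not a measurement.

GB_PER_HOUR_VIDEO   = 3.0    # assumed data volume of an hour of streamed video
KWH_PER_GB_NETWORK  = 5.0    # assumed network + data-center energy per GB (2013-era estimate; now much lower)
KWH_PER_FULL_CHARGE = 0.01   # assumed energy to fully charge a phone or tablet once
FRIDGE_KWH_PER_YEAR = 350.0  # assumed annual consumption of a new refrigerator

charging_per_year  = 365 * KWH_PER_FULL_CHARGE
streaming_per_year = 52 * GB_PER_HOUR_VIDEO * KWH_PER_GB_NETWORK   # one hour per week

print(f"charging the device:      ~{charging_per_year:.0f} kWh/year")
print(f"streaming an hour a week: ~{streaming_per_year:.0f} kWh/year")
print(f"two new refrigerators:    ~{2 * FRIDGE_KWH_PER_YEAR:.0f} kWh/year")
```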
Moreover, in a particularly grim feedback loop, climate change is literally making us more stupid. Global atmospheric CO₂ levels passed 400 parts per million in 2016, and despite ever more apocalyptic reports from scientists and international organizations, we seem incapable of addressing its rise. “Business as usual,” the Intergovernmental Panel on Climate Change’s revealing term for our present course, projects an atmospheric CO₂ concentration of 1,000 ppm by 2100. And at 1,000 ppm, human cognitive ability drops by 21%. Already, indoor CO₂ in stuffy boardrooms, classrooms, and bedrooms regularly exceeds 2,000 ppm, with measurable effects on cognitive performance. Our inability to think about alternatives to climate change, and about its interweaving with our social, political, and daily practices, turns gradually into an inability to think at all.11
It is at the deepest, most personal level that this mismatch makes itself felt, and it is there, in the closest of our relationships rather than in remote statistical accounts of distant catastrophe, that the signal is clearest. That relationship is with the digital devices we carry in our pockets, which the average smartphone user touches, taps, and swipes 2,617 times a day.12 Just as the plants of the Arctic tundra have responded faster to the changing conditions of the climate while the caribou have been left behind to starve, so our technologies, driven by incentives different from our own, have out-evolved our cognitive capacity to preserve ourselves.
In 2003, a Stanford psychologist named BJ Fogg published a book entitled Persuasive Technology. The book detailed the numerous ways in which technology could shift human behavior by exploiting certain deep-seated needs and responses: the need for companionship and social validation, the expectation of reciprocity, and the dopamine hits of novelty—small rewards and accomplishments. The examples Fogg gives in the book seem almost antiquarian now: he cites the Tamagotchi craze of the 1990s as an example of a technology performing as a social actor—intervening directly in the life routines of its owners—and the early freemium email client Eudora as a pioneer of social dynamics, masking intrusive and repetitive requests to register behind humorous and self-deprecating dialogue box copy.13 Fogg was also initially perspicacious about the dangers of technological persuasion: a contemporaneous seminar of his on “How to Motivate & Persuade Users” includes a slide on the ethics of persuasive technology and outlines six particular concerns, among them that “the novelty of the technology can mask its persuasive intent,” “computers can be proactively persistent,” and “computers can affect emotions but can’t be affected by them.”14 But Fogg’s work turned out to be deeply prophetic, and his warnings, if they continued to be spoken, were heard by only a few of his students.
In 2006 two students in Fogg’s class at Stanford collaborated on a class project called “Send the Sunshine.” This was still years before smartphones became omnipresent and always connected, but the students thought that one day mobile devices might be used to send emotions to other people, and they proposed a system whereby people enjoying good weather could send some of it to those who were not. One of those students, Mike Krieger, went on to cofound Instagram, an application used today by more than a billion people every month, and one which exemplifies Fogg’s principles at every level, from the obvious social rewards of likes and comments to, as Fogg himself has noted, the choice of filters users can apply to each photo. “Sure, there’s a functional benefit: the user has control over their images. But the real transaction is emotional: before you even post anything, you get to feel like an artist.”15
Perhaps Instagram’s most reviled yet effective feature is the algorithmic timeline: an invention which signals its computational-temporal effects in its very name. The algorithmic timeline is a deliberate phenological mismatch, reordering the flow of time itself to maximize the action/reward loop. No longer can users reassert their position in time by catching up to the present moment; instead the application withholds older photos and reinserts them on later visits to provide a continuous, uninterrupted flow of “new” content. Each discovery prompts the little hit of novelty so satisfying to the mammalian brain, and keeps users coming back for more and more. The transparent success of this technique has even led to accusations that Instagram performs the same trick with likes, a claim which persists despite being both unproven and denied. Like the persistent belief in Facebook’s ability to target ads based on overheard conversations, the effectiveness of computational manipulation techniques spooks users into attributing even wilder abilities to the machine—but not, it seems, into quitting the addiction.16
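The mechanism is simple enough to sketch. The toy code below is not Instagram’s algorithm, only an illustration of the general pattern described above: rank posts by a predicted engagement score rather than by recency, show only a few at a time, and hold the rest back so that each return visit surfaces something “new.”

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    predicted_engagement: float  # assumed output of some engagement model
    seen: bool = False

def chronological(posts):
    """The old timeline: newest first, and you can catch up to the present."""
    return sorted(posts, key=lambda p: p.age_hours)

def algorithmic(posts, per_visit=3):
    """Rank unseen posts by predicted engagement and withhold the remainder."""
    unseen = [p for p in posts if not p.seen]
    ranked = sorted(unseen, key=lambda p: p.predicted_engagement, reverse=True)
    shown = ranked[:per_visit]  # everything else is saved to feel "new" next time
    for p in shown:
        p.seen = True
    return shown

random.seed(0)
feed = [Post(f"user{i}", random.uniform(0, 48), random.random()) for i in range(10)]
print("visit 1:", [p.author for p in algorithmic(feed)])
print("visit 2:", [p.author for p in algorithmic(feed)])  # older posts resurface as "new"
```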
Instagram and others’ wholehearted embrace of the techniques of technological persuasion has been dubbed by the attention activist Tristan Harris “the race to the bottom of the brain stem”—and Harris should know, having been Krieger’s collaborator in Fogg’s class on their Sunshine app. Following Stanford, Harris founded a start-up called Apture, which allowed blogs and news media to pull multimedia content from other sites onto their own, dissuading users from following links away from them. Apture was subsequently acquired by Google, and when Harris wrote an internal memo expressing discomfort with the way Fogg’s techniques were being deployed, he was promoted to the harmless position of “design ethicist and product philosopher.”17 He then quit to found the Center for Humane Technology, whose advisors include Justin Rosenstein, the inventor of the Like button, and Hong Qu, who built YouTube’s sharing functions.18
Nothing quite illustrates the race to the bottom of the brain stem like YouTube. With little or no human intention other than the cold logic of capitalism, YouTube’s combination of ad-driven rewards and algorithmic attention maximization has evolved a system which directly targets the attention of very small children, harnessing them to the screen and opening a hellmouth of meaningless, hallucinatory content, disturbing themes, exploitative working practices, and outright abuse.19 And while these effects are now well documented, their implications have failed to fully register at the level of society—which seems just as dedicated as Google to feeding humans to mindless algorithmic optimization systems—or even in adult attitudes to other video platforms. The only company that perhaps rivals YouTube for the effectiveness of the autoplay function and its corresponding phenological attack vector on the human nervous system is Netflix, whose CEO has publicly avowed that the primary competitor to his services is sleep itself.20 The strategy is working: clinical studies are already showing that as binge-watching increases among young adults, more and more report fatigue and insomnia as a result.21
One question we might thus ask is how to reconnect our forms of attention with the information-generating and mutually beneficial modes of the Marshams, rather than the wasteful and exploitative practices enforced by contemporary capitalist technologies. Because one of the many great ironies in this narrative is that while these technologies are both fueling climate change—physically in the form of energy expenditure, and cognitively in the decline of our powers of reason—they will in the not too distant future suffer their own forms of phenological mismatch.
In 2011 Google opened a data center in Hamina, on the southern coast of Finland. A year later, Facebook followed suit, establishing its first European data center in Luleå, in northern Sweden, just seventy miles south of the Arctic Circle. Both went north for the same reasons that people and even plants are on the move: they are migrating to where it is colder, even if decreasingly so. The vast costs of cooling the hot racks of data centers can, for now, be offset by placing them in chillier environments with access to cheap hydroelectric power and other renewable energy sources. But these benefits will attenuate in time, as a warming, wetter earth provokes additional problems for computation. Increasing humidity and rainfall will change the refractive index of the atmosphere, leading to problems with microwave transmission and wi-fi propagation. Rising sea levels will overrun the beaches where cables come ashore, and ever more frequent superstorms will make the repair and upkeep of long-distance cable networks impossible. Such is the intractable logic of phenology: everything slips out of time.
The greatest trick our utility-directed technologies have performed is to constantly pull us out of time: to distract us from the here and now, to treat time as a kind of fossil fuel which can be endlessly extracted in the service of a utopian future which never quite arrives. If information is the new oil, we are already, in the hyper-accelerated way of present things, well into the fracking age, with tremors shuddering through the landscape and the tap water on fire. But this is not enough; it will never be enough. We must be displaced utterly in time, caught up in endless imaginings of the future while endlessly neglecting the lessons and potential actions of the present moment.
We thought technology was about means, but it has been subverted for ends. Aldous Huxley, whose island paradise of Pala was ultimately overrun by the combined forces of fossil fuel prospectors and violent colonial expansionists, wrote in 1937 that “Good ends, as I have frequently to point out, can be achieved only by the employment of appropriate means. The end cannot justify the means, for the simple and obvious reason that the means employed determine the nature of the ends produced.”22 A first and worthy step in reasserting the means and meaning of the things that surround us might be to hear the mynah bird calling in the jungle and add it to our own list of appearances, so that its advice might both serve us in the present and, as Marsham’s records did, assist others throughout time. Our attention is a resource to be reclaimed and stewarded, not in the narcissistic service of self-improvement, nor for the purposes of mindless escapism, but to enable us to see clearly exactly where we are, and act meaningfully in the light of that knowledge.
Aldous Huxley, Island (New York: Harper & Brothers Publishers, 1962).
Ivan Margary, “The Marsham Phenological Record in Norfolk, 1736–1925, and Some Others,” Quarterly Journal of the Royal Meteorological Society (January 1926), ➝.
T. H. Sparks and P. D. Carey, “The Responses of Species to Climate Over Two Centuries: An Analysis of the Marsham Phenological Record, 1736–1947,” Journal of Ecology 83, no. 2 (April 1995): 321–329, ➝.
Jeffrey T. Kerby and Eric Post, “Advancing plant phenology and reduced herbivore production in a terrestrial system associated with sea ice decline,” Nature Communications (October 2013), ➝.
Eric Post, Jeffrey Kerby, Christian Pedersen, and Heidi Steltzer, “Highly individualistic rates of plant phenological advance associated with arctic sea ice dynamics,” Royal Society Biology Letters (December 2016), ➝.
Abraham J. Miller-Rushing, Toke Thomas Høye, David W. Inouye, and Eric Post, “The effects of phenological mismatches on demography,” Philosophical Transactions B (October 2010), ➝.
Alex Randall, “Syria and climate change: did the media get it right?,” Climate and Migration Coalition, ➝.
“Food Security and Emigration,” World Food Program, August 2017, ➝.
Camilo Mora, Randi L. Rollins, Katie Taladay, Michael B. Kantar, Mason K. Chock, Mio Shimada, and Erik C. Franklin, “Bitcoin emissions alone could push global warming above 2°C,” Nature Climate Change (November 2018), ➝.
Mark P. Mills, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power,” Digital Power Group (August 2013), ➝.
James Bridle, “Air pollution rots our brains. Is that why we don’t do anything about it?,” The Guardian, September 24, 2018, ➝.
Michael Winnick, “Putting a Finger on Our Phone Obsession,” Dscout, June 16, 2016, ➝.
BJ Fogg, Persuasive Technology: Using Computers to Change What We Think and Do (Amsterdam and Boston: Morgan Kaufmann Publishers, 2003).
BJ Fogg, “How to Motivate & Persuade Users,” CHI 2003 Tutorial, 2003, ➝.
Ian Leslie, “The Scientists Who Make Apps Addictive,” The Economist, October/November 2016, ➝.
Jill Petzinger, “Instagram CTO says they do not withhold ‘likes’ to keep users coming back for more,” Quartz, January 14, 2018, ➝.
Paul Lewis, “‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia,” The Guardian, October 5, 2017, ➝.
➝.
James Bridle, “Something is wrong on the Internet,” Medium, November 6, 2017, ➝.
Rina Raphael, “Netflix CEO Reed Hastings: Sleep Is Our Competition,” Fast Company, June 11, 2017, ➝.
Liese Exelmans and Jan Van den Bulck, “Binge Viewing, Sleep, and the Role of Pre-Sleep Arousal,” Journal of Clinical Sleep Medicine 13, no. 8 (August 2017), ➝.
Aldous Huxley, Ends and Means: An Inquiry into the Nature of Ideals and into the Methods Employed for Their Realization (New York and London: Harper & Bros., 1937).
Becoming Digital is a collaboration between e-flux Architecture and Ellie Abrons, McLain Clutter, and Adam Fure of the Taubman College of Architecture and Urban Planning.