Everything I Needed to Unlearn I Learned from Sid Meier’s Civilization

I’ve been playing Sid Meier’s Civilization my whole video-game-playing life. If you don’t know it, it’s a slow strategy game that models human history from the origins of “civilization” through the near future. Players choose a “civilization” to play (what anthropologists of an earlier era might refer to as a “culture group”) and take turns conducting research, moving units around to explore the randomly generated board, engaging in diplomacy, waging war, and modifying the landscape to extract strategic resources. Players start by placing a settlement that will grow into a dynamic, urban capital city over the next 6000+ years of gameplay. If that sounds boring, somehow the designers of the game have managed to overcome the implicit boringness of the premise and made a game that can half-jokingly ask players, when they’ve finished, if they want to play “just one more turn” and know that many will. Which is all to say that Civilization is slightly compulsive, and I have lost many nights to playing the game into the wee hours.

The cover of the original version of Sid Meier’s Civilization from 1991. Somehow it perfectly captures a lot of what’s wrong with the game…

Civilization is almost educational. Or it would be if it didn’t fly in the face of a century of research in the social sciences (which I’ll get to briefly). I often think about having my undergraduate students play it, largely because it relies on a set of presumptions about how “civilizations” work, and what differentiates “successful” ones from those that “collapse.” As a game, it attempts to model how societies move from being small-scale, early agricultural communities with a small government to much larger, continent-spanning, industrialized nations with a “modern” form of government (i.e. democracy, communism, or fascism). All of this is based on a player’s progress through the “tech tree,” a set of unfurling technologies that move from pottery, agriculture, and the wheel, to sanitation, nuclear power, and space flight. If that sounds like unilineal evolution, that’s because it basically is; if it doesn’t sound like unilineal evolution, that might be because the term is unfamiliar, even if its assumptions are not.

Unilineal evolution is the idea that there are stages to social development, and societies move from a state of savagery, to barbarism, to being truly civilized. Popular in the US and Western Europe in the late 1800s, unilineal evolution was one of the underlying justifications for imperialism (the “white man’s burden” was to help all of those “half-devil half-child” “savages” move up the tech tree). As a theory, social scientists threw unilineal evolution out decades ago, pointing to the racist, colonial biases in a theory developed by a bunch of white men in the global north that posited that the features of societies in Western Europe (and, begrudgingly, the northeastern US) represented the pinnacle of civilization (secularism, representative politics, industrial capitalism, heteronormative kinship, etc.).

Over time, anthropologists and historians did a pretty good job of showing how wrong that kind of thinking is, beyond its implicit colonial racism. First, civilizations like China and Japan made it fairly clear that a society can have some of these civilizational features without having all of them, and that the development of any one of them doesn’t necessarily depend on the development of a specific preceding stage or technology (e.g. you don’t have to have polytheism before monotheism, or monotheism before secularism, or the germ theory of disease before sanitation). And second, it became increasingly clear that the idea that societies move from “simple” to “complex” forms of institutions ignored just how complex “simple” institutions can be. What looks to be “simple” from the outside can be exceedingly complex from the inside (e.g. kinship systems in Papua New Guinea). But some form of unilineal evolution persists in Civilization, and it’s very apparent in the biases baked into the game.

Early versions of Civilization were pretty straightforward in their biases. It was difficult to win the game with anything other than a market-driven democracy, even if you were a warmonger (you’ve got to have a market system to pay for all that military R&D and unit support, after all). Over time, Civilization has become a more modular game. It used to be that adopting a government like Democracy came with a set of predetermined features, but now Democracy has a base set of rules, and players can choose from a set of “policies” that offer a variety of bonuses. In that way, you can play a Democracy that depends upon an isolationist, surveillance state or a peaceful Communist state that provides its citizens with amenities to keep them happy. Better yet, the designers chose to separate the technological and “civic” trees, so one needn’t research the wheel before democracy (which can also allow for a civilization that is scientifically invested, but ignores “civic” achievements). But one of the biases that persists is technological determinism.

It might seem silly to suggest that a society needn’t invent the wheel before inventing gunpowder, but the wheel is not a precondition for chemistry. Similarly, one needn’t understand shipbuilding to develop atomic theory. Yes, we live in a world where the wheel preceded gunpowder and shipbuilding preceded atomic theory, but on a planet with a Pangea-like mega-continent, shipbuilding would be unnecessary. And it isn’t so hard to imagine gunpowder (given access to some bat guano, sulfur, and charcoal) preceding the development of the wheel. In all cases, what actually makes a technology possible are the social demands that compel research and encourage individuals and communities to put a technology to use. Hence, gunpowder’s early discovery and widespread abandonment in China, or how the refrigerator got its hum. I understand why, for the sake of the game, some kind of tech tree is important, but what continues to confound me is why there are technological bottlenecks where you have to have a specific technology before you can research further technologies (and the same goes for “civics”).

A persistent feature of the game is that each of the civilizations has some set of basic benefits, which can include special units and buildings, and, in some cases, suggest that there is something intrinsic about a civilization’s relationship with geography. Canada and Russia get a bonus for being near tundra tiles; Japan gets a bonus for fighting along water tiles; etc. At their best, these kinds of rules make the game dynamic. At their worst, they foster a kind of Jared Diamond-esque environmental determinism. (Which, again, historians and anthropologists discredited long before his Pulitzer Prize-winning Guns, Germs, and Steel — but, institutional racism is hard to overcome!) A more nuanced game might allow players to mix and match these bonuses to reflect the complex relationship between what societies value and the landscapes they have to make do with.

One other enduring problem in the game is that the designers really want to focus on the positive side of civilization. These days, Great People can be recruited to join your civilization, each of whom has a positive effect (e.g. Albert Einstein gives a boost to your scientific output). But what about all the terrible “Great People” in history? What about the slave trade, on which contemporary capitalism was built? When Civilization 6 was initially released, environmental change (i.e. the Anthropocene, which is what the game is all about) wasn’t included in the game, inspiring the rumor that it was too controversial to include. Maybe including things like racism and ethnonationalism would make the game too grim; maybe the designers simply want players to provide those narratives to the game as they play it. But if the response to my concerns above amounts to “that just isn’t realistic,” neither is a history of human civilizations that leaves out the ugly side of the nation-state and everyday politics. (As I write this, I kind of wish there were a “utopia mode” that would allow players to avoid things like fossil fuel combustion, factory farms, and the gendered division of labor, to name just three.)

This is clearly not an exhaustive list of all of the problems with Civilization. Whatever its problems, it provides a basis to rethink some of the biases in history and social science — and popular culture more generally. Working through what’s wrong with Civilization helps open up what anthropology and history have done over the 20th century to change the way that social scientists think about “civilization” and what it’s composed of and how it changes over time.

It would be amazing if Civilization 7 were more of an open sandbox, allowing players more flexibility in how they play. It would also be great if there were more of a dark side to Civilization. I don’t think Civilization drove me to become an anthropologist, but it does continue to remind me — each time I play a game — of what has gone wrong with social theory over the course of the 19th and 20th centuries, and how we might work against implicit and explicit biases in the narratives that get told in video games and elsewhere. I hope the next version of Civilization gets up to date with contemporary social science, but, in any case, I’m not going to stop playing it…

School start times: Why so rigid?

Here’s the latest from the UMN Press blog, on school start times:

Over the past thirty years, there’s been a mounting body of evidence regarding how sleep needs change over the life course. Infants need a lot of sleep; children less so; adolescents need more again; and adults, less, until our later years, when many require even less sleep.

So over the life course, it’s perfectly normal to sleep as much as twelve hours (even more for infants) and as little as four in a day. Along with these changes in sleep needs are changes in the time of sleep onset: as infants, most of us fall asleep earlier than we will as teenagers or adults; in our later years, we’ll wake up well before we do as children or adults. Sometimes we think about these differences in our sleep as pathological and seek out medical help, especially adults who start sleeping less than they used to, who often complain of insomnia despite feeling well rested.

But before we’re adults, we’re often at the mercy of other people’s interpretations of our sleep. And no one has a harder time garnering respect for their sleep needs than teenagers.

As a teenager, I started high school at 7:30 a.m. (yep, Rochester Adams still hasn’t changed its start time since then). I would often get to sleep around 11 p.m. or later – not because I was playing video games or texting, which didn’t exist in 1991, but because my circadian cue for sleep onset was later than it had been when I was a child. I would have to wake up around 6:30 a.m. to get to school on time, which often meant that I was sleeping 6 or fewer hours each night. I don’t think I remember anything from my first two periods throughout high school. I would sleepwalk through my morning and “wake up” around midday. I would often nap in the afternoon. And still my daily sleep wouldn’t add up to nine or more hours.

There’s a nice piece on the CBC about experiments with changing school start times that includes an interview with the principal of the Canadian schools involved. It reviews the science of adolescent sleep, which shows that sleep onset at adolescence is later – sometimes as late as 11 p.m. or midnight. Alongside that later onset is a need for greater sleep, on average ten hours each night. The school day for students participating in this program runs from 10 a.m. until 4 p.m., no shorter than for those peers who start at 8 a.m. or earlier. And there’s some anecdotal evidence that it improves grades and attendance. What’s most interesting about the story – as is so often the case – is the comments. Adults weighing in on this change in start times refer to teenagers as lazy, point to their distraction by media technologies and lack of daily labor, and generally dismiss the science of sleep.

Was I just lazy as a teenager?
Are today’s teenagers more easily distracted away from sleep with the proliferation of media technologies?

The science says no. But why might adults be so rigid in their thinking about the social obligation of the school day? Many commenters on the CBC article fall into a slippery slope fallacy, assuming that today’s “lazy” teenagers will be tomorrow’s “lazy” workers and demand that work times shift to later in the day as well. The science doesn’t point to the need to change our work days – though there have been some movements towards flextime and workplace napping – but many of the adult commenters don’t even appear to buy the premise that sleep needs change throughout the life course.

As I discuss at length in The Slumbering Masses, the basis of modern school start times lies in the 19th century, when public schools were developed to care for the children of day laborers—meanwhile, the elite would send their children to boarding schools. The school day developed alongside the industrial workday to allow parents to drop off their children while they worked. There’s nothing natural about it—it isn’t based on some agrarian past where we were more in balance with nature. Instead, it had everything to do with the need to fill factories with able-bodied adults from dawn until dusk and to keep their children busy. Only slowly did this change, as American work schedules changed. Now science can support the organization of our daily obligations – or at least support the advocacy for more flexible institutions that take things like variations in sleep need seriously.

But why be so rigid in thinking about teenagers being lazy and school start times being just fine as they are?

One of the things that comes through in the comments to the CBC story is that many adults feel as if they did just fine in high school, and that today’s youth should be just fine as well. In one commenter’s language, changing school start times amounts to “molly coddling” teenagers and playing into their entitlement. High school, it seems, is hazing for entry into the “real” world of adulthood, emblematized by work. While this is surely part of what school is intended to do – it models the demands of the workday with deadlines and expectations of outcomes – it is primarily intended to produce competent citizens. If changing the start time to slightly later in the day leads to more engaged citizens and more capable workers, shouldn’t we change our school days?

More insidious and less obvious is that many people have come to think of our social arrangements of time as being based in some innate human nature. If we accept the basic premise that sleep changes over the life course, that alone undermines any one-size-fits-all standard for organizing our time. But many people tend to rely on small sample sizes to think about what’s natural and what’s not; just because modern social formations work for you – or seem to – doesn’t mean that they’re natural or that they work for everyone. How many cups of coffee do you drink each day? Or how much caffeinated soda? Have you eaten a snack today to offset sleepiness? Or taken a nap? Could you have gotten through your day a little easier if you slept in an extra hour?

There’s nothing natural about alarm clocks. And many sleep researchers and physicians would say that they’re one of the worst things for good sleep. But we use them anyway. Maybe it’s time we start to take the science of sleep a little more seriously and begin to rethink how we want our days to be organized. If we could be happier and healthier workers and students, it’s worth the investment in change and thinking past our expectations of nature and norms.