May 22, 2019 / Matthew Wolf-Meyer

Everything I Needed to Unlearn I Learned from Sid Meier’s Civilization

I’ve been playing Sid Meier’s Civilization my whole video-game-playing life. If you don’t know it, it’s a slow strategy game that models “civilization” from its origins through the near future. Players choose a “civilization” to play (what anthropologists of an earlier era might refer to as a “culture group”) and take turns conducting research, moving units around to explore the randomly-generated board, engaging in diplomacy, waging war, and modifying the landscape to extract strategic resources. Players start by placing a settlement that will grow into a dynamic, urban capital city over the next 6000+ years of gameplay. If that sounds boring, somehow the designers of the game have managed to overcome the implicit boringness of the premise, and made a game that can half-jokingly ask players, once they’ve finished, if they want to play “just one more turn” and know that many will. Which is all to say that Civilization is slightly compulsive, and I have lost many nights to playing the game into the wee hours.

The cover of the original version of Sid Meier’s Civilization from 1991. Somehow it perfectly captures a lot of what’s wrong with the game…

Civilization is almost educational. Or it would be if it didn’t fly in the face of a century of research in the social sciences (which I’ll get to shortly). I often think about having my undergraduate students play it, largely because it relies on a set of presumptions about how “civilizations” work, and what differentiates “successful” ones from those that “collapse.” As a game, it attempts to model how societies move from small-scale, early agricultural communities with minimal government to much larger, continent-spanning, industrialized nations with “modern” forms of government (i.e. democracy, communism, or fascism). All of this depends on a player’s progress through the “tech tree,” a set of unfurling technologies that move from pottery, agriculture, and the wheel, to sanitation, nuclear power, and space flight. If that sounds like unilineal evolution, that’s because it basically is; if it doesn’t sound like unilineal evolution, that might be because the term is unfamiliar, even if its assumptions are not.

Unilineal evolution is the idea that there are stages to social development, and societies move from a state of savagery, to barbarism, to being truly civilized. Popular in the US and Western Europe in the late 1800s, unilineal evolution was one of the underlying justifications for imperialism (the “white man’s burden” was to help all of those “half-devil half-child” “savages” move up the tech tree). As a theory, social scientists threw unilineal evolution out decades ago, pointing to the racist, colonial biases in a theory developed by a bunch of white men in the global north that posited that the features of societies in Western Europe (and, begrudgingly, the northeastern US) represented the pinnacle of civilization (secularism, representative politics, industrial capitalism, heteronormative kinship, etc.).

Over time, anthropologists and historians did a pretty good job of showing how wrong that kind of thinking is, beyond its implicit colonial racism. First, civilizations like China and Japan made it fairly clear that a society can have some of these civilizational features without having all of them, and that the development of any one of them doesn’t necessarily depend on the development of a specific preceding stage or technology (e.g. you don’t have to have polytheism before monotheism, or monotheism before secularism, or the germ theory of disease before sanitation). And second, it became increasingly clear that the idea that societies move from “simple” to “complex” forms of institutions ignored just how complex “simple” institutions can be. What looks “simple” from the outside can be exceedingly complex from the inside (e.g. kinship systems in Papua New Guinea). But some form of unilineal evolution persists in Civilization, and it’s very apparent in the biases baked into the game.

Early versions of Civilization were pretty straightforward in their biases. It was difficult to win the game with anything other than a market-driven democracy, even if you were a warmonger (you’ve got to have a market system to pay for all that military R&D and unit support, after all). Over time, Civilization has become a more modular game. It used to be that adopting a government like Democracy came with a set of predetermined features, but now Democracy has a base set of rules, and players can choose from a set of “policies” that offer a variety of bonuses. In that way, you can play a Democracy that depends upon an isolationist, surveillance state or a peaceful Communist state that provides its citizens with amenities to keep them happy. Better yet, the designers chose to separate the technological and “civic” trees, so one needn’t research the wheel before democracy (which can also allow for a civilization that is scientifically invested, but ignores “civic” achievements). But one of the biases that persists is technological determinism.

It might seem silly to suggest that a society needn’t invent the wheel before inventing gunpowder, but the wheel is not a precondition for chemistry. Similarly, one needn’t understand shipbuilding to develop atomic theory. Yes, we live in a world where the wheel preceded gunpowder and shipbuilding preceded atomic theory, but on a planet with a Pangea-like mega-continent, shipbuilding would be unnecessary. And it isn’t hard to imagine a society with access to bat guano, sulfur, and charcoal developing gunpowder before the wheel. In all cases, what actually makes a technology possible are the social demands that compel research and encourage individuals and communities to harness a technology’s uses. Hence gunpowder’s early discovery and widespread abandonment in China, or how the refrigerator got its hum. I understand why, for the sake of the game, some kind of tech tree is important, but what continues to confound me is why there are technological bottlenecks where you have to have a specific technology before you can research further technologies (and the same goes for “civics”).

A persistent feature of the game is that each of the civilizations has some set of basic benefits, which can include special units and buildings, and, in some cases, suggest that there is something intrinsic about a civilization’s relationship with geography. Canada and Russia get a bonus for being near tundra tiles; Japan gets a bonus for fighting along water tiles; etc. At their best, these kinds of rules make the game dynamic. At their worst, they foster a kind of Jared Diamond-esque environmental determinism. (Which, again, historians and anthropologists discredited long before his Pulitzer Prize-winning Guns, Germs, and Steel — but institutional racism is hard to overcome!) A more nuanced game might allow players to mix and match these bonuses to reflect the complex relationship between what societies value and the landscapes they have to make do with.

One other enduring problem in the game is that the designers really want to focus on the positive side of civilization. These days, Great People can be recruited to join your civilization, each of which has a positive effect (e.g. Albert Einstein gives a boost to your scientific output). But what about all the terrible “Great People” in history? What about the slave trade, on which contemporary capitalism was built? When Civilization 6 was initially released, environmental change (i.e. the Anthropocene, which is what the game is all about) wasn’t included in the game, inspiring the rumor that it was too controversial to include. Maybe including things like racism and ethnonationalism would make the game too grim; maybe the designers simply want players to provide those narratives to the game as they play it. But if the response to my concerns above amounts to “but that just isn’t realistic,” then neither is a history of human civilizations that omits the ugly side of the nation-state and everyday politics. (As I write this, I kind of wish there was a “utopia mode” that would allow players to avoid things like fossil fuel combustion, factory farms, and the gendered division of labor, to name just three.)

This is clearly not an exhaustive list of all of the problems with Civilization. Whatever its problems, it provides a basis to rethink some of the biases in history and social science — and popular culture more generally. Working through what’s wrong with Civilization helps open up what anthropology and history have done over the 20th century to change the way that social scientists think about “civilization” and what it’s composed of and how it changes over time.

It would be amazing if Civilization 7 was more of an open sandbox, allowing players more flexibility in how they play. It would also be great if there was more of a dark side to Civilization. I don’t think Civilization drove me to become an anthropologist, but it does continue to remind me — each time I play a game — of what has gone wrong with social theory over the course of the 19th and 20th centuries, and how we might work against implicit and explicit biases in the narratives that get told in video games and elsewhere. I hope the next version of Civilization gets up to date with contemporary social science, but, in any case, I’m not going to stop playing it…

April 23, 2019 / Matthew Wolf-Meyer

We’re Having a Generational Transition Problem…

That’s Luke and Yoda, from “The Last Jedi,” watching the original Jedi temple burn to the ground. My apologies if that’s a spoiler in any way.

There was a moment when a senior faculty member and I were talking in a shared departmental office — just catching up really. The faculty member was talking about their daughter, who recently had a child, started a career, got married, and bought a home. The senior faculty member said she was “finally getting it together” in her early 30s. And it dawned on me that my senior colleague was basically talking about me. I was the same age as their daughter, in a similar place in my career trajectory and personal life. It made me suddenly realize that part of the generational transition problem I was seeing in our institution and the academy more generally was that Baby Boomers were in the position of handing things over to people who were basically their children’s age (thanks to a series of hiring freezes in the 1990s and early 2000s). When my senior colleagues looked at me, I realized that they were seeing their children or their children’s friends, with all of their career and personal foibles. Why would they hand off to those children, especially something as precious as their career’s work of institution- and discipline-building?

I’ve watched senior faculty — nearing retirement — at several institutions basically sabotage departments, programs, and centers that they’ve built rather than anoint and mentor younger faculty to take the reins. For the last several years, I’ve been trying to think through what I’ve seen in the university, particularly around the transition from an older generation of scholars (mostly Baby Boomers) to people of my generation (Gen Xers, although I think I’m on the tail end of the spread). Why not hand off rather than let things fall apart?

It comes in many forms. The benign neglect of not having faculty meetings to talk about necessary changes to the curriculum as faculty retire. The secrecy — if not outright denial — about faculty retirements and when they’ll happen. The gatekeeping that insists junior faculty consult senior faculty about the classes they want to teach or the improvements they seek to make. The lack of actual mentoring on the part of senior faculty toward their juniors. The deliberate ambiguity about institutional expectations and opportunities, spanning everything from tenure requirements to the availability of resources. And then there are the more aggressive and deliberate actions that some faculty take: spiking junior faculty’s tenure cases, arguing against diversity hires as unmerited, withholding access to resources, and running centers, programs, and departments aground rather than helping steer them in a new direction.

It’s hard not to see some of this behavior as a function of the changing demographics of faculty hires, including shifts in the representation of gender, sexuality, race, ethnicity, and disability, but also a greater diversity in the institutions that are producing Ph.D.s. Visions of what the discipline of Anthropology (and probably every discipline) is and will be are changing, sometimes radically. I can imagine that for many senior faculty, seeing these changes occur is alienating, and, for some, deeply distressing. Which all has me thinking that part of the generational transfer has to be some collective ego work: labor to help make evident to senior faculty that their lifetimes of contributions to the field are vital, and also work on the part of younger faculty to articulate visions of Anthropology (and other disciplines) that redevelop the canon to acknowledge the generations before us while developing supple visions of the disciplines that build upon their pasts, address present needs, and develop livable futures.

Taken from the Louvre’s archives, that’s the image from an ancient Greek vase depicting a relay race (between naked Greek men, to be specific about it).

Years ago, a senior faculty member I knew well retired as soon as he could. His rationale was that he had spent the last 30+ years trying to build a specific vision of anthropology, and after decades of frustration with the institutions he was a part of and the colleagues he had, he was just done. He could have coasted for several years, teaching a set of courses he cared about, but he preferred to cut himself loose from the institution, travel, and write. I really admired his graceful exit.

Before that, a group of senior faculty I knew (different institution, different time) were dealing with the demographic and institutional shift among the faculty by thwarting junior faculty efforts to hire even more diverse faculty. It was only when a couple of the senior faculty broke ranks — acknowledging that the department wasn’t really theirs to build any longer — that the junior faculty had enough of a quorum to make the hires they wanted to. One of the junior faculty described it all as a problem of grace — that some people couldn’t manage the intergenerational transfer gracefully.

I’ve recently become aware of many more younger faculty quitting their academic jobs. Maybe this always happened, and I didn’t see it. But I know personally of several faculty in tenure-track jobs (some tenured) who have either quit without a job lined up or have made a calculated exit from academia. And the internet is littered with additional examples. It’s hard for me not to see these departures as responses to institutions in which people don’t see futures for themselves, and in which they feel they lack the mentoring or support to make a livable institutional future. Somehow I have a hard time seeing quitting as a form of grace.

The problem with both my impulse to interpret my senior colleague’s retirement as graceful and that junior faculty member’s impulse to interpret senior faculty’s obstruction as a lack of grace is that both place the onus on specific faculty to behave in particular ways. If we’re going to navigate this intergenerational moment generatively, it’s going to be through collaboration, not individual choices.

That all said, I’m not sure what the right way forward is. I do know that universities are decidedly conservative institutions, and that incrementalism is probably the only sustainable way forward. What might that look like?

Develop sustained dialogues between junior and senior faculty. That might be through workshops or conference panels, or, locally, by having faculty give guest lectures in each other’s courses or discuss their work in seminars. Having a regular space to come talk about ideas, one’s scholarship, and one’s place in the field keeps lines of dialogue open. It also makes it clear that whatever else is happening, there’s a relationship that’s being maintained between people who recognize one another as scholars (even if they might disagree as respected experts).

Collaborate intergenerationally whether in writing or funds-seeking or conference planning. I don’t doubt the first two of these can be hard, but it might be a site where very deliberate mentorship can happen. Working on panels for conferences together (or local workshops) can serve as a way to introduce each other to networks of scholars with shared interests.

Share writing. Writing is so central to the profession, and, for better and worse, people’s relationships with their work. Sharing writing, often without the pretense of needing anything like feedback, is helpful to keep lines of communication open, but also to help develop expanding networks of connection between scholars across generations.

Create structures of care, which can range from the occasional check-in email, a meal or drinks, or even home visits if you’re familiar enough. Some of the best, most humane interactions I’ve had with other faculty have been in one-on-one meals or drinks — not dinner or department parties — and they’ve produced some of the most lasting scholarly friendships I have with people more senior than me.

I’ve been very deliberate in pitching these suggestions without presuming that invitations need to come from juniors to seniors or vice versa. Kindness is the rule, and building a sustainable future depends on actors across generations working together to have something to hand off to the generations to come.

Other ideas? Other good experiences? Please post them in the comments.

March 12, 2019 / Matthew Wolf-Meyer

The Mentoring Compact

Faculty mentoring of graduate students is one of those things that is rightly the subject of recurrent conversation; there are good mentors and bad, lack of clarity in faculty expectations and student responses, sustainable and deeply-broken models of graduate student training, all of which seem to perpetuate themselves (often unreflexively). The faculty who are best at mentoring recognize that it is a dynamic process, that no one model works for all students, and, moreover, that the process of mentoring students leads to new techniques and understandings of the process. Sometimes it takes graduate students to precipitate some faculty growth. That all said, this is what I’ve learned in my eight years of graduate study and eleven years of working with graduate students, which I offer as a two-sided compact: what students should do, and what faculty should provide.

That’s the original Karate Kid being mentored by Mr. Miyagi, maybe the greatest mentor of all time? (Apologies to Yoda and Obi-wan…)

Students: Keep lines of communication open with your adviser and committee. If you’re a graduate student, don’t wait for your adviser or committee to contact you. Instead, make a regular practice of keeping people up to date about what you’re doing and how things are going. I make this suggestion because I’ve found that when things start to go poorly for graduate students (during grant writing, dissertation research, dissertation writing, job seeking, etc.), many students take to not communicating with their committee, often, it seems, out of fear of communicating that things are going poorly. If you send your adviser a monthly email keeping them abreast of what’s happening, it keeps lines of communication open and ensures that when difficulties arise, there’s already a channel open. (Other committee members might receive email every four or six months.) Just answer these three questions: 1) what have you been working on?, 2) what problems have you faced?, 3) how have you addressed those problems? (#3 is a good place to ask for help, if needed.)

Students might worry that sending advisers and committee members emails obliges them to respond, thereby creating unnecessary work for faculty, but it’s okay to preface emails like this with something along the lines of “There’s no need to respond to this email; I’m just writing to keep you in the loop.” Most faculty, I’m sure, will take the opportunity to not respond, but know that faculty are keeping students in mind when they receive emails like this.

(I’ve thought about writing a contract with graduate students, part of which would give them an automatic out of the advising relationship. For example, if you’re my advisee and I don’t hear from you for six months, then I assume I’m no longer your adviser. I’ve watched students struggle with taking faculty off their committees, often because the lines of communication between faculty and student are troubled. But I’ve not gotten to an actual contract yet…)

Faculty: Have guidelines for responding to student emails. I tell my advisees that I’ll always respond to an email within 24 hours (unless it’s the weekend or I’m traveling); if I’m a committee member, it’s no more than 72 hours; if I’m just some random faculty member, it can be up to a week. If it’s an actual emergency — and I can do something about it — I’ll break these guidelines. If I’m going to be running late because I want to be thorough in my response, I always make sure to send an email to that effect. That said, I try and abide by a minimalist email policy and send as few emails as possible (if only to have a very clear and direct chain of communication). Only when students start working on their dissertations do I give them my phone number, since I assume that before that the kind of help I can provide is largely bureaucratic (i.e. email and meeting based).

Students: Do what faculty ask you to do. One of the recurrent sources of frustration voiced by faculty who work with graduate students is that students come seeking advice, faculty do a lot of work in making suggestions and providing feedback and resources, and then students don’t follow through by doing what faculty ask them to do. Even if you think the suggestion is off base, it’s better to do the work than to avoid it; showing a faculty member that you did the work and proving that the suggestion was insufficient or off base is a clearer demonstration of the suggestion’s weakness than not taking it seriously. If you can’t do something, it’s so much better to explain why you can’t than to just not do it (which open lines of communication can facilitate). If an assignment (or job task, like grading) has a clear set of instructions, follow the instructions as provided. Again, it’s better to show the inadequacy of the instructions by following them than to let faculty think you’re just lazy and trying to find workarounds.

Faculty: Be clear in your expectations and provide instructive guidelines. When I have teaching assistants grading for me, I provide them with very clear rubrics to use; when I am helping students generate reading lists, I’m very clear about how many items should be on it, and what kinds of things those readings should be (e.g. books, chapters, articles, etc.). I find that being very clear in my expectations helps students immensely, and that when they don’t follow the instructions I’ve provided, I can point to the instructions as the basis of our next interaction (see below). I try and take notes of my conversations with students and provide them with a copy of those notes after the meeting (either in writing or via email) so that I know we’re on the same page.

Students: Give faculty lead time to prepare themselves for what you need. There’s very little I find as frustrating as someone else’s deadline being imposed on my work schedule. Having students give me something they’re seeking feedback on shortly before the due date is a case in point: if you want careful reading and generative feedback, I probably need a week to fit it into my schedule and make sure I give it the time it needs. Preparing faculty for upcoming deadlines and the prospect that you’re going to send them something ensures that you get the attention you want. This might be something to communicate in a monthly email (e.g. “I have a fellowship deadline at the end of the month and plan to send you the application in two weeks.”), and is definitely something to give people at least a week or more to prepare for. If it’s big — like a dissertation draft — give them a month or more to prepare for it.

Faculty: Tell students what their windows of opportunity are. I’m pretty regimented in my work planning, and I imagine most faculty are. Because of that, I know — roughly — when I’m going to have more or less time to give students feedback, set up meetings, etc. At the beginning of the semester, I try and give my students a sense of when these windows will be, and try and set up deadlines around them. For example, when I know a student is going to be sending me a dissertation draft, I let that student know when I’ll have a week to dedicate to reading it and commenting on it. If they miss the window, I’ll still get the work done, but I’m clear that it will take me a little longer than if I have it in the window.

Students: Plan to educate faculty on standards and policies. This is especially true for faculty new to your institution or in other departments than your own: faculty tend to not know what the policies are that dictate student lives. If you can provide them with written documentation (i.e. from a graduate student handbook), it can go a long way to clarifying faculty expectations of your work. If standards vary from policies, then you also want proof of that (e.g. if the graduate handbook says comprehensive exams comprise 100 texts but everyone actually does 75, bring some recently defended comprehensive exam reading lists to talk through). Faculty may not vary from the policy as written, but if there is an emerging norm, you’ll want them to know about it and have proof that it exists.

Faculty: Provide a material basis for meetings. I find that having some kind of written product to talk through with students makes meetings feel much more productive than not having something to focus on. This can be a dissertation proposal, a grant application, a reading list, an annotated bibliography, an article manuscript, something you’ve both read recently, etc. Having something concrete on the table ensures that the conversation stays focused and that there’s a direct outcome of the meeting. There can be small talk too, but having a clear work plan for you and the student helps to make sure that there are deliverables and that the student has the feeling of being materially supported.

That’s Daniel-san, having vanquished the bully Cobra Kai thanks to the inimitable mentoring of Mr. Miyagi.

Elsewhere, I’ve provided some guidelines for thinking about how to compose a dissertation committee, and what the overall professionalization timeline might look like given today’s academic job market. The latter might be especially helpful in thinking about the material basis of meetings and to provide a trajectory for the mentoring relationship (at least during grad school). Other tips? Insight? Post them in the comments.

February 25, 2019 / Matthew Wolf-Meyer

On Having an Ax to Grind

“Productive scholars have an ax to grind.” That was a lesson imparted to me by one of my undergraduate mentors, Brian Murphy. We were walking across campus during my senior year, and I had been talking about the possibility of pursuing some kind of graduate degree, at the time in Literature. Brian was narrating how, despite enjoying the scholarly work he had done throughout his career, he never felt particularly driven to participate in the arguments motivating many scholars in the discipline. (Little did I know that the postmodernism debate was in full rage mode at the time.) Frankly, I didn’t really know what to do with the advice at the time, but I tucked it away.

A man comes to get his ax ground, sometime in the Middle Ages? (From Married to the Sea)

While I was working on the Master’s degree that followed (in Science Fiction Studies at the University of Liverpool), I had things I was interested in, but the work was driven more by curiosity and expediency than having a real argument to make. Over time, the thesis I wrote there developed more of an argument and ended up being publishable as a couple of articles about superhero utopias and the role of law and capital in superhero comics. But to this day, I’m not sure I have much of an ax to grind when it comes to superheroes.

It was while working on revising that content that I received a second piece of advice, this time from Hai Ren, a faculty member I worked with at Bowling Green. Hai suggested that to write a dissertation, one needed “three theorists.” Hai’s point, as I understood it, was that you need some parameters on the ideas that you’re working with, and that having three theorists — who, he suggested, one reads in their entirety (cue up the qualifying exam reading list) — gave a writer the ability to play off differences and consensus between sets of theory. If I wasn’t sure what ax I had to grind, Hai gave me a way to craft one.

I’ve made the same recommendations to students over the years, but I add that the theories one adopts should really be ontologically compatible. So monists and dualists don’t go together, nor do communists and free marketeers, nor biological determinists and social constructionists, etc. I had started thinking about this after reading Judith Butler’s The Psychic Life of Power, where she draws together Freud, Lacan, Bourdieu, Foucault, Kristeva, Irigaray, Hegel (and others, I’m sure, but memory fails me). Butler’s “toolbox” approach struck me as eliding the profound differences between a thinker like Freud, who really believes in some form of biological determinism, and Foucault, who really does not. You can put them together, but you can’t really build a sound theory out of them because the ontologies don’t fit together. That is, unless you find ways to treat some thinkers as existing within an ontological paradigm developed by others whom you take more seriously (e.g. reading Freud’s use of biology as a form of Foucauldian discourse rather than as materially reductive; but I’m skeptical).

If you go look at the introduction to The Slumbering Masses, I’m pretty explicit about using three sets of theory and having an ax to grind: I’m trying to work through the overlap between Bruno Latour, Bernard Stiegler, and Gilles Deleuze & Felix Guattari, and I’m trying to bring them to bear on how we conceptualize the interaction between medicine and capitalism in the U.S. That means, in part, that I’m rejecting medicalization as a way to think about human nature and its interactions with capitalist forms of medicine (which you can read about here).

That doesn’t mean that I’m only working through the theories that come out of those four, white, relatively elite, able-bodied, heterosexual French men. I’m intensely aware of who these men were (and are), and use their monism to engage with other thinkers (especially Genevieve Lloyd and Moira Gatens, two Australian Spinozist philosophers) and the subfield interests I have (especially science and technology studies and feminist medical anthropology). Those engagements helped me suss out things from the theories I was using and guided me through my interactions with those rather large fields of literature. It also gave me a way to talk about things like my “contributions” to the field and the “significance” of my research (scare quotes to denote my general skepticism of that kind of grant-speak criteria). In saturated areas of study, being clear about your theoretical commitments can also make clear what you’re doing differently than other people working on the same topic or area of study.

I try and get students to think about what they believe. Stop thinking through the ecumenical polytheism of graduate study, and consider what kind of world you want to make with your scholarship. What is the ontology that you’re committed to? And who are the right thinkers to join to that project? It shouldn’t just be people that you enjoy reading, but people (and sets of theories) that you fundamentally share a common sensibility with. In committing to a set of thinkers, what differences can you map out between them and how might they guide your interactions with key concepts in your field? That can provide a ton of grist for the mill, both in terms of the initial dissertation, but also in articles and other spin-off projects.

I have other axes to grind — especially around racism in science and medicine — and those too are informed by my theoretical commitments. Having a pretty solidly determined ontological commitment gives me a framework to engage with whatever springs up. And, over time, I’ve changed the people most central to the projects I work on now. But having a set of theoretical commitments helps to guide what and how I read as well as the kinds of questions I ask about the phenomena that I’m drawn to work on.

It took a while, but now I have axes to grind…

February 11, 2019 / Matthew Wolf-Meyer

Experimenting with Montessori in the College Classroom

When my first child enrolled in preschool, I started to get interested in Montessori approaches to education. As a pedagogical practice, Montessori approaches have largely focused on preschool and elementary education, but there’s a growing interest in finding ways to apply Montessori approaches to higher education — through middle and high school, as well as in the university classroom. In Fall 2018, I taught a small Social Studies of Science and Technology class where I decided to experiment with a more Montessori-like approach. At its heart, Montessori education seeks to instill in students the ability to ask their own questions of the course material, and to facilitate their finding answers to those questions. Rather than impose expectations of content from above through the lecturer-expert instructor, a Montessori approach seeks to create a more symmetrical relationship between the instructor and students. Overall, it seemed to work pretty well.

That would be Theseus following Ariadne’s thread straight into the Minotaur’s lair. A bit of a metaphor, maybe?

The course ended up only having five students (down from a peak of 13), and they were sophomores and juniors, only one of whom was an Anthropology major (which matters only because it was an Anthropology class, ostensibly). I declined to put General Education distributions on the class, which likely kept the enrollment low since students are Gen Ed hungry at my institution. I have a sneaking suspicion that the format of the class might have also turned off some students, especially since my recent experience is that many students just want to be lectured to (hence my experimenting with a format like this). As a result, I thought the class was a little too small, and that a larger class (more like 12-15 students) would have worked better. As it was, one student was really engaged, and most of the other students seemed to be along for the ride… (You can see a copy of the final version of the syllabus here, as well as a list of the guiding questions and threshold prompts.)

This is how I structured the class: We started by reading a book that I hoped would set the tone for the class, and also help the students generate a list of “guiding questions.” That book was James Jones’ Bad Blood, which is about the Tuskegee syphilis “experiment” and the long history of medical and scientific abuses of racialized bodies in the United States. It raises all sorts of questions about race, objectivity, data, ethics, media, and methodology, and is written for a pretty general audience. We took a class period to watch Nova’s The Deadly Deception episode, which interviews Jones and brings the book’s project up to date (circa 1993). We then took a day in class to generate a list of guiding questions that would help frame the next section of class. This all took about a week and a half.

As we prepared for the next section of class, we read the Introductions (and sometimes first chapters) of several books — Helen Verran’s Science and an African Logic, Kim TallBear’s Native American DNA, Priscilla Song’s Biomedical Odysseys, Warwick Anderson’s Colonial Pathologies, and Mario Biagioli’s Science Studies Reader. My goals in doing so were to give the students a sense of the breadth of possible topics that might be covered, as well as the shared mission of science studies scholars. We spent a lot of time talking about methods and the citational practices of each of the authors. Who was being cited, how were they being cited, and what specific articles, chapters, or books were being discussed?

That led us to generate a list of author names and readings that we used our guiding questions to shape. As much as we could, we relied on the Science Studies Reader to provide us with readings, which was easy for a lot of the most canonical content (e.g. Bruno Latour, Michel Callon, Donna Haraway, Sandra Harding, etc.). It also led us to select readings from the Science Studies Reader that I normally wouldn’t have picked, but which made sense given our guiding questions. Toward the end of this section, we again took a day to generate a list of updated guiding questions, which revised the earlier list with some nuance and added several new questions.

At this point, the students had their first assessment, a written Threshold assignment. Each of the Thresholds — and there were two more throughout the semester — posed a big question and asked the students to pair guiding questions to it in the search of an answer. Students were free to use any of the course materials from the preceding section by way of providing their answer, and could pair the guiding questions in any way they wanted to. Over time, they had more and more guiding questions to draw from, and at only one point did I ask them not to use a specific guiding question (because they had all used it once already) in a Threshold paper.

Since I had demonstrated to them how to go about finding readings associated with the guiding questions (through our citation tracking), the next section of the class was curated by the students. They picked guiding questions they wanted to seek answers to, and I paired them together based on their shared interests. In a larger class, the groupings would have been larger, and this section of the class would have lasted longer. Each student was responsible for presenting a reading to the class, and they could draw on any of the books we had on hand, as well as articles and book chapters they located through library searches. The result was way more diverse than I would have ever planned — they picked scholars, topics, and readings that I had never encountered.

The presentations — like all student presentations — were a mixed bag. But when I needed to, I intervened to keep things on track and get students to think about the connections between the stuff they had picked and the other course content. And throughout the course, I sometimes stepped into the role of lecturer, especially when they were encountering difficult content for the first time. This didn’t stop during the presentations, but I tended to use my interventions sparingly and definitely let students feel the pressure of being underprepared.

Because the group was rather small, the time I had set aside in the syllabus for student presentations was too much. As we approached the end of the student presentations, I asked the students what they were curious about and we generated a list of topics. It ended up resolving into a section on bodies as epistemic objects, and we covered a wide variety of kinds of bodies, from the microbial to the human to the planetary.

Opening the class to student curiosities — and supporting their labor — definitely resulted in a different class than I would have planned on my own. That said, in many ways the kinds of questions and answers the students generated were along the lines I had hoped they would be, but the ways they chose to get there varied (especially in terms of the readings they chose). A larger version of the same class would have probably been much more dynamic. I doubt such an approach would work for a class larger than ~25 students, but maybe…? If you experiment with classroom approaches like this, let me know — I’m really curious about how it might be refined.

January 28, 2019 / Matthew Wolf-Meyer

Diversifying the Network

In one of the first meetings I had with my dissertation adviser, Karen-Sue Taussig, she recommended that I read Catherine Lutz’s “The Gender of Theory” and “The Erasure of Women’s Writing in Sociocultural Anthropology.” (If you haven’t read them, go read them right now.) Lutz makes two interrelated points: despite the number of women working in sociocultural anthropology, they tend to get cited less frequently than men, and when they are cited, they’re cited as providing empirical evidence that supports an argument rather than theory that can be tested or employed. (And if you think that was a problem of the 1980s and 1990s, you can read the follow-up, “The Problem of Gender and Citations Re-raised in New Research Study” [although the link doesn’t seem to be working…] and then mull over what’s really going on in pieces like this.) At the age of 25, and a few years into my graduate studies, I might have been in just the right frame of mind for such an intervention. It resulted, immediately, in a hyperawareness of my citational practices — and shaped the kinds of questions and projects I wanted to pursue.

One of those projects has been steadily diversifying the network, both personally and professionally. In 2017, I was asked to comment on an early version of Nick Kawa, José A. Clavijo Michelangeli, Jessica L. Clark, Daniel Ginsberg, and Christopher McCarty’s “The Social Network of US Academic Anthropology and Its Inequalities,” and reading its final version was a stark reminder of just how much work is to be done. If you ever wanted evidence of that, here’s Kawa et al.’s data rendered in one handy image:


A network analysis of Ph.D. placements of tenure-track faculty based on where their degree originates and where they were hired. See more here.

Here are some practices to consider if you want to disrupt the reproductive tendencies of the discipline at every level. My guiding principle is that power is meant to be subverted, and whatever meagre institutional and reputational power I have should be used to make more inclusive social and institutional networks.

Every year when I’m pulled back to the American Anthropological Association meetings, I make sure that I participate in two panels. One has to include a majority of people who I’ve never been on a panel with before; and one has to include at least 50% recent Ph.D.s (or in-progress ones) and contingent faculty or “independent scholars.” Sometimes both of the panels meet both of the criteria. I’m not sure that I have much draw on my own, but whatever draw I have should be shared with less secure or established scholars than myself. Beyond that, I want to be exposed to ideas and research that I wouldn’t otherwise encounter. I can read my friends’ work any time, but curating a panel with strangers on a topic of my choice lets me engage with new content and publicizes it for others. It also means that my network grows in these AAA-related spurts, and I’ve watched my network permanently diversify over the years through this practice.

If you keep having the same conversation with the same people, something is wrong. Even if those people are diverse, if the network stabilizes, it’s not being as inclusive as it could be. It can be hard to exclude old friends from conferences, workshops, special issues of journals, whatever, but if the collective project is to diversify the network, they should be doing the same thing to you. And this leaves you open to be included in other people’s efforts. Stale networks are pretty obvious, both from the inside and the outside. My guiding rules are one place to start disrupting reproductive tendencies, and I’m sure that others who employ them will refine them into a system that works for their own circumstances.

If someone asks me to do something and I can’t, I suggest a junior scholar or someone at a non-elite institution (or both). If I can’t do something — a talk, peer review, a conference panel, whatever — I always try and make sure that I provide at least three names of people who might fit the role. My preference is always for people younger than me, although I’m very sensitive to my ability to say “no” and the obligation younger scholars feel to say “yes.” That said, I will commit to doing something, even if it means over-committing, if I know that the next person to be asked is someone who isn’t as diversity-focused as I am. Better a white person with an eye towards diversification than one who isn’t diversity focused (or at least that’s how I console myself).

I don’t just count citations; I also consider how a citation is being used. This is true for syllabuses and publications. I tend to start syllabuses by piling up books and articles that I’m sure I want to include in a class, and at that point I make sure that the foundation of the class is diverse (i.e. at least 50% books by women, with attention to minority status, ensuring that 50% of the books are also by authors from underrepresented backgrounds). After I put the rest of the syllabus together, I go through it and make sure that it’s diverse throughout. In cases where I have to include a dead, white, male writer, I make sure that the texts around that person are by other kinds of writers. I tend to make sure that at least 60% of a syllabus comes from contributors who aren’t white men. I also try and make sure that theory and evidence are supplied equally by all of the contributors to the syllabus. (If you think that teaching the canon means only teaching dead white guys [or living ones], just remember that it’s not in the canon if it hasn’t reached the point that non-white, non-male scholars are discussing it!)

In terms of publications, I tend to make a first pass through the manuscript citing as few people as I possibly can. Part of that is pragmatic — I don’t want to get hung up on inserting citations, and if there’s a lot of new stuff I’m planning to cite, I prefer to do all of the data entry and management during the revision process. But the other part is that I learned in the past that I over-cite. I would tend to cite too many things and then have to remove them to reach the word limit I was shooting for. I found that having to remove citations was harder than having to put them in afterwards, and that working this way helped to see who I really needed to cite. Moreover, it meant that when I was inserting citations, I could be more deliberative about who I was citing for what. Like with my syllabuses, when I do have to cite a dead white guy, I try to ensure that the citations around him are more diverse. And when I have to engage with a lot of white guys, it’s usually because I’m doing some critique…

All of these citational practices are aspirational, and I’m sure that not all of my publications meet the criteria I’ve set for myself over the years. That might be hypocrisy, but it’s also due to requests from peer reviewers and editors to cite certain work and the stark reality that working in some corners of academia means there are limited sets of scholars to engage with. The solution to the latter is to develop frames for one’s work that are capacious and bring in perspectives from feminism, critical race studies, disability studies, class-focused research (not just Marxism), and postcolonial studies. The solution for the former — sometimes — is to just not cite those people, despite requests (which gets easier to do with seniority).

One implication of Kawa et al.’s research for hiring committees is the need to make sure that the committee is institutionally diverse. One sure way to at least contest the dominance of particular departments in the placement of Ph.D. holders into tenure track jobs is to have people who aren’t from those institutions serving on hiring committees. If your department lacks people who fit these criteria, have a faculty member from another department serve in a non-voting, consultative role. I served on a committee like this years ago, and it was helpful because the person from outside of Anthropology couldn’t have cared less about the institutions that people were coming from, since his discipline had different elite institutions; he helped to focus other committee members’ attention beyond institutional backgrounds. If that sounds uncomfortable, you could have someone go through all of the applications and redact institutions, people’s names, and acknowledgement sections. (If there isn’t an Adobe macro for this, there should be…)
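For what it’s worth, the redaction idea is easy enough to sketch without Adobe at all. This is a minimal, hypothetical example — `redact` is a helper name I’m inventing here, not a real tool — and it assumes the application files have already been converted to plain text and that the committee maintains a shared list of names and institutions to mask:

```python
import re

def redact(text, terms, mask="[REDACTED]"):
    """Replace each listed term (case-insensitive, whole words) with a mask."""
    # Mask longer terms first, so "Stanford University" is caught before "Stanford"
    for term in sorted(terms, key=len, reverse=True):
        text = re.sub(r"\b" + re.escape(term) + r"\b", mask, text,
                      flags=re.IGNORECASE)
    return text

letter = "I completed my Ph.D. at Stanford University under Jane Doe."
print(redact(letter, ["Stanford University", "Jane Doe"]))
# I completed my Ph.D. at [REDACTED] under [REDACTED].
```

Redacting the PDFs themselves (so nothing survives in the underlying file) is a harder problem and really does call for a purpose-built tool, but even a rough text-level pass like this would let a committee read applications blind to pedigree.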

I’m convinced that underlying a lot of the resistance to change in the academy is a fear of being displaced in the present and the future, especially in the context of fears about the end of the tenure system and job scarcity. Wholesale displacement is unlikely, but some marginalization is inevitable; and that marginalization is relative to a century and a half of near-total dominance of the university by white, male voices. Incrementalism can get a bad rap, but when the allies in power are faced with their own potential obsolescence, a gradual approach can make important headway while ensuring that the threats to individuals are mitigated. Changing institutions is a long game, and keeping the end point in mind while addressing the concerns of the present is one way to ensure that change will come, however gradual it might be.

These practices are a start towards diversification. If you have other suggestions, post them in the comments or provide links.

January 15, 2019 / Matthew Wolf-Meyer

Introversion in the Age of Relentless Academic Self-Promotion

I am, by all accounts, a bit of an introvert. I’m awkward at groupings of more than four people and dislike parties of any size. I’ve made a career out of feeling more comfortable around books than people. And, in terms of my career, I’m nervous about every first day of class and don’t like talking about my research and writing in contexts larger than a group dinner (and even then constantly worry about talking too much). I can be comfortable — sort of — in the role of lecturing, giving a colloquium talk, or doing fieldwork, where it’s not about me or I have a performative frame that I’m comfortable with. Even writing this is difficult; if you look at the archives of what I’ve written about in the past, even the personal stuff is pretty clinical in its detachment.


That’s a beaver hiding in a lodge. I’ve borrowed this from Cheryl Reynolds and Martinez Beavers.

But I started thinking about my introverted tendencies lately because I found myself writing about myself in two book manuscripts in ways that I hadn’t previously. The first, about neurological disorders, had me brushing up my personal history with dementia and sensory and speech impairments. The second, about speculative fiction and social theory, had me dipping into my past to think about what I was reading, where I was reading it, and why it made sense in its moment. Those experiences, in turn, got me thinking about the intimacy of getting to know an author through the written word, a relationship that’s uneven, but something that I have enjoyed as a reader throughout my life. In the age of social media, that relationship-building seems more apparent on Twitter than it does in books and articles, as the demands of neoliberal self-presentation heighten self-promotion in one medium and performed objectivity in the other.

Around the same time, I was watching an academic friend using social media to promote a forthcoming book. He tweeted several times each day, on and off the topic of the book, and garnered tons of likes and DMs. Knowing him personally, I knew he was more like me than not — a bit of an introvert, but more seasoned through decades of experience. He was able to overcome those introversion tendencies to engage — at least unidirectionally — with interested readers. I bought the book — and I’m sure a lot of his other followers and friends did too — but I’m not sure that I would have pre-ordered it if it hadn’t been for his relentless use of social media leading up to its publication…

Which, in turn, had me thinking back to a conversation with a publicist when my first book came out. She recommended that I open a Twitter account, set up a Facebook author page, and tweet five times each day and post on the Facebook page once daily. I think I managed that for about a week or two before I couldn’t bear to do it anymore. It took me years to not feel bad about failing my publicist in that way, and years more to feel comfortable using blogging as a substitute for the publicity machine that she suggested.

Part of my being okay with my failure at self-promotion is that — like so many academics — I’m not in it for book sales. But I am in the profession for the conversation (and, since I’m what psychologists call “disagreeable,” the arguments too, but those seem harder to find). So when I find myself drawn to self-promotion (usually through Twitter, since self-promotion seems to be 50% of the medium and we’ve collectively agreed to that), it’s usually because I so want to have a conversation about something I’ve just published.

After six months of thinking about this post, I finally committed to write it — not because I felt like I had profound insights into my introverted self and how to manage an ethical and sensible web-presence, but because I didn’t. (The one tiny bit of advice I’ll share, and this is from Jean Langford, is to “pick your fidget” — if you’re going to fidget during a presentation or while teaching because it relieves stress, just pick one thing to fidget with. I empty my pockets and get rid of any easy distractions just in case, which means my fidgeting is usually confined to moving my feet in particular ways.) But I know I’m not alone. So many people are drawn to academic work precisely because they’re introverted, and based on what I can tell from the internet, they seem to deal with the same challenges that I do, albeit with variations. On some level, it’s possible to succeed as an introvert in the academy (since, in a field full of introverts, being able to manage one’s introversion is a real asset). On another level, maybe it goes to show that among relatively introverted professionals, a little performed extroversion goes a long way.

That said, whenever I do the obligatory self-promotion that comes with an article, a colloquium talk, a conference presentation, a book, a podcast — whatever — I still struggle with being uncomfortable about talking about any success, however marginal. Part of that discomfort is based on a deep knowledge of the contingency of any success — from growing up relatively privileged, to the luck of the draw with peer reviewers working out for me, and everything in-between. There’s no mitigating those benefits, other than through their subversion wherever and however I can.

I’ve also come to realize that building intimacy between authors and readers is not just a mechanism for selling books and driving downloads of an article, but also a necessary political praxis. So often it is women, people of color, and minorities who are compelled to give an account of themselves, and white, heterosexual men sit silently by. I have always been interested in the institutional ethnography of the US academy, which this blog has represented, but I’m increasingly interested in the affective qualities of the institution and how it shapes people (hence a new little book about peer review [which, I guess the mention of is a little self-promotional]).

I’m still struggling with wanting to delete this whole post. I’m going to accept that as an indication that I should do the opposite.

October 16, 2018 / Matthew Wolf-Meyer

What’s it like to be an Associate Professor? (Research University version)

Several years ago, I tried to sum up the perspective I had gained on being an assistant professor at a research university. I attempted to capture all of the things I either wasn’t told in graduate school or didn’t have a real grasp on until I was on the tenure track — and they were largely behind-the-curtain, what-the-job-is-actually-like details, including lots of meetings, emails, and teaching prep-work, alongside the demands of publishing and other scholarly activity. What does the job look like on the other side of tenure? Mostly the same, but there are some important differences.


Sisyphus and his beloved rock (representing institutional demands post-tenure)

The ink was barely dry on my tenure contract when I was asked by my dean to serve in an administrative role. It was probably about two weeks between when I was notified of the administrative approval of my tenure and my dean’s request, which made it kind of difficult to say no (but it was really compelling administrative work, so I probably would have said yes anyway). In a nutshell, if the road to tenure is largely anticipatory and structured by tenure demands, the associate professor road is characterized by managing social relationships — with other faculty, with administrators, with students, with colleagues at other institutions, with journal and press editors, with university bureaucrats — many of whom helped on the road to tenure, and are now calling in their debts.

“Debt” seems like a slight mischaracterization, but “favor” also seems too light. These activities include peer reviewing for presses or journals (especially those you’ve published with) and reviewing grant applications, which I had been doing pre-tenure, though after tenure there was a significant increase in requests. They also include serving on committees at departmental, university, and national organization levels — and being asked to do that rather than volunteering for it. In addition, there’s serving on ever-more dissertation committees as well as doing tenure review for colleagues at other institutions. Singularly, they don’t seem like much, but taken together, they can be time consuming — and for some faculty, they seem to provide a trajectory while they figure out what their path to full professorship looks like. This isn’t to begrudge these various debts and favors and the people attached to them, but just to note how they pile up — and continue to pile up — and to recognize that strategies need to be developed to handle them maturely.

I’ve been thinking about the non-arrival of tenure for years. After all the stresses around tenure and its quasi-mythological nature, when it happens, it’s actually slow and drawn out, which makes it less of a rite of passage and more of a long, bureaucratic process (which is what it is). Between the department vote, the dean’s approval, the university personnel committee, the chancellor or provost, and the president (and regents), there are a lot of not-final approvals (i.e. it’s still not a sure thing). At each stage, the appropriate administrator makes sure to tell you congratulations, but it’s not over yet. And by the time the tenure contract arrives, it has both come to feel inevitable and had some of the wind taken out of its ceremonial sails.

At least that was my experience. Some of my feelings about the tenure process may be due to the fact that I met my institution’s tenure requirements early and was able to go up for tenure a year ahead of schedule. I’m sure that for people who are in more precarious positions, tenure might come as more of a relief. But in any case, the non-arrival of tenure is also about what happens after tenure.

The assistant professor period is characterized by the project that is getting tenure — there’s real momentum around publishing the requisite amount of stuff, and there’s a deadline. But between associate and full professor, there’s not usually a deadline even when there are clear expectations (usually doubling whatever it took to get tenure in the first place). Sabbatical is meant to serve as a research period, but it seems like most people use it as a period to reconnect with family and catch up on what they missed during the march to tenure. And unless you’re primed to get to work on the next big research project immediately, it can be hard to use sabbatical to its full effect. For better and worse, things slow down after tenure for a lot of people.

Part of that slow down is bureaucratic and an effect of the job and its duties. Part of it is also just straight up existential. After being told that tenure is the most magical thing in the world, the reality is that the job doesn’t fundamentally change — and, in many respects, there’s less time for research and writing when all of the other institutional demands are factored in. Teaching might be at a new plateau — you might get to the point where you’re teaching the same classes with few or no revisions and can autopilot them. But in my case, I was tired of teaching the same classes and needed to change things significantly (I moved from teaching mostly medical anthropology courses to teaching more general intro and social theory classes). And that meant spending more time prepping classes than I had in several years.

I had started working on my second book before I finished my first one, but like so many people’s second projects, it didn’t work out quite the way I had planned (insufficient funding, lack of dedicated time, difficulty with the IRB, etc.), leading to some redevelopments in the project and some slow down. Because it was a significant shift in focus, it also needed some time devoted to developing new contacts, reading new stuff, and just thinking through the problems of the project — all of which started during the assistant professor phase, but couldn’t really take off until after tenure. I also had a second child, changed institutions, and moved across the country, all of which slowed things down too. For some people, any of those events or aforementioned difficulties might have led to abandoning the project and starting over from scratch with something new — so I can understand why some people take a long time between getting tenure and going up for full professor.

But here’s the other thing: if tenure is marked by its non-arrival, full professorship is marked by its deferral. The difference between associate and full is largely administrative (yes, there’s a pay increase, and maybe there’s some prestige?), meaning that most associate professors are protected from being department chairs or serving as associate deans or conveners of university committees. For some people, it seems like the pay increase isn’t worth the trouble — which is compounded by the difficulties people face in getting a second project off the ground. Maybe we need better incentives, but more likely, we probably need better support to help people have time, money, and space to develop new projects.

This isn’t to diminish tenure — it’s important job security and helpful to have the bandwidth to explore new ideas and projects — but to point out how it isn’t a panacea. In some respects, tenure is integral to the kinds of favors that need to be returned (i.e. with tenure, you can be asked to do things that you can’t be asked to do without it). But the job is the job, and that fundamentally doesn’t change with tenure.

So what do I wish I knew about the post-tenure phase?

1. If you can change when you take sabbatical, wait until you know it will be immediately useful and not for preliminary work (like seeing if a project is viable). Make sure that you have the funds and access to get the data you need, and then use sabbatical time to make it happen.

2. Change your teaching only enough to freshen it up, unless you're committed to finding a new niche in your department's curriculum and spending a year or two doing so.

3. Be selective about what you agree to do. Sometimes opportunities quickly become obligations, and there will be plenty of both. It’s okay to say no to an invitation, especially because there will be more coming.

4. Develop small projects that can result in an article or a book chapter. These might be collaborations with graduate or advanced undergraduate students, experiential learning classes, or cannily constructed classes. Steady projects like these help to allay the big existential dread that might present itself in the absence of a second book-length project.

Post-tenure is a strange place. Prepare yourself.

April 10, 2018 / Matthew Wolf-Meyer

Start Your Own Professionalization Series

When I first started running a professionalization series for graduate students in Anthropology at the University of California, Santa Cruz, my plan was to create a curriculum that could be revisited annually (or biannually, in some cases), and could be arranged modularly. On the quarter system, it meant that we had 10 weeks for each term, and I hosted a professionalization seminar every other week (usually four each fall, and three each winter and spring). The idea was to develop the content and then rearrange it as student demand dictated.


In its earliest version, it roughly followed the job market, starting with a session about looking for jobs in anthropology, then writing job letters, a day spent talking about academic CVs, and then a section devoted to practicing conference presentations (since that lined up with the annual meeting of the American Anthropological Association). The winter term was generally focused on teaching, and we had seminars on writing syllabuses, developing assignments, and dealing with problem students. The spring term was devoted to academic writing, and I broke a discussion of writing academic articles into two meetings, the first on the content and form of journal articles, and the second on identifying the right journal to submit an article to. The spring term was a little more of a wild card, and over the years it had sessions on alt-ac careers, translating a dissertation into a book, and preparing for campus visits. What I didn’t cover — and this was because we had a dedicated course for it — was grant writing.

After the first couple of iterations, the students asked that we move some of the job preparation sessions to the spring, so that they could have some more time to prepare for the fall job market. The only challenge was that there weren’t many — if any — job advertisements that we could talk about, so I had to rely on ads I had saved over the years (most of which were quirky, which is why I saved them).

If you know this blog, then you know that most of the posts originated in the conversations that I had with students, and that, over time, I worked to summarize those conversations in the blog posts (or at least my side of the conversation). Once the posts had been written, I asked students to read them in preparation for our meetings, so that we could start with a shared basis for the conversation. That helped to move from meetings where I spent a lot of time seeing what I thought (following E.M. Forster’s “How do I know what I think until I see what I say?” dictum) to meetings where we could have more of a free-flowing conversation about the topic.

Over the years, I started to invite faculty to attend the sessions as well, usually trying to identify either a foil to my perspective (someone from a different generation or subfield, or both) or someone who I knew had a special interest in whatever the topic was. I also made sure to advertise the topics in advance so faculty could attend the sessions that related to their specific professionalization interests. If you're inviting faculty, it's really helpful to identify people who have been on the job market recently as well as people who have served on hiring or promotion committees. Those two perspectives — what the job search is like from the applicant and reviewer positions — are really demystifying. Because what hiring committees are looking for has been changing, it's helpful to have people who are also familiar with promotion requirements, since those have a trickle-down effect on search committees.

The one thing I tried to do and was never successful at was getting the faculty who taught the first-year foundations seminars to build the stuff we were doing in the professionalization seminar into their courses (e.g. having students turn in a CV after we talked about them in the seminar). I couldn't even swing it when I was teaching in that sequence, so I don't blame anyone for that failure — but it would take an extra level of coordination. I think it's a good idea, but difficult to manage with the changing content of the professionalization seminar and a changing cast of faculty teaching the foundations sequence. My hope was that getting students into the professionalization seminar early would serve to socialize them to the need for professionalization (instead of waiting until they were going on the job market).

So, here’s a plan, boiled down:

Come up with a curriculum. You can follow my Professionalization Materials checklist or come up with one of your own.

Schedule the series. It's helpful to plan a whole term or semester at a time and to send out the calendar as early as you can. I found that 1.5-hour meetings worked well, although there were times that we could have talked for hours. And planning meetings over lunch made it possible for most people to attend. Please don't schedule meetings that conflict with childcare or fall outside the 9-5 workday (stop academic self-exploitation!).

Invite faculty guests to share their experiences. Try for two for each seminar (one may be enough if the convener of the seminars is a faculty member), with no more than three faculty in the room (all of whom were specifically asked to be there). Too many faculty perspectives can be confusing. And make sure that faculty know that they don't have to prepare anything for their visit — they can just show up and answer questions.

Collect relevant materials. It’s helpful to get faculty to contribute CVs, job letters, teaching and research statements, syllabuses, grant applications, etc., to a common folder that graduate students have access to through the university or department. Some faculty can be a little squeamish about this, and in those cases I asked for paper copies that I copied and distributed.

Find some good websites to help guide the conversation. There’s no shortage of people chiming in on these concerns, and toggling between disciplinary and generational perspectives can be really helpful. And since you’re already here, this doesn’t count as self-promotion.

Keep it going. Sometimes there's a lot of anxiety in a department, and that translates into momentum for this kind of thing. But the triage model doesn't actually address the ongoing need to professionalize students for the job market. Just having a regular space and time where students can talk about this stuff is really helpful and can improve morale while demystifying the job market. If you can also provide cookies, then even better.


March 23, 2018 / Matthew Wolf-Meyer

So You’ve Got a Ph.D. in Anthropology…

Okay, let’s assume the worst. You’ve spent the last 5-10 years focused like a laser on researching and writing your dissertation. You haven’t attended any conferences, let alone presented any of your research. You haven’t taught any courses, and maybe haven’t even served as a teaching assistant. You haven’t published any of your research. When you’ve talked to your committee members about preparing for the job market, they’ve dismissed your concerns, telling you that “everything will work out” and “you don’t need to worry about it.” And now here you are, Ph.D. in hand, dissertation behind you, and the yawning chasm of adult professional life gaping before you…


That’s a cat abyss.

So, the bad news: unlike in the fabled world of yesteryear, there are no jobs waiting for you now that you’re credentialed. The slightly better news: there’s a lot you can do to make yourself appealing to potential employers, in and out of the academy. And the best news? There are many possible professional paths your Ph.D. has prepared you for, but it will take a lot of work to prepare to apply for those jobs.

Now, hopefully you already know all of this, and you’ve spent the last several years preparing yourself for the kind of job you are interested in, whether in the university system or outside of it (or maybe both). One of the best things you can do as you prepare to go on the job market — say a year or two out — is to start paying attention to job advertisements. Ads list the kinds of qualifications that they’re looking for, from specializations (teaching methods and intro, especially, as well as courses in your area) to the kinds of documents that they require (graduate transcripts, teaching statements, CVs or resumes, writing samples, syllabuses, and more).

If you’re looking at jobs outside of the university system, employers will likely be looking for other kinds of documentation: white papers, policy documents, portfolios, etc. The best preparation for these paths is to complete coursework in relevant areas, if not earn an additional credential (like a Master’s degree in Public Administration, Public Health, or Museum Studies, for example). Beyond acquainting you with professional genres, these programs should also help you develop professional networks in the right areas. If you’re really on the ball, you can complete these degrees alongside your coursework for your Ph.D., using whatever tuition waiver you might have to cover the cost of enrolling in these complementary courses. But you might also have to go back to school to get these credentials and develop new professional networks.

In 2015 I ran a professionalization workshop with current graduate students and alumni who had moved into non-academic careers. The collected professionals painted a relatively positive picture of moving into careers outside of the university. One of them pointed out that in the university system, almost everyone has a Ph.D., making it a rather banal credential; outside of the university, a Ph.D. is treated totally differently, and coworkers respect the expertise that it represents. They also pointed out how a lot of Ph.D. students think that they’ve been deskilled, but that the kinds of skills one learns as a graduate student can be translated into desirable workplace competencies. “Teaching” is “Public Speaking,” “Writing a Dissertation” is “Copyediting” and “Research-based Writing,” and “Finishing a Dissertation” is “Research Management.” Then there are skills like grant writing and data management (all that data sorting and labeling) that you might have picked up along the way. After spending thousands of hours researching and writing a dissertation, you have more skills than you might think; but, as these professionals helped to make clear, academics don’t tend to think of all of the skills they have as marketable proficiencies.

That said, there might need to be an intermediary step to hone those proficiencies into actual skills that produce deliverables that employers can do something with. And there are opportunities, like the Congressional Graduate Recruitment Program, that seek to do just that. There are also organizations, like RAND, that hire qualitative researchers — as do many of the think tanks and policy organizations in Washington, D.C. (you just have to be cool with doing research on topics that other people determine, and, potentially, stomach their politics as well).

Over the last several years — thanks to the Great Recession and the blogosphere — there has been growing attention to the exploitative and demoralizing political economy of adjunct labor in the university system. If you’re not aware of what people have been saying, it boils down to this: there are fewer full-time, tenure-line jobs than there were before, and to meet enrollment demands, more part-time laborers are being hired to offer courses. These adjuncts tend to teach lots of courses on limited contracts (one semester or one year) for little pay — often a fraction of what faculty are paid for the same course. Sometimes, rather than adjunct labor that looks like this, it can come in the form of a Visiting Assistant Professorship or even a postdoctoral position, both of which may be better, since they likely offer benefits on top of the salary. Beyond the obvious bummer of being paid less for the same job, and having lots of work to do — which is often very intro-level focused — adjuncting can create a lot of anxiety as people struggle to secure future employment, and there’s not a lot of time to produce the stuff that would make one appealing for full-time positions (which is sometimes referred to as the “adjunct trap”). If you find yourself in the adjunct trap, you’ll need to engineer a way out, which may mean refusing work in the short term in order to prepare yourself for a long-term career (which may mean going back to school to get another degree).

If you’re interested in pursuing alternatives to academic careers (sometimes shortened to “alt-ac”), there are growing resources to help Ph.D. recipients navigate the divide between the university and other career possibilities. One of the challenges that job seekers face is confronting assumptions about what policy jobs or market research look like on an everyday basis. If you’ve spent the last however-many years essentially being your own boss and working alone, many professional careers mean being managed and working with actual coworkers. Friends who have made the jump into these kinds of jobs report being surrounded by great, creative people — just as smart as any academic, but skilled differently.

So, hopefully you’re reading this as an early graduate student (or before enrolling in graduate school) and can take the time to plan appropriately. A recent conversation hosted by the Society for Cultural Anthropology on academic precarity got me thinking that it was about time for a post like this, since the initial foray on the topic made it seem as if it was a real shock that getting academic jobs is hard and that there are very few out there. I had also heard from a friend that she had witnessed a group of recent Ph.D.s openly fret about the prospects of going on a bleak academic job market since they hadn’t been prepared for the reality of it, and thought they might just jump into some alternative career — which is sure to be a rude awakening.

You might beat the odds and get one of the few academic jobs available in any given year. If so, remember that just as much as it’s about your hard work, it’s also about contingency, and that it’s important to make sure that the world of the university is more open to promoting diversity and fair hiring practices — as opposed to yesteryear (which largely favored white men from elite institutions). For everyone who works in the university system — faculty, staff, administrators, adjuncts, students — we need more attention to what makes desirable job candidates and support for creating more full time employment opportunities.
