
growing up with computers

Ethan Zuckerman’s review of Kevin Driscoll’s The Modem World: A Prehistory of Social Media made me think back to my own early years and how I experienced computers and digital culture. I was never an early adopter or a power user, but I grew up in a college town at a pivotal time (high school class of 1985). As a nerd, I was proximate to the emerging tech culture even though I inclined more to the humanities. I can certainly remember what Ethan calls the “mythos of the rebellious, antisocial, political computer hacker that dominated media depictions until it was displaced by the hacker entrepreneur backed by venture capital.”

  • ca. 1977 (age 10): My mom, who’d had a previous career as a statistician, took my friend and me to see and use the punchcard machines at Syracuse University. I recall their whirring and speed. Around the same time, a friend of my aunt owned an independent store in New York City that sold components for computer enthusiasts. I think he was also into CB radio.
  • ca. 1980: Our middle school had a workstation that was connected to a mainframe downtown; it ran some kind of simple educational software. The university library was turning its catalogue into a digital database, but I recall that the physical cards still worked better.
  • 1982-85: Several friends and I owned Ataris or other brands of “home computers.” I remember printed books with the BASIC code for games that you could type in, modify, and play. We wrote some BASIC of our own–other people were better at that than I was. I think you could insert cartridges to play games. The TV was your monitor. I remember someone telling me about computer viruses. One friend wrote code that ran on the school system’s mainframe. A friend and I did a science fair project that involved forecasting elections based on the median-voter theorem (a rough sketch of that idea appears after this list).
  • 1983: At a summer program at Cornell, I used a word processor. I also recall a color monitor.
  • 1985: We spent a summer in Edinburgh in a rented house with a desktop that played video games, better than any I had seen. I have since read that there was an extraordinary Scottish video game culture in that era.
  • 1985-89: I went to college with a portable, manual typewriter, and for at least the first year I hand-wrote my papers before typing them. The university began offering banks of shared PCs and Macs where I would edit, type, and print drafts that I had first written by hand. (You couldn’t usually get enough time at a computer to write your draft there, and very few people owned their own machines.) We had laser printers and loved playing with fonts and layouts. During my freshman year, a friend whose dad was a Big Ten professor communicated with him using some kind of synchronous chat from our dorm’s basement; that may have been my first sight of email. A different dorm neighbor spent lots of time on AOL. My senior year, a visiting professor from Ireland managed to get a large document sent to him electronically, but that required a lot of tech support. My resume was saved on a disk, and I continually edited that file until it migrated to this website in the late 1990s.
  • 1989-91: I used money from a prize at graduation to purchase a Toshiba laptop that ran DOS and WordPerfect; I wrote my dissertation on it. The laptop was not connected to anything, and its processing power must have been tiny, but it had the same fundamental design as my current Mac. Oxford had very few phones but a system called “pigeon post”: hand-written notes would be delivered to anyone in the university within hours. Apparently, some nerds at Cambridge had set up the world’s first webcam so they could watch live video of their office coffee machine, but I only heard about this much later.
  • 1991-93: My work desktop ran Windows. During a summer job for USAID, we sent some kind of weekly electronic message to US embassies.
  • 1993-95: We had email in my office at the University of Maryland. I still have my first emails because I keep migrating all the saved files. I purchased this domain and used it for static content. My home computer was connected to the Internet via a dial-up modem. You could still buy printed books that suggested cool websites to visit. I made my first visit to California and saw friends from college who were involved in the early dot-com boom.
  • 2007: I had a smart phone and a Facebook account.
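
For anyone unfamiliar with the median-voter theorem mentioned above, here is a minimal sketch of the idea behind that forecasting project–written in Python with made-up names and data, not a reconstruction of the original program. The theorem says that when voters’ preferences are single-peaked along one left-right dimension, the candidate whose platform lies closest to the median voter should win a two-candidate majority vote.

    # A hedged, hypothetical illustration of the median-voter theorem (not the
    # original science-fair program): with single-peaked preferences on one
    # left-right dimension, the candidate closest to the median voter is
    # forecast to win a two-candidate majority vote.
    import statistics

    def forecast_winner(voter_positions, candidate_positions):
        """Predict the winner as the candidate nearest the median voter."""
        median_voter = statistics.median(voter_positions)
        return min(candidate_positions,
                   key=lambda name: abs(candidate_positions[name] - median_voter))

    # Made-up electorate on a -1 (left) to +1 (right) scale
    voters = [-0.8, -0.3, 0.1, 0.2, 0.4, 0.5, 0.9]
    candidates = {"Candidate A": -0.4, "Candidate B": 0.3}
    print(forecast_winner(voters, candidates))  # prints: Candidate B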

It’s always hard to assess the pace of change retrospectively. One’s own life trajectory interferes with any objective sense of how fast the outside world was changing. But my impression is that the pace of change was far faster from 1977 to 1993 (from punchcard readers to the World Wide Web) than it has been since 2008.

revolutionary art without a revolution: remembering the eighties

When the 1980s began, I was a nerdy little white boy in middle school in the rapidly de-industrializing Rust Belt city of Syracuse, NY. When it ended, I was a grad. student in England, but I had lived in New Haven, London, Florence, and New York City. I was interested in classical music and the history of (European) philosophy and was pretty much the opposite of hip. However, I walked around with my eyes and ears open, and my friends were less nerdy than I. So I went in tow to venues like CBGB or Dingwalls. Much more often, I rode graffitied subway cars or watched breakdancers with boom boxes.

Two recent exhibitions have brought back the aesthetics of that period and helped me to understand it a bit better.

Mike Kelley and Jim Shaw were a decade older than me and from further north in the Rust Belt (Michigan), but I recognize the world they grew up in. They collected doodles drawn in ball-point pen on lined paper while the teacher wasn’t looking, fundamentalist tracts, album covers, semi-professional local ads, cable-access shows, comics, sci-fi paperbacks, D&D manuals, second-hand children’s book covers, toy packages from the dime store, pinups, and posters for high school plays. They imitated that material and mashed it together in their gallery art and for the stage performances of their punk band Destroy All Monsters.* I got to see samples of their work in “Michigan Stories: Mike Kelley and Jim Shaw” (MSU Broad Museum).

Born 6-8 years later than Shaw and Kelley, but famous when he was very young, Jean-Michel Basquiat mashed up Gray’s Anatomy (the book), old master paintings, documents from Black history, graphic symbols, sci-fi, jazz album covers, expressionist and pop art, found objects, and graffiti to make his groundbreaking work, which is featured in the Boston MFA’s “Writing the Future: Basquiat and the Hip-Hop Generation.”

Basquiat’s drawings and paintings are very striking, but it’s possible that the music videos steal the show. In Blondie’s “Rapture” (1981), which you can watch any time on YouTube, Basquiat is the DJ because Grandmaster Flash failed to show up for the filming. As Debbie Harry switches from punk to rap, she sings:

Fab Five Freddy told me everybody’s fly
DJ’s spinnin’ I said “My My”
Flash is fast, Flash is cool
François sez fas, Flashé no do

That Haitian Creole must be for Basquiat. Harry was the first person to purchase one of Basquiat’s works. It was news to me how closely punk and rap were intertwined.

Six years before this video, New York City had narrowly averted municipal bankruptcy. The subway had the highest crime rate of any mass transit system in the world and suffered from severe maintenance problems. A big part of the reason that graffiti artists could live in squats in lower Manhattan and paint whole trains was the economic crisis of the city. Meanwhile, the US auto industry that had sustained both urban Michigan and my Upstate New York hometown was shedding jobs. Between 1978 and 1982, 43% of automotive jobs (about half a million positions) were lost. No wonder Harry sings:

You go out at night, eatin’ cars
You eat Cadillacs, Lincolns too
Mercurys and Subarus
And you don’t stop, you keep on eatin’ cars

“Rapture” was filmed in the deep recession year of 1981, when the Dow was down along with the rest of the economy. But as the decade progressed, markets rebounded and the culture celebrated finance—more, I would say, than industry or small business. It was the decade of Wall Street. And Wall Street’s Zuccotti Park is just 2.4 miles from Tompkins Square Park, the center of the bohemia portrayed in “Rapture.”

Basquiat’s art is explicitly anti-capitalist. I assume that artists who covered whole subway cars with their work considered the government that owned those trains as basically illegitimate and proposed a different form of ownership. Yet Basquiat started to make a lot of money in Manhattan gallery shows. Several of his close associates also moved from the economic margin to the center of the economic universe. For instance, in 1983, Basquiat and his girlfriend Madonna lived together in the Venice, CA studio of the art dealer Larry Gagosian (later known as “Go-Go” for his business acumen). Madonna was a legitimate member of the same bohemia as Basquiat, but she was on her way to selling 300 million records as the Material Girl. Even “Rapture,” which depicts a bunch of East Villagers who wouldn’t have a lot of money in their pockets, was beamed into millions of suburban rec. rooms through MTV.

Race was another dynamic. In places like Syracuse, Black/white racial integration reached its historic high. The school district implemented an ambitious desegregation plan. The ratio of African Americans to whites in the city’s population was also more balanced than it is in today’s “hyper-segregated” metro area. Syracuse has lost 35% of its population since 1950 in a process of suburbanization and re-segregation that was just getting started in the ’80s. Kelley and Shaw were white, and their musical genre was punk, but you can observe them admiring their Black counterparts from close up. Basquiat became famous in a predominantly white world but remained socially very close to Black and Caribbean New Yorkers. There was money to be made packaging rap for white teenagers, and money to be made subverting Reagan’s America in art or music.

A hostile critic would charge the ’80s bohemians with hypocrisy or even nihilism. (Those trains didn’t belong to them; most citizens preferred a subway without graffiti.) But I see pathos. This was revolutionary art without a revolution, an expression of left radicalism at a time when the deep cultural movement was rightward.

*This paragraph is self-plagiarized from “Mike Kelley, Jim Shaw, and memories of Rust Belt adolescence.”

some notes on receiving tenure

This week, the Tufts Trustees voted to grant me tenure and make me a full professor. I am very grateful to them, the Political Science Department (which is my tenure home), the Tisch College of Civic Life (which will remain my main base) and its dean, Alan Solomont, and the other people–known and anonymous to me–who were involved in advancing and reviewing my case.

I have been working full-time in universities (Maryland and then Tufts) since 1993. However, I don’t believe I should have held tenure until now. Tenure means job security for teaching. It’s a way of protecting instructors’ intellectual freedom. Until now, I have never held a teaching position. More years than not, I’ve taught at least one credit-bearing college course, but not as part of my paid employment. Instead, my salary has come from external sources (philanthropy, contracts, and–in the early years–a state appropriation). These funds have supported my colleagues and me in serving external constituencies with research and organizing. That kind of work must be contingent on funding and performance, or it would turn into a sinecure.

So really the big change in my life is that I will be teaching virtually full time, starting in 2019-20. One motivation is our new Civic Studies major at Tufts, which is a significant commitment of mine. The major is also part of a more general strategy of making the study of civic life a core academic focus at Tufts–another personal commitment of mine and a priority for Tisch College. At the same time, I am looking forward to the role of a teacher/individual scholar, because that should allow me to explore certain topics more deeply than I have so far–mainly, topics in political theory.

My career is unfolding in backwards order. My degree is in a humanities field, philosophy. Humanists usually start by teaching alone and doing single-authored research: in short, reading, writing, and presenting. Some of them gradually begin leading departments, serving on committees, planning conferences, conducting collaborative and interdisciplinary research projects, and interacting with publics.

I started in an externally-funded center within a state school of public policy, where our work was applied, interdisciplinary, collaborative, and done in public. From early in my career, I was heavily involved in working with other people. I was rarely in a classroom but almost constantly on the phone or email, communicating with peers. My most significant publications were co-authored; the Civic Mission of Schools report lists 60 authors.

I will not give up that kind of work, but I do plan to spend more time teaching and conducting individual research. If this backwards order makes any sense intellectually, the advantages will be: 1) breadth–I never sought tenure in a discipline that would have expected me to demonstrate deep specialization, but I had to learn a bit about a lot of things–and 2) experience in how knowledge, power, money, networks, and organizations relate to each other in the 21st century. I’m hoping to make that second topic a focus of my research.

The immediate plan is to keep doing the collaborative work that I’m doing now (so don’t be alarmed if you are a collaborator) while developing several new courses in 2019-20. I have completed a book manuscript that is under review, so if that goes reasonably smoothly, I will feel free to focus a lot of attention on curriculum and pedagogy during the next academic year. I also have a sabbatical coming up, and I plan to spend that time learning network science and continuing to collect network data of different kinds, toward one or two books on networks and political/moral thinking.

It’s very rare for someone to switch to the tenure track after 26 years in the business. It’s like lifting a heavy locomotive and putting it down on different rails. Tufts has been tremendously supportive, flexible, welcoming, and creative in making this possible in my case. I feel a deep sense of gratitude and loyalty to this institution and my colleagues and students.

on hedgehogs and foxes

“The fox knows many things, but the hedgehog knows one big thing” — Archilochus

This proverb has been in the news lately because Philip Tetlock has shown that foxes (flexible and curious generalists) are much better at predicting events than hedgehogs (specialists with deep expertise). See David Epstein’s Atlantic article on Tetlock, and see Axios for a current competition funded by the US intelligence agencies to test his theories.

Tetlock draws from Isaiah Berlin’s 1953 essay, which is light but offers some insights, I think, about specific authors. Berlin argues that Tolstoy was psychologically a fox but believed–for theological/ideological reasons–that we should all be hedgehogs. Our one big idea should be the Imitation of Christ. This tension was at the heart of Tolstoy’s books and life. I also endorse Peter Hacker’s view that Wittgenstein was temperamentally a hedgehog who forced himself self-consciously to become foxlike in his late work.

If you take the proverb literally, it seems more impressive to be a fox. The fox uses its brain to hunt and escape, whereas the hedgehog just instinctively rolls up to take advantage of its best physical asset, its spines. But the metaphor is loose. Human hedgehogs are among our deepest, most original thinkers. They are the ones with the discipline to construct whole, coherent worldviews. They don’t merely employ a strategy but create it.

In contrast–and I write this as very much a fox–foxes can be ad hoc and derivative, eclectic in a bad way. A fox can employ the available ideas that seem to fit the situation without generating any new frameworks for others to use. A fox can be a jack of all trades, master of none. We foxes need hedgehogs to develop new ways of thinking, from which we borrow superficially and pragmatically.

But it is interesting that the hedgehogs are so consistently wrong about what will happen next. They are more likely to suffer from confirmation bias. They can make any data fit their theory. And they are worse than foxes at recognizing exceptions, tradeoffs, and zones of uncertainty. They lack phronesis, practical wisdom.

I therefore think it’s a problem that hedgehogs have an advantage in the competition for attention. If you are associated with one big idea and you keep hammering away at it, you have a “brand.” People turn to you to say that one thing, even if they don’t agree with it, and so your fame rises. You must compete with the other people who say the same thing, but if you’re first or more effective at communicating it, you can own the space.

So as not to offend anyone alive, I’ll use the case of my late colleague Ben Barber, who was early to revive the idea of “strong democracy.” (More democratic engagement is always better; the good life is lived in public; liberalism is too individualistic; etc.) He wrote several best-sellers, and I attribute his success in part to his capturing a particular brand. For courses, debates, conferences, etc., you may need someone to say, “More democracy!” Barber cornered that market.

Temperamentally, I am with the foxes. As soon as I write an argument for anything, I immediately become fascinated by the arguments against it. I have a limited attention span and jack-of-all-trades tendencies. I frequently disappoint practitioners and advocates, who know that I have written in favor of campaign finance reform, public deliberation, service, or civic education and want me to say it again to a new audience with more conviction. In fact, I am almost always on the verge of apostasy and retraction.

I really do admire the hedgehogs. But I’ll say a few things in favor of foxes.

First, the moral world is immensely complex, because it emerges from myriad human interactions and takes the form of communities, cultures, and institutions that overlap, interrelate, and become loaded with historical resonances. Thus an adequate moral theory is almost certainly partial, inconsistent, and ad hoc.

Second, acting like a fox keeps you mentally alive. It may be a self-indulgent concern, but I fear ceasing to think. Even the greatest hedgehogs, it seems to me, have stopped their quest for knowledge. They already know, and know that they know, and are done.

I’ll also say one thing against foxes. At least in folklore, a fox is a solitary hunter. What if you also like people and feel loyalty to groups of peers who share goals and missions? Then you cannot simply act like a fox.

To switch metaphors, Keats admired the “quality” that forms a “Man of Achievement especially in Literature and which Shakespeare possessed so enormously—I mean Negative Capability, that is when man is capable of being in uncertainties, Mysteries, doubts, without any irritable reaching after fact and reason.” I also admire Negative Capability, but it is a virtue of the poet, not the ally. Negative Capability is good for writing fiction that explores many different perspectives; it is not so helpful for co-writing a mission statement for an organization and then following through.

So I would like to be a fox who is helpful in a pack. The question is to what degree that’s possible.

See also: the politics of negative capability; loyalty in intellectual work; in defense of Isaiah Berlin; structured moral pluralism (a proposal); and Tolstoy, Shakespeare, Orwell

Mike Kelley, Jim Shaw, and memories of Rust Belt adolescence

Mike Kelley and Jim Shaw were born in urban Michigan during the 1950s. By the time they were art students in the early 1970s, they’d seen all the stuff you don’t study in a University of Michigan classroom: doodles drawn in ball-point pen on lined paper while the teacher isn’t looking, fundamentalist tracts, album covers, semi-professional local ads, cable-access shows, comics, sci-fi paperbacks, D&D manuals, second-hand children’s book covers, toy packages from the dime store, pinups, and posters for high school plays. They collected that material, imitated it, and mashed it together in their gallery art and for the stage performances of their punk band Destroy All Monsters.

Their two-man gallery show, “Michigan Stories: Mike Kelley and Jim Shaw” (MSU Broad Museum), evokes the claustrophobia of adolescence, when you realize that you’re being raised to play a role in a society that you don’t much like. All those adults coming at you to tell you what to believe and do feel like monsters from a late-night horror film. Kelley, who died in 2012, was explicit that his adolescence was miserable.

That wasn’t my life. I was raised in a protective family, encouraged to explore a wide range of paths, and basically in love with the world. Yet I can summon memories of, say, junior high school in Syracuse, NY circa 1980 that powerfully evoke Kelley and Shaw. I was a half-generation younger, so when they were blasting their “noise rock,” I was afraid of big kids like them. But the graphic art and music of their cohort formed the backdrop for us early Gen-Xers.

For me, these guys evoke something more specific than perennial adolescent claustrophobia. They witnessed the particular disappointment of Rust Belt America when manufacturing crashed and the postwar promise turned out to be hollow. Black people and women captured some power in cities like Detroit and Syracuse, and everyone got permission to be a bit more free–just as capital and opportunity drained away. Kelley and Shaw were white boys watching a society that seemed unfair or cruel to others and pretty hollow for people like them. Out of that experience, they made some powerful visual art.

See also: Detroit and the temptation of ruin.