Monthly Archives: October 2010

the youth vote in 2010: what would success look like?

Cathy Cohen, a distinguished political scientist at the University of Chicago and the PI of the Black Youth Project, writes:

    When record numbers of young African Americans turned out to vote for Barack Obama nearly two years ago, political pundits predicted the start of an important and positive trend. Socially marginalized young blacks buoyed by the election of the nation’s first black president would supposedly begin to see themselves as newly politically empowered and engaged. …

    So how is it that heading toward midterm elections in November, large percentages of black people ages 16 to 25 continue to feel alienated from mainstream American society and contemplating not who to vote for but whether to bother voting at all? … The bottom line is that we’re going to see lower turnout among young people next month, and we’ll see even substantially lower turnout among young black people.

She concludes: “If these young people don’t come out to vote, the Democratic Party will have only itself to blame. Instead of harnessing the energy of young voters across the board, particularly black ones, and nurturing their political momentum, President Obama and his party ignored them once the election was over.”

As we approach the 2010 election, one way to think about youth participation is by looking at the trend in previous midterm elections.

The last midterm vote was in 2006. That year, youth turnout reached 25.5%, up for the second cycle in a row. (Young African Americans, by the way, slightly beat young whites and all other ethnic groups in turnout that year.) So we could set 25% as the baseline for youth turnout and declare success if 2010 sets a higher mark, which remains possible. If we keep seeing upward progress of, say, two points every four years, then we could reach 50% turnout by the year 2060. (Although I will be 93 then if I’m lucky enough to live so long.)
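The arithmetic behind that extrapolation can be checked with a quick back-of-the-envelope sketch. (The function name and the assumption of a steady two-point gain every four years are mine, purely for illustration.)

```python
# Back-of-envelope projection from the figures above: midterm youth
# turnout of 25.5% in 2006, rising two points each four-year cycle.
def year_turnout_reaches(target, start_year=2006, start_rate=25.5,
                         gain=2.0, cycle=4):
    """Return the first election year in which turnout meets the target."""
    year, rate = start_year, start_rate
    while rate < target:
        year += cycle
        rate += gain
    return year

print(year_turnout_reaches(50))  # 2058 -- i.e., around the year 2060
```

At that pace, the 50% mark arrives in the late 2050s, which is why the two-points-per-cycle scenario puts it roughly at 2060.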

Another way to look at this issue is the way Cathy does. We should have much better turnout. In the 1800s, more than 90% of eligible voters outside the South voted in some years, a rate that remains common in many other democracies. What would it take to move us from our current rate to an acceptable one? I’d advocate for political reform, but some kind of transformational event could also help.

Now imagine that a presidential candidate recognizes the potential of a diverse and energized young population to exercise political power. He develops a powerful bond with them and offers them exciting ways to engage with his campaign. More than half of eligible young voters turn out, and two thirds of those vote for him. He wins, thanks, in significant measure, to their support.

This sounds like the basis of a generational transformation. And yet two years later, we are asking whether youth turnout will rise to 27% or fall to 22% or 23%. It is not too soon to ask Cathy Cohen’s critical questions about the responsibility of our political leaders for missing a remarkable opportunity.

notes on a developmental ethic

We are morally obliged to treat our fellow human beings and their communities as subjects in development. In this post, I take a stab at defining “development” and a developmental ethic.

Any theory of development expects constant change, as opposed to a theory that assumes stability, ignores the dimension of time, or overlooks the potential for bad things to improve. Development is not random change, but it also is not fully pre-determined and predictable. When something develops, we can say that changes occur because the object is moving toward some kind of objective or end, whereas ordinary physical objects change only as a consequence of what is done to them.

In the case of a biological organism, we are able to talk about “development” and change toward objectives or ends because the physical structure of the object includes guidelines for its own growth and transformation (mostly encoded in its genome). In contrast, we would not say that a mountain “develops,” even though it changes, because there is no design encoded in it that makes it change in a particular direction.

Human beings’ development is more complex, because we are able to reflect on our own trajectories and strive to change them. Not every influence is random or encoded in our genes. On the other hand, we are never fully free, because our developmental course up until the present influences our efforts to change. Thus development can involve intentions and self-consciousness, but it is not simply a matter of choice.

Nor do human beings pass through automatic “stages.”* Personal decisions and external events and opportunities disrupt the standard progression and can produce wide variation. Nevertheless, some phases or periods of development are encoded genetically, or are deeply embedded in our cultural traditions, or come logically before or after other phases. For example, there are important reasons that individuals typically babble before they talk, learn to read before they become sexually active, attend school before they vote, fill the roles of children before they are parents, and hold jobs before they retire. Some of these sequences are biologically necessary; others are wise conventions; and some might be mistakes. To think developmentally is to pay attention to the typical (and atypical) sequences and the timing of opportunities and experiences. The usual course of human development is open to critique but it cannot simply be ignored.

The same is true for communities, institutions, and other groups of human beings. Like individuals, they have developmental trajectories that are shaped by their initial designs, constrained by logic, affected by random events, and yet susceptible to deliberate alteration by the group itself. Sun Belt boom towns are at a different stage of development from Rust Belt inner cities. The government of the United States has developed rapidly since the ratification of the Constitution and cannot now reverse course. To think developmentally about a community is to take its past seriously and not to imagine that it can simply start over from scratch, but also to recognize the potential for deliberate change.

An ethic of development, then, is a particular way of making judgments and intervening in the world. It does not presume that every person, community, or institution has equal merit and virtue: some are better than others. But if we think developmentally, we are alert to the ways that the past has shaped each one’s present, the limits of choice, and the potential for any person or community to change for better or worse.

For example, I know college professors who are offended that their students are relatively superficial and undisciplined thinkers. That perspective fails to view students as individuals in development; their thinking will change rapidly. On the other hand, if you are a college teacher who simply accepts and expects immature thinking from your students, you are not contributing to their development. If you try to force them to think better, you ignore human beings’ inevitable responsibility for their own development. But if you leave them to do and think whatever they want, you forget that healthy development requires guidance and support. If you treat a 12-year-old just like your college students, you are unreasonable. But if you try to shepherd a fellow adult intellectually, as you would your own students, you misunderstand your limited rights and responsibilities for other people’s development. In short, thinking developmentally is not easy—it raises a host of empirical, strategic, and ethical questions—but it is indispensable.

*Important stage theories have been presented by Sigmund Freud, Jean Piaget, Erik Erikson, Lawrence Kohlberg, and others. These theories offer important insights, but I am persuaded by a general critique. The idea of stages makes the developmental process seem internally regulated and automatic except in exceptional circumstances. That is plausible for language acquisition but not for civic or moral identity after early childhood. Development is a complex and variable interaction between the organism, its own norms, prevailing external norms, and other aspects of the environment.

young campaign volunteers in 2008: the numbers

In 2008, for the first time in history, more young people than older people said that they had volunteered for a campaign. That tells an important story about how the Obama campaign in particular–and perhaps other political campaigns as well–engaged young people. The 2008 election was also a much more inclusive one than we had seen for some time, based on the proportion of Americans who said they had “done any work for a party or candidate.”

On the other hand, the long-term trend is a decline in political volunteering, as campaigns have evolved from broad, grassroots, labor-intensive efforts requiring many willing volunteers to highly professionalized enterprises driven by fundraisers, media consultants, and pollsters. Politicians are now more dependent on donors and less reliant on popular support. A very important question is whether 2004-8 was a blip or the beginning of an upward trend. (Source: American National Election Studies, analyzed by me.)

the rise of an expert class and its implications for democracy

Civil society is increasingly dominated by people who have received relevant professional training or who officially represent firms and other organizations. In local discussions about schools, for example, a significant proportion of the participants may hold degrees in education, law, or a social science discipline or represent the school system, the teachers’ union, or specific companies and interest groups.

Such people can contribute valuable sophistication and expertise. But if my arguments here are correct, we should not be satisfied with public discourse that is merely technical or that reflects negotiations among professional representatives of interest groups. We should want broad deliberations, rooted in everyday experience, drawing on personal experience and values as well as facts and interests, and resistant to the generalizations of both professionals and ideologues.

Technically trained professionals already intervened powerfully in public policy and institutions a century ago. The proportion of professionals in the United States doubled between 1870 and 1890, as society became more complex and urbanized and scientific methods proved their value. More than 200 learned societies were founded in those same two decades, and professionals increasingly specialized; physicians, for example, split into distinct specialties in that period. The historian Robert L. Buroker deftly describes the implications for politics and civic life: “By 1900 a social class based on specialized expertise had become numerous and influential enough to come into its own as a political force. Educated to provide rational answers to specific problems and oriented by training if not by inclination toward public service, they sensed their own stake in the stability of the new society, which increasingly depended upon their skills.” At best, they offered effective solutions to grave social problems. At worst, they arrogantly tried to suppress other views. For instance, the American Political Science Association’s Committee of Seven argued in 1914 that citizens “should learn humility in the face of expertise.”

One of the great issues of the day became the proper roles of expertise, specialization, science, and professionalism in a democracy. The great German sociologist Max Weber interpreted modernity as a profound and unstoppable shift toward scientific reasoning, specialization, and division of labor. One of Weber’s most prominent students, Robert Michels, introduced the Iron Law of Oligarchy, according to which every organization–even a democratic workers’ party–would inevitably be taken over by a small group of especially committed, trained, and skillful leaders. In America, the columnist Walter Lippmann argued that ordinary citizens had been eclipsed because of science and mass communications and could, at most, render occasional judgments about a government of experts. Charles McCarthy, author of The Wisconsin Idea, asserted that the people could still rule through periodic elections, but expert managers should run the government in between. John Dewey and Jane Addams (in different ways) asserted that the lay public must and could regain its voice, but they struggled to explain how.

Thus the contours of the debate were established by 1910. If dominance by experts is a problem, it was already evident then. But even if the conceptual issue (the role of specialized expertise in a democracy) is the same today as it was in 1900, the sheer numbers are totally different. This is a case in which quantitative change makes a qualitative difference.

Just before the Second World War, the Census counted only one percent of Americans as “professional, technical, and kindred workers”: people who, according to Steven Brint’s definition, “earn[ed] at least a middling income from the application of a relatively complex body of knowledge.” This thin slice of the population was spread fairly evenly. There was usually at most one “professional” per household, and even in a neighborhood association or civic group, there might be just one physician, one lawyer, and one person with scientific training. Often these people (mostly men) had been socialized into an ethic of service. They had valuable specialized insights to offer, but they were obliged to collaborate with non-experts on an almost daily basis to get anything done. Without romanticizing the relationship between professionals and their fellow citizens, I would propose that the dialogue was close and reciprocal.

Today, in contrast, there are so many “professionals” (and they are so geographically concentrated) that particular neighborhoods, and even whole metropolitan areas, can be dominated by people who make a good living by applying specialized intellectual techniques. As holders of professional degrees, these people possess markers of high social status that were much more ambiguous a century ago, when gentlemen were still expected to pursue the liberal arts, and the professions still smacked slightly of trades. When wealthier and more influential communities are numerically dominated by people with strong and confident identities as experts, the nature of political conversation is bound to change.

In 1952, of all Americans who said that they had attended a “political meeting,” only about one quarter held managerial or professional jobs. Many more (41 percent) worked in other occupational categories: clerical, sales and service jobs, laborers and farmers. The rest were mostly female homemakers. In short, professionals and managers—people trained to provide specialized, rational answers to problems—were outnumbered three-to-one in the nation’s political meetings. By 2004, however, 44 percent of people who attended political meetings worked in managerial or professional occupations, and 48.5 percent held other jobs. The ratio nationally was now almost even, and professionals were the dominant group in affluent communities.

These are crude categories that do not tell us how people talk in meetings. A clerical worker could argue like a technocrat; a physician could tell rich, personal stories, laden with values. But I think the increasing proportion of professionals and managers in our meetings tells a story about a society dominated by people with specialized training and expertise.

Theda Skocpol notes that traditional fraternal associations like the Lions and the Elks, which once gathered people at the local level who were diverse in terms of class and occupation (although segregated by race and gender), have lost their college-educated members. But non-college-educated and working-class people remain just as likely to join these groups. It is not so much that working-class people have left civic groups as that professionals have left them–moving from economically diverse local associations to specialized organizations for their own professions and industries.

The proportion of all Americans who are professionals or managers has roughly doubled since the 1950s. That is a benign shift in our workforce, reflecting better education and more interesting jobs. It largely explains why highly educated specialists have become more numerous in meetings. They bring sophistication and expertise to community affairs. Still, two thirds of people do not classify themselves as professionals or managers, and it is important for their values and interests to be represented. The steep decline in traditional civil society leaves them poorly represented, to their cost and to the detriment of public deliberation.

[works cited here: Burton Bledstein, The Culture of Professionalism (New York, 1976), pp. 84-6; Robert L. Buroker, “From Voluntary Association to Welfare State: The Illinois Immigrants’ Protective League, 1908-1926,” The Journal of American History, vol. 58, no. 3 (Dec. 1971), p. 652; APSA Committee of Seven (1914), p. 263, quoted in Stephen T. Leonard, “‘Pure Futility and Waste’: Academic Political Science and Civic Education,” PSOnline (December 1999); Steven Brint, In an Age of Experts: The Changing Role of Professionals in Politics and Public Life (Princeton: Princeton University Press, 1994), p. 3; Theda Skocpol, Diminished Democracy: From Membership to Management in American Civic Life (Norman, OK: University of Oklahoma Press, 2003), pp. 186-7. Statistics on political meeting participation are my own results from the American National Election Studies.]

a grand bargain on voting rules

For what other activity would you be required to register and then wait more than a month before actually doing the thing? Today is the last day to register to vote if you live in 18 states and the District of Columbia.* The actual election is in November. In most states, you may only vote within a limited span of hours, at one particular site in your neighborhood.

I mention this because I happened to hear the host and callers on local Boston conservative talk radio expressing astonishment that you don’t have to show a photo ID to vote. The tenor of the discussion was a series of rhetorical questions: Would you be able to take money out of a bank without ID? Would you be able to check into a hotel?

Well, maybe: ATMs don’t require photo ID, although they do take and store your picture. But certainly there are ATMs all over the place, open 24/7, and ready to use as soon as you put money in the bank. Voting in Massachusetts is possibly easier than other transactions in one respect (no ID is required)–but it is far more difficult in other ways.

I am not personally concerned about voter fraud in the form of people pretending to be other people at the polling place. Doing so would risk a felony conviction, and for what? A single extra vote for the candidate you prefer hardly seems worth the risk. Lori Minnite found no evidence that it happens.

That said, I’d be willing to enter a grand bargain with the folks I heard on talk radio. Let’s make voting really like a secure financial transaction. You’d have to prove who you were, but you could vote at your convenience, 24/7, with no pre-registration or re-registration when you moved.

In fact, this is roughly the deal in some jurisdictions. Twenty-six states offer unrestricted absentee voting. Thirty-one states permit in-person early voting. And nine states offer Election Day registration. Several of these reforms have been found to raise turnout, especially same-day registration. See our fact sheet for a summary.

*Arkansas, Arizona, Colorado, DC, Florida, Georgia, Hawaii, Indiana, Kentucky, Louisiana, Michigan, Montana, Ohio, Pennsylvania, Tennessee, Texas, Utah, Washington, Wyoming.