Potential employers of young workers tend to value degrees (high school and college), courses, majors and grades, and previous jobs. Those are all experiences that a person completes, rather than direct evidence of one’s capacity to do a given task. In a tight market, employers can raise the bar, so that, for example, 65% of new openings for executive assistants and executive secretaries now require a BA, even though only 19% of the people who currently hold such jobs have college degrees. This is bad news for the two-thirds of young people who have not attained at least a bachelor’s degree. It’s also a potential loss for employers, who may be missing the people who would do the best work but haven’t accumulated the most highly valued experiences. One widely promoted solution, getting everyone through college, seems both unrealistic and needlessly expensive.
Why do employers use college degrees and other major experiences to select employees? I would propose these explanations:
- Employers are actually looking for concrete skills, such as the ability to write a coherent memo or schedule a meeting. In the absence of direct measures of such skills, they use college completion as a rough proxy. That is unfortunate because it isn’t a precise measure, it can discriminate on the basis of social class, and it drives colleges and universities to advertise highly concrete skills as their outcomes even though that distorts their real educational mission.
- Employers have a more general theory of “merit,” which may encompass broad literacy and numeracy, self-discipline, social skills, critical thinking, etc. Employers may feel, furthermore, that the competition to enter and then complete a selective college is a measure of such merit. They need not believe that merit is innate to use it as a selection criterion. Success in school could result from investments in the home, community, school, and college, and still be an indicator of value for an enterprise. The content of the education may be fairly unimportant to the employers; the point is that school/college is a difficult competition, and those who get through it are more likely to contribute to their enterprise. By the way, if this theory applies, then getting everyone through college would only move the goalposts; employers would start looking for graduate degrees. The idea of general “merit,” however, is highly problematic: not only morally, but also because completing a fancy college may not show that you are well suited to a particular job. Once again, the most truly qualified candidates may be left aside, which is a waste of human potential as well as an injustice.
- Employers may value the goods that liberal education explicitly promises: genuinely critical thinking, reflection on the good life, sensitivity to culture, truth about nature and humankind, etc. This explanation may apply in a few workplaces, but I must say I am cynical about it. Most organizations have fixed ends or objectives and are really only interested in critical reflection about means (if they’re open to criticism at all). But the heart of a liberal education is critical reflection about ends: about the nature of a good life and a good society. Starting with Socrates, some have concluded that if you are seriously critical of ends, then you must be independent of institutions, a stance that might make you a poor employee in most organizations. In the classic text that first defined “the liberal arts,” Seneca wrote, “I respect no study, and deem no study good, which results in money-making.” To be sure, Seneca was a slave-owning aristocrat who had plenty of money to start with. One can find a compromise between the highest goals of a liberal education and the need to put food on the table. But the two objectives at least seem in tension.
- A variant of the previous theory is that employers seek a certain kind of cultural capital that results from a liberal education. When they choose employees from colleges like the ones they attended, it’s not because they prefer radical thinkers. It’s because they want to work with people who know and appreciate the same body of culture. This variant of the theory requires less idealistic assumptions than #3, but I still doubt that it applies in most organizations (other than small white-collar enterprises).
To the extent that the first theory applies, it would make sense to measure a diversity of concrete skills, one at a time, and provide portable certificates for individuals who can demonstrate them. That would allow people who demonstrate a given skill to win relevant jobs even though they may not be on a path to college. It would allow employers to choose more appropriate workers, and perhaps with less invidious bias. It would allow colleges that actually teach specific skills to gain credit for doing so. It would make space for organizations like my own to certify concrete political and civic skills that might lead to jobs or leadership roles. At the same time, it would relieve colleges from having to sell their bachelor’s degrees as indicators of concrete skills. Instead, they could offer genuine liberal education.
I acknowledge the risk. If prospective students only care about jobs, or are forced by economic circumstances to put employment first, and if employers only care about concrete job skills, then organizations that teach and certify job skills could put the liberal arts (K-16) out of business. But I suspect the liberal arts would be better off claiming that they enhance the soul and the community, instead of living off an inefficiency in the labor market.