There were people (may still be) who lived under what might be called a flat management structure: there wasn’t much difference in the day-to-day lives of any two adult members of the group. I’m thinking of, for example, the Moriori as described by Diamond, or, really, most hunter-gatherer tribes. There may be some division of labor – men do most of the hunting, women stick to gathering – and the occasional official – medicine man, chief – who might have duties that free him, to some degree, from the hunting and gathering. But, in general, there is little material distinction in the lives of any two members of the tribe.
This flat structure makes it possible for everyone to be a carrier of the complete culture, more or less. Any man, woman or child could probably explain, for all daily purposes, the lore, traditions, taboos and mores as well as the next member could, with age perhaps allowing more depth of understanding. With the flat structure comes a degree of homogeneity in cultural understanding among the people.
Once you start farming, things get more complicated. The division of labor soon becomes more extreme and exclusive. The need for craftsmen, soldiers, and, ultimately, government creates groups of people whose daily lives are indeed much different from those of people in other groups. A farmer, a blacksmith and their lord pursue different activities, socialize differently, and might even follow much different rules of behavior – a milkmaid might not need to know how to perform a courtly curtsey; a plowman might never handle a broadsword. Perhaps only a tiny class knows how to read and do figures, or fight from horseback, or a dozen other things.
The rough homogeneity of culture seen in hunter-gatherer groups disappears with growing specialization, not to mention fragmentation into classes, specialists, guilds and other groups within which exist different expectations and traditions. In the extreme – here, now, for example – groups within a nation hardly share a culture at all. At best, we retain (most of) a language and some norms for social interactions. Are these enough to say we share a culture?
Nonetheless, specialization has been an indispensable aid to making life as materially good and pleasant as we now find it. Indoor plumbing and hot running water, not to mention penicillin, refrigeration, vaccines and so on, are not luxuries I’m prepared to give up at the moment. Such specialization, even if considered apart from class differences, does fragment the culture to some degree. There are scientific (and pseudoscientific) subcultures with rules and expectations that non-scientists don’t need to concern themselves with very often. In many fields, day-to-day interactions take place in a jargon that might as well be another language for all an outsider can make of it.
How is a culture passed on, given that no one can be a master in all fields needed for a complex and living culture? It must be passed on in pieces, so that some piece – the art of bricklaying, for example – gets handed down one way, while the art of writing a good essay gets handed down to a mostly different group some other way. This is not a hypothetical; it is an issue that has been with us for thousands of years. Here I want to talk about one aspect only: how the intellectual and artistic achievements of a culture are passed on.
First, I will assert that, while many people in a society might see no value in art and literature and philosophy, the culture as a whole is healthier when there is a strong intellectual tradition within it. (Not too controversial, I trust.) It is a better place to live even for those who have no interest in it. Admirable cultures of any sophistication have well-established means of passing such intellectual culture down. In ours, until recently, that means was colleges and universities.
Thinking of colleges and universities as existing to serve a societal role in passing along an intellectual tradition would have made perfect sense to a Harvard grad 200 years ago; it is perfect nonsense to the people who think everyone should go to college for free. That would be like saying everybody should learn bricklaying at public expense. At least since the 1940s, a college education has been viewed primarily as a meal ticket. (1) Depriving someone of a degree came to be seen as an act of Oppression. Rather than having a somewhat self-selecting group of people, maybe 10% of the population (2), pursue a ‘useless’ degree in Liberal Arts for the acknowledged goal of keeping an intellectual tradition alive, we think it good to try to send a majority of kids to college.
The problem is that a majority of kids, if they are clear-headed enough to want anything from college, want a job. This has become so ingrained that a grad with a [fill-in-the-blank] studies degree thinks she ought to have a job, and that somebody somewhere is doing something wrong so that she doesn’t have it (3), and the bank completely unreasonably wants its money back!
In this context, it’s a weird and inconsistent fact that I got a Master’s in International Business and Finance from a University. Business degrees are vocational training, no different in essence from learning to weld, lay bricks, file papers or prosecute a case in court. Why are such things taught alongside Liberal Arts (properly considered, not just a euphemistic catch-all for ‘things that can’t get you a job’) and given the same or higher level of honor?
I’ve long pushed on this blog for a bifurcation of college education, separating those fields which depend on objective evidence for their validation from those that don’t. Thus, engineering, math, medicine, accounting, chemistry, business and so on would be taught in the ‘real-world’ schools – and get you jobs. Women’s Studies, Sociology, Psychology, Comp Lit, Creative Writing and so on would be taught in faerieland, and not get you a job. Expectations would be forcefully adjusted accordingly.
But this, while gratifying to contemplate, doesn’t solve the challenge of passing on an intellectual tradition. For that, we’d need, frankly, Great Books schools, as well as various institutions passing on the arts. After one has made the acquaintance of the Western intellectual tradition, one would be free to go Vo-tech (4) or even faerieland, if one wished (just don’t expect to suck at the public teat if you do!).
So now we’ve provided for the handing on of an intellectual tradition, and for training people for jobs (and for identifying the people to avoid at parties). What’s left is finding a way to pass on a baseline culture, to make sure as much as possible that the chemist can talk civilly to the clerk, and the auto mechanic to the lawyer. The ancient Greeks (can’t keep them out of an argument forever) had ephebia – schools that, initially, were for training young men to be soldiers. When a boy turned 17 or 18, he was expected to spend a year or two at this specialized school. Over time, the ephebia became more of a tool for expressly passing on culture, so that Alexander the Great set them up in all the little ‘Alexandrias’ he founded – and allowed the non-Greek natives to attend. This practice continued for centuries, and is part of the background to the books of the Maccabees – what Judas Maccabee and his team are fighting against is the efforts of their Greekified conquerors to inflict Greek culture on them – via, largely, the ephebia (with its naked gym class and all that).
The reason the ephebia persisted for centuries is that they worked. A core of men, who like modern college grads identified themselves with their graduating class, would grow up and gain power together, always sharing an idealized Greek view of the world. They would nurture the following classes, and send their boys – and promising non-Greek boys – in their turn.
My ideal education system might consist of the following stages:
A. Age 0 – 14, 15, 16: Leave them the hell alone. If they want to do sports or take music or just hang out during the day, cool, let’s do that. As for reading, writing and math – any competent adult can show 95% of kids how to do it once they’re ready to learn, in a tiny fraction of the time the schools waste on it. The myth that a professional educator is REQUIRED for these ages is exploded once you dip your toe into history.
B. Once they take an interest in joining the adult world, let them take whatever they want. We ran this experiment on our own kids, and guess what? They all started taking classes at the local community college by age 14 or so, and, except for the 12-year-old – a little early still – all got into great colleges (hey, we want to be part of that 1% that preserves the culture/saves Western Civilization) with no insurmountable problems.
C. The only mandatory schooling would be two years of an American ephebia, which is pretty much anti-school as it is now practiced: learn about America, how it works, and why you should love it and keep it.
D. Vo-tech all around! Do it now, do it later, just do it.
There, problem solved!
(1) I’m one of ‘them’ – the scion of a family with no intellectual ambitions, but with plenty of jonesing for a better economic future. I’m still a little amazed at my dad’s enthusiasm for St. John’s Great Books Program. It worked out well, financially, but how he thought he could see that escapes me. Heck, *I* could barely see it.
(2) Of which maybe 10% might actually become Guardians. We probably have to train 10% to get that 1% of the population who will understand and be willing to defend our culture.
(3) The colleges do their best, hiring scads of otherwise worthless intellectuals to teach the next batch of marginal students, thus rendering them unemployable in their turn.
(4) Or before – there’s no reason a 15- or 16-year-old should not be trained in whatever they want to do. Delaying such training until after age 18 or 22 is just silly.