Halloween, Hieronymus Bosch & Ephesians

This morning, I had a discussion about slavery in the Roman Empire, triggered by the passage from the Epistle to the Ephesians read at Mass:

Slaves, be obedient to your human masters with fear and trembling,
in sincerity of heart, as to Christ,
not only when being watched, as currying favor,
but as slaves of Christ, doing the will of God from the heart,
willingly serving the Lord and not men,
knowing that each will be requited from the Lord
for whatever good he does, whether he is slave or free.
Masters, act in the same way towards them, and stop bullying,
knowing that both they and you have a Master in heaven
and that with him there is no partiality.

The usual commentary here goes something like: here is a revolutionary Christian proposal by Paul, that slaves, who have no rights under Roman law, must still be treated as brothers, and that masters will be judged by God on how well they treat them.

And, of course, this is true, and this new understanding in fact set the stage for the elimination of slavery wherever Christianity held sway. (Of course, given human nature, slavery pops right back up whenever we take our eye off the ball, but one or the other – slavery or Christianity – must prevail.)

I was making the point that understanding slavery under the Romans is a little tricky for Americans, as we have this history of racial slavery, where, because Americans were nominally Christians, they could not justify enslaving other men. Therefore, black Africans had to be thought of as less than men at least to some degree in order to keep the guilt and cognitive dissonance at bay.

The Romans, while as arrogant and bigoted as any conquerors, did not necessarily consider slaves as inferior men just from the fact of their slavery alone. A brave and noble man might just get unlucky, might be cursed by the gods, and simply be on the losing side of a war, and end up a slave through no real fault of his own. This is not to say that Romans didn’t look down on slaves, or treat them terribly – they did – but they did not imagine them a different, fundamentally inferior species. In general.

Also, a vast gulf existed between household slaves and agricultural slaves. Sometimes, free men would sell themselves into slavery to a patrician in order to have some hope of upward mobility – perhaps the nobleman had business interests that he might put the slave in charge of, if the slave proved himself dependable and talented. Then, if all went well, the slave could buy freedom for his children, or have it granted to them. At least, he probably wouldn’t starve in the meantime.

Agricultural slaves, on the other hand, seem to have been largely treated as animals. I have not run across any stories of agricultural slaves, who made up the vast bulk of slaves under the Empire, working their or their children’s way to freedom. But again, my reading in this area is slight.

Anyway, the only point, and it is a small one, is that it might have been less shocking to the Romans and their Greek subjects to hear that a slave must be treated as a brother than it would have been for a Southern slave owner. In fact, the American slave owner just refused to hear it.

And this discussion led, in that ineffable way my mind works, to consideration of Hieronymus Bosch, and why there are not more Halloween costumes and parties based on his works. The connection is that slavery isn’t the only thing about which the Romans thought very differently than we do. Their sense of honor doesn’t map exactly to ours, for one thing, and the same noble Roman who would die unflinching for his Republic most likely had a deep and abiding affection for scatological humor. It’s a mistake to think of them in our terms. They inhabited a very different emotional and esthetic universe, it seems.

Hieronymus Bosch inhabited another very weird universe, one that – thankfully, I think – is very different from ours. It’s not just that his work is bizarre and often obscene – that might just be a personal quirk – it’s that his work was enormously popular. For a century after his death, people came to admire it. There are hundreds of copies drawn, painted or sculpted from that time. His work was hung in public places for people to see, and people traveled to see it. People really dug this stuff.

So I hit the web. And the answer is that, first, there are plenty of Bosch-themed costumes out there, if Google Images is to be believed, and, second, that even a few parties along those lines have taken place. So, OK, even in these modern times Bosch has some appeal.

Then, for the first time in years, I looked, like really looked, at some Bosch.

Yikes.

 

And this is kinda tame. There’s stuff I won’t even put up here. 
The ice skate/funnel/red cape/yoyo combo really sets off the cross beak/letter look. 
As a costume, you’d need the right attitude to make that fish head with butterfly wings cape, sword and shield look work. 

And these are some of the less disturbing ones.

As Halloween costume inspirations, it seems to me Bosch would not be very appropriate, at least under what I hope are modern American sensibilities. As Catholics, we dress up as scary or even evil characters in order to mock them, to show them we no longer fear them. O Death, where is thy sting? after all. Bosch does seem to be mocking something. The mockery has a hard time cutting through the disturbing, at least for me.

Conclusion: the 15th and 16th century Netherlanders, and the Germans, Spaniards and Italians who admired and copied Bosch, did not look at the world the same way we do. At least most of us. And I’d frankly like to avoid the ones among us who do.

The Deus Vult Hymnal – part the second: Gather Us In

So far, two of my generous readers have made very worthy proposals for the next modern liturgical ditty to get the Deus Vult treatment: aetherfilledskyproductions nominates Lord of the Dance and Richard A thinks Gather Us In should be very firmly kicked, um, up a notch. I agree with both. We’ll start with Richard’s suggestion merely because my muse is more intrigued by it at the moment.

My analysis of Gather Us In can be found here. To sum up, a word-count exercise reveals:

Us, We, Our, Ours, People, Peoples: 30 instances.

God, Jesus, Lord (and related terms, such as Savior, King, etc.): 0 instances. There’s an implied God behind the gathering, and a couple of pronouns wander by, but overall there seems to be a quaint delicacy about just naming Him, let alone thanking and praising Him. Almost like we think Dad’s asleep and don’t want to wake him. He might come down and see what we’re up to.
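
For anyone who wants to check the tally against their own copy of the text, here is a minimal sketch of the kind of count I did by eye. The word lists are just the categories above, and the lyrics placeholder is exactly that – a placeholder; paste the actual text in yourself.

```python
# Minimal sketch of the word-count above, not the original tally.
# The word lists below are illustrative assumptions; paste the actual
# hymn text into `lyrics` to run the count yourself.
import re
from collections import Counter

lyrics = """(paste the text of the hymn here)"""

us_words = {"us", "we", "our", "ours", "people", "peoples"}
god_words = {"god", "jesus", "lord", "christ", "savior", "king"}

# Lowercase the text and split it into words.
words = Counter(re.findall(r"[a-z']+", lyrics.lower()))

print("Us/We/Our/People words:", sum(words[w] for w in us_words))
print("God/Jesus/Lord words:  ", sum(words[w] for w in god_words))
```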

We also have here the subtle despair of low expectations. Pick any beloved old hymn and  the poetry paints a vivid, concrete picture and puts us in it. Here? Vague feelz, curious circumlocutions, a general resistance to saying clearly what we are doing. Space and Place instead of Church; Awaken and arise to the sound of our names – what? See, the temptation for those of us paying attention is to provide logical backfill so that the lines say something – “I guess he means…” But really, it’s a hymn, not a reading comprehension test.

Must say, this might be the perfect song for the Deus Vult treatment, as it wants to gather us, just not into a holy army; it wants to change us, but not into Soldiers for Christ; it wants us to awaken, but not to repentance and holy fear of the Lord.

We can fix that.

Deus Vult 6
Too subtle? Goodness knows we don’t want to be too subtle here.


The Deus Vult Hymnal – part the first

Enough is enough. Steps must be taken. For too long have we put up with wretched modern hymns, whining, sure, but doing nothing.

I here propose that we – I and anybody else who wants to play – Do Something. How about we take specific egregious hymns and through the magic of Scripture and doctrine married to classic hymn structures, write something that answers and corrects the pablum and heresy? Line by line, verse by verse, with references, we answer the drivel, the incoherences, the feels and, yes, the occasional overt heresy, with lines that mean something, advance the faith, and even rhyme!

We must for now set aside the ear-achingly bad music and focus on the texts. However numbingly awful modern church ditties may be, bad music most of the time merely insults taste and decorum, not Divine Truth directly. Besides, if we write our texts in any one of dozens of perfectly nice hymn formats, any number of existing tunes should fit them suitably.

Deus Vult 1
Just spitballing here…

We shall call this little exercise the Deus Vult Hymnal. Just because.

Deus Vult 2
Maybe this?

Today’s execrable hymn, crying out to heaven for rebuttal if not vengeance, is the little Haugen tune All Are Welcome. Here, Mr. Haugen tries his musical hand at reinventing the Lutheran hymn, and the results are really not bad – musically speaking. The text, however, is a sort of motte and bailey: if we object to singing the Hegelian We Are Church Spirit into existence, we can be accused of objecting to being welcoming. Surely, the very least love of neighbor requires is welcome!


More Polanyi: Mysticism & Fantasy

Part II of my review of this book.

I’m well into the second half of Polanyi’s Great Transformation, and, while I’m getting a crash course in 18th- and 19th-century English history through looking up all his references to events and people I’ve never heard of or that are just names to me, tedium is setting in. Late last night, while plowing through a few pages, I broke down and did something I almost never do and advise against doing until after you’ve read the book for yourself: I looked at what other people say about this work. Read what the authors themselves say as much as possible, to avoid the inevitable biases and lacunae that predigested takes contain by their nature. In my frustration, curiosity about who, if anyone, takes Polanyi seriously got the better of me. Yes, I am weak.

Criticism fell into two distinct groups, with no one in the middle: Marxist critical theorists who love, love, love Great Transformation and consider it the seminal work on economics of the last 100 years, and non-Marxist economists – real economists, in other words – who would hurt themselves if they rolled their eyes any harder.

Some kula in a museum. Subsistence farmers on remote islands make and trade these as part of a complex social ritual intended to reinforce social ties and thus avoid war. When all you got is yams, fish, palm fronds, and no realistic hope for anything more, the perennial human hobbies of sex and murder come to dominate your thoughts and rituals. Even more, I mean. 

The criticisms I laid down in my preliminary comments here and here were echoed and reinforced by his negative critics. For example, one critic makes a point Chesterton also made a couple of times in other contexts: primitive peoples alive today are not our ancestors. Rather, they are as much modern people as we are, except that for whatever reasons they have not made much technological or cultural progress. While our actual European ancestors were inventing science and technology and cities and architecture and experimenting with complex social relationships, the Trobriand Islanders were cultivating yams and developing ritual trading designed to reinforce social relationships to keep the peace.

To point to tribal peoples living today as examples of man in nature is to ignore that our actual ancestors, who did develop what eventually became the modern world, were every bit as natural in the sense ‘natural’ is used here. Our actual ancestors, despite what Rousseau may think, were also natural men who did whatever they did by nature – they eventually developed the gold standard and international trade just as naturally as islanders grow yams and murder each other.  A ‘primitive’ Italian like Marco Polo, for example, clearly did engage in international truck and barter – around the time the Trobriand Islanders first arrived in their little paradise and started building grass huts. Polo is an ancestor to the West. The islanders are not.

Enough. Returning to my reading, here is a paragraph from the second half of the book I find quite revealing of how Polanyi thinks:

Let us return to what we have called the double movement. It can be personified as the action of two organizing principles in society, each of them setting itself specific institutional aims, having the support of definite social forces and using its own distinctive methods. The one was the principle of economic liberalism, aiming at the establishment of a self-regulating market, relying on the support of the trading classes, and using largely laissez-faire and free trade as its methods; the other was the principle of social protection aiming at the conservation of man and nature as well as productive organization, relying on the varying support of those most immediately affected by the deleterious action of the market—primarily, but not exclusively, the working and the landed classes—and using protective legislation, restrictive associations, and other instruments of intervention as its methods.

Notice anything odd? How about the odd use of the word ‘personified’? Polanyi is here saying that two competing ‘organizing principles’ are – persons?

It would be easy to explain this away as a little goof in the midst of a long book, something a good editor maybe should have caught, but clearly I don’t think so. I think that this personification of abstract forces is exactly what this book is about. The individual is nothing, the masses everything, after all. And the masses are a seething, suffering – abstraction.

To Polanyi, great lumbering forces, abstractions that manifest themselves in Capital, or the Gold Standard, or the Labor Market, are the persons of History, while people are, at best, just the raw material History acts upon. These persons, these gods-who-are-not-gods, correspond to Hegel’s Spirit, in that History is not made from an accumulation of millions of little decisions by millions of little people; rather, History acts upon the little people, with their decisions merely reflecting the gradual expression of Historical forces.

History, then, is always inevitable, even if we can’t see it until our illusory choices have slipped into the past. Marx’s claim to see the future is a claim that History is as deterministic as a wind-up clock. In 3 hours it will be 5:45; in the fullness of time it will be the Worker’s Paradise.

Hidden here is the perennial bait and switch, or perhaps motte and bailey: our sympathies are engaged by the very real suffering (usually) of the Little People, but the analysis and proposed solutions are always about presumed inevitable forces. The Polanyis of the world flip from one to the other with greater or lesser skill: questions about the framework are answered by implied or, increasingly, shrill accusations that you don’t care about the little people; focus on practical steps directed at the little people, get reminded that it’s the system, man.

I’ll try to get this finished off and post a final review soon.

Wednesday Flotsam, Including Science!

A. Stray thought: there is evidence that we’re in a Golden Age in at least some fields, and not just the obvious technological ones. Besides, we’re so close to the birth of technical fields such as computer and materials science that calling this a Golden Age in that respect seems too early to mean much. No, I mean areas old enough to have gone through a few boom/bust cycles. There seem to be an awful lot of people, many of them young or youngish masters, doing very impressive work across a number of fields.

Further, while Golden Ages seem to feature a disproportionate number of true masters, I would think that the true measure isn’t the individual genius (who can pop up anywhere at any time, it seems) but rather the number of competent to great practitioners beneath them. For example, the 16th and early 17th centuries were the Golden Age of polyphony. Everybody who knows anything about that era can name Palestrina, de Lasso, Victoria, Byrd. But what’s astounding is that the few times I’ve gotten to perform stuff written by the supposed 2nd-tier guys, I’ve been blown away. There seems to have been a lot of great music written back then, for which Palestrina in particular has been chosen as the poster boy, with everybody else getting at most an ‘oh, yeah, him too’ reference. (1)

Years ago, I listened to an interview with some pianists involved in competitions. Turns out there’s a general consensus that there are more fabulous piano players alive now than at any time in history. It used to be that, for example, Rachmaninov’s piano concertos were the peak of the virtuoso mountain, attempted by only the best of the best. Today, there are thousands of 15-year-olds around the world knocking them off. It’s gotten to the point where, in competitions, a single mistake will disqualify you; as one young pianist said, it’s like they’re judging your soul. Technical perfection is simply the price of admission.

Sticking with music, Rick Beato, an accomplished musician with a YouTube channel I follow, mentions in one of his videos the growing number of utterly excellent guitar players out there today. He tells a story familiar to any of us older guys, say, over 50: when we were young, a new song would come out and you’d throw the vinyl record on the turntable and wear it out while you figured out how to play it. A noble, useful exercise, but time-intensive and often frustrating.

Today? On YouTube, you can likely find a dozen videos of people, occasionally even the original performers, showing you how to play the song. Technical issues such as fingerings and voicings that are often difficult to pick up from the recording become clear. You still have to do the work, but it is so much faster and less frustrating to see it worked through, especially when you’re a relative beginner.

Same general principle holds for woodworking, blacksmithing and boat building, and no doubt a thousand other crafts and arts. There’s some normalish guy out there building a Bombay chest, a classic rapier or a cedar strip canoe right this minute, and his work will stand comparison to the best that’s out there.

But the real value is more subtle: you get to see normal people doing extraordinary things. A teenage girl will shred her way through Eruption or an arrangement of a Beethoven sonata; you can watch her hands and see how she’s doing it. Dude will show you how to do epic, Japanese-flavored woodworking projects in his home shop. A guy will build a 45′ steel boat in his front yard; a young couple will build a ketch from scratch with the intention of sailing the world. A 20-year-old kid will make swords that should be hanging in museums. And on and on.

There’s even a sort of Art Tatum of this crafty world, a guy whose patient perfectionism and awesome skills might intimidate you if he weren’t so matter-of-fact and charming: on his Clickspring channel he’s building a replica of the Antikythera Mechanism out of sheets of brass, often using tools he makes for the purpose. It must be seen to be believed.

Is it just that social media makes all this cool stuff easily seen, when in the past it was hidden from all but the most hardcore hobbyists? I don’t think that’s the whole story. Rather, I think there’s a general spreading of inspiration, that people everywhere are seeing that people just like them can do these incredible projects, and that some of those people then start incredible projects of their own.

I think this is a very good thing, if true. People with a sense of accomplishment are much less likely to get blown about in the winds of political and social fashion, it seems to me. They’re not looking, or at least are looking less, for that practical sense of meaning in their daily lives, because mastery of a craft already supplies it. People who finally master that instrument, build that boat, or finish that home addition are more likely to be stable, solid citizens.

Maybe. I could be delusionally optimistic. Wouldn’t be the first time.

B. In comments to  this article from the Medical Press, a nuclear physicist points out that even elite scientists often screw up their statistical analyses. In the initial paper, data was collected from a carefully selected representative sample of people who use statistics. Just kidding! I slay me! Using an ‘instrument’ of some kind, some college students were asked to solve questions where the solution required following 3 or more logical steps OR really knowing stat so that you could plug some numbers into standard stat formulas. There is an example in the initial paper.

Now, knowing quite a few people, including myself, I’d say the likelihood that a high percentage of any group of people who aren’t professional statisticians or logicians could solve such problems is slim to none. It would fall well below half. I’d expect single digits among, say, pedicurists, long-haul truckers and journalists – you know, fields where being able to follow 2 or more steps of logic isn’t a job requirement. No knock – if we don’t use it, our minds are pretty good at freeing up space for something else.

The purpose of the study was no doubt to obtain a stick with a patina of science on it with which to beat some target or other. Since the reason suggested – I hope you’re sitting down – is that statistics is taught very badly in schools, it looks like the target is people whose money the state wants for funding reliable statist voters, such as ‘educators’ and teachers. The state of eternal school reform must be maintained, while at the same time all who question why we would pay for something that has failed to work for 200+ years are automatically excommunicated.

I don’t think teaching will help much, unless lots of us peons somehow reach the conclusion that following logic very carefully is something we can’t live without. Until then, even ignoring any possible issues with native intelligence, people aren’t going to learn this. For those who might want to know how to think a little but for whom the schools have  succeeded in their stated goal of preventing just such thought, well, they could start with Dr. Briggs’ book.  In it, he shows convincingly that probability and statistics are branches of philosophy (and thus necessarily, of logic) with a little math attached. In other words, knowing what you are doing and what is possible comes first. Do that, and the math is either completely unnecessary or firmly secondary.

Statistics properly understood is both a powerful tool and a cautionary tale. As Briggs explains in his book, there are more interesting questions in this world about which statistics can tell us nothing than there are ones where statistics can give us great insight. I’d guess exactly 98.83% of all statistical analyses you’ll ever hear about are out and out nonsense, at least as presented. How the data is gathered, what the data even is, whether the subject matter even admits of numerical analysis – these are philosophical questions that get booted even before the perp gets a chance to screw up the math.

  1. And it’s even worse than that, in that about a half dozen works by Palestrina are well known among the tiny subset of people who care about this stuff. Palestrina and de Lasso were very prolific, writing hundreds of pieces over their careers. Palestrina also maintained legendarily high standards – all his work is good to epically great. It’s too much! So we get to hear the same small set of pieces repeatedly, and just take the experts’ word for it being representative. And that’s not even counting any of the other masters who you’ve never heard of and whose names I promptly forget.

Science & Humility

Stray thought:

In his classic 1974 Caltech commencement address, often referred to here, Richard Feynman states the following:

But there is one feature I notice that is generally missing in Cargo Cult Science.  That is the idea that we all hope you have learned in studying science in school—we never explicitly say what this is, but just hope that you catch on by all the examples of scientific investigation.  It is interesting, therefore, to bring it out now and speak of it explicitly.  It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards.  For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

Details that could throw doubt on your interpretation must be given, if you know them.  You must do the best you can—if you know anything at all wrong, or possibly wrong—to explain it.  If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it.  There is also a more subtle problem.  When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

In summary, the idea is to try to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.

This is beautiful. Such scientific honesty is also rare in the wild. I think it would be near extinct except for another feature of science, one Feynman does not here discuss but did in fact practice: everybody is entitled, and even honor-bound, to do their best to shoot down a theory. This probing and testing of theories to see if they can be overthrown is perhaps the only example of Nietzsche’s quip being true: that which fails to kill a theory makes it stronger.

This is not to say that proponents of theories like having others, often young punks, try to destroy their scientific babies. Far from it, which is how we get another famous quip, that science advances one funeral at a time. Often, only death is strong enough to loosen a partisan’s grip on a theory under attack. This observation may be generalized beyond science.

In one of the accounts he gives of the discoveries that led to his Nobel Prize, Feynman mentions that he spotted an error in previous work, where a no doubt respected physicist had proved his point by extrapolating beyond what the data could actually support. Feynman had to disprove and discard the conclusion of this prior physicist in order to establish the model that got him the Nobel.

It would be nice to hear how this other physicist took the news. Was he a good sport about it? In physics, this is at least possible, what with grad students and other young guns poring over every theory and calculation that came before, just looking for that spot where they can make a name for themselves. I assume, perhaps naively, that this sort of thing happens with enough regularity that physicists might not get too bent out of shape over it.

The history of science is full of cases where the fathers of displaced theories didn’t take it very well. In general, scientists are people: Galileo was an out-of-control and petty egomaniac; Newton was an insufferable snob. Boys will be boys. It is part of the glory of science that it makes progress even with the participants acting like prima donnas and toddlers. It must be something special, if it can work under those constraints.

The point here is that at least in the hard sciences, the honesty of which Feynman speaks is enforced by the competitive nature of elite scientists. On the one hand, they want their theories tested and challenged, because that means they are important and validated. On the other, they know they’ll likely get caught if they fudge, since that same army of ambitious grad students and young guns can make their own name by finding flaws.

The point I think Feynman is making is that everything goes smoother if the scientists keep all this in mind as they do their research. Don’t publish your pet theory until you’ve thought through and responded to every challenge you can come up with, and included all the steps needed for anybody else to validate your methods. Cut to the chase. Keep it clean.

What Feynman does not mention here in this speech nor anywhere else I can recall is humility. I’ve never heard, and find it hard to imagine, Feynman being accused of being humble.

Humility is what it takes. The proud are also the blind. The beginnings of modern science can be found in the medieval Questions method, best known from St. Thomas’s writings. A proposition was tossed out for discussion, and refined to its purest possible expression. The seminar, as we might call it today, batted it around, argued for and against, probably spun off other questions to be addressed. When done, their work was summarized in the usual form: It would seem that X, for reasons A, B & C. Arguments A, B, & C must be stated in a form that their champions agree is accurate. Then: On the contrary, it would seem not-X, for reasons D, E, & F. And objections A, B, & C can be answered as follows. Therefore, not-X is conditionally true, subject to further information and argument.

You see the superstructure of science right there: you must clearly state your proposition and what it is you think supports your argument (which, these days, includes observations made with the aid of gadgets and math). You must address all objections, which must be stated as strongly as possible and truly reflect the views of your opponents (or they will let you know about it). You must show your work. Everybody understands that tomorrow’s new observation might overturn the whole thing.

I propose that the problems in science, especially Science! as revealed in the so-called ‘replication crisis’, are in fact a failure of humility. The pride present in all of us is exacerbated by the softness of the science. A science is soft to the extent that peer-skepticism and adversarial review are not accepted and encouraged. Such skepticism and review keep science honest. Lacking them frees pride from all restraint.

I don’t imagine the students and masters practicing the Questions Method in 1250 were any more intrinsically humble and honest than we are today. But at least they would have understood that humility is both a prerequisite and a subset of honesty: you must be humble enough to accept you could be wrong; if you’re being honest, you will be humble. In the same way replication and criticism keep modern science however honest it may be, recognizing their own pride and need for humility helped keep the scholastics honest.

This recognition of the need to be humble is unknown among the leading lights of the soft sciences. Freud can respond to challenges to his teachings by asserting his critics are repressed, both applying the theory in question as if it had been established as correct and avoiding any response to the substance of the criticisms. Among practitioners of the soft sciences, there will never be any recognition that these ad hominem attacks are petty diversions. Just review the name-calling that often constituted the entire response of those challenged by the current ‘replication crisis’.

Hegel gave a patina of intellectual validity to the notion that reason is nothing and only revelation (“enlightenment”; “wokeness”) matters, so that the only response to critics is to point out how unenlightened they are. This thinking became the thinking of the academy. This is hardly surprising, as the key point of Harvard, et al., the belief that drove their founding and continues intact to this day, is that the right people need to be in charge. Only the little people, as Hegel pointed out, get all tangled up in logic and reason. True philosophers can be identified, functionally, as those who agree with Hegel.

Real scientists are like those adopted kids who don’t know who their parents are, but know they are orphans of a sort. Insofar as you are a true scientist as described above by Feynman, you will not find colleges in general to be your family. But science’s true parents – the scholastics and Aristotle – have been painted with such a black brush that scientists would be horrified to recognize them as mom and dad.

Reading Scripture

In the RCIA class I’m helping out with, we’ve been discussing Scripture and Tradition. One thing I didn’t get to mention, but I imagine might come up: what to do with those passages that contradict beliefs you hold near and dear, beliefs you hold to be *obviously* true? Examples that leap to mind include Old Testament passages where God orders the complete slaughter – men, women, children, livestock and, I imagine, pets – of Israel’s enemies, and Paul’s commands that women remain silent in Church and be subject to their husbands.

My advice, for what little it’s worth: fight off the temptation to explain these passages away. Instead, ask: how could this be true? What could it mean? What does it tell us about God and man? Is there a non-dismissive way it can be true?

Cranach, Adam and Eve, 1526. Adam, in a pose familiar to husbands, scratches his head when Eve offers him the apple. 

How could a loving God order His Chosen People to kill innocent children and non-combatant women? Clearly, I believe my notion of what constitutes a loving God precludes Him ordering his followers to kill innocent people. I personally would not obey such a command. I would assume such a command, if given to me in, what? a vision? a dream? could not be from God.

Still don’t exactly know what to make of this. The best explanations I’ve ever heard include the notion that this evil – the slaughter of whole villages and cities – was a result of Israel’s disobedience when they failed to promptly invade the Holy Land. Had they proceeded as ordered, the story goes, the inhabitants, faced with a huge army invading out of the desert, led by God Himself, would have fled – and lived. But once Israel futzed about and was unfaithful, the 40 years passed, after which the inhabitants had ceased to be terrified. So God now needed to purge the Promised Land of the abominations of the worshipers of the Baals – child sacrifice and sexual perversions being central features – to give His people any chance of remaining true. The slaughter becomes like the Mosaic permission of divorce: given because the people were weak, but not intended from the foundation of the world.

Intellectually, those are kind of OK. Not sure my heart agrees. Yet.

The ‘wives are subject to their husbands and should remain silent in church’ stuff used to bother me much more than it does now. If husbands are to be as Christ toward their wives, laying down their lives for them, then the whole ‘who’s in charge’ thing seems to pale to insignificance. Further, unsolicited and coming as a complete shock, many years ago my wife said that of course she was subject to me. One of the chief reasons I wanted to marry her in the first place was because, while she was sweet and accommodating in general, I’d also seen that she had real spine when people tried to push her around. My immediate thought: I’m not the boss of you!

The effect of having my wife tell me that she is subject to me – and I’m sure she means it – is that I’m aware that she is looking to me to lead, and what a terrible and wonderful duty that is. The last thing I’d want to do is push her around (not that I haven’t failed, but I don’t want to). Talk about throwing down the gauntlet: now I have to be as Christ to her! I think obedience, even to a clown like me, might be the easier task!

Once I did my best to accept that part, the whole ‘women silent in Church’ stuff seemed at least less offensive. If men must be in charge, but only in the sacrificial sense in which Christ is ‘in charge’, then reserving public leadership roles to men is defensible, and in fact could be beautiful if done well. The hard part: we will, and have, screwed this up. Our mothers, wives and daughters should be the most honored and respected people among us, as Christ shows repeatedly in His life, both in how he treats – and honors – the women he encounters, and in His command to love one another as He has loved us.

This cuts both ways, of course: men will err both in being bullies and in being cowards, women both in being harpies and in being shrews. We’re one big screwed-up family! Adam must now work and earn his bread by the sweat of his brow; Eve’s desire shall be for her man, and he will lord it over her. These two curses are perfect poetic justice for the sins. Doesn’t make it any easier on us.

And this dysfunction all gets reflected in the Church. It’s almost like Democracy, in that we tend to get the leaders we deserve.

It is tempting and easy to come up with ways to dismiss these troubling passages, or to let them destroy what little faith we have. I don’t know which is worse. It is better to embrace them, let them trouble us, and try to discover how they can be true.

This about plumbs the depths of my Scripture knowledge, so grain of salt and all that.

Theology: Developed versus Evolved

I’m part of a team at our local parish doing RCIA (Rite of Christian Initiation for Adults – the 6-9 month process an adult who wants to become Catholic goes through prior to 1st Communion, Confession, Confirmation and, if needed, Baptism). I’m sort of the philosophy/history person, although the director and a couple of the other people on the team are perfectly capable of covering it. I talk too much.

We use a variety of materials from a couple of sources, of varying depth and quality. One, addressing what exact topic I’m not recalling at the moment, used the word ‘evolve’ regarding Catholic dogma.

I probably don’t need to point out to many of the readers of this blog that ‘evolve’ is exactly the wrong term to use when discussing Catholic theology and dogma. ‘Develop’ is the right word to use.

First, evolve is used in (at least) 2 senses: the technical, biological sense, meaning changes to characteristics of a population over generations; and, more commonly, to mean ‘changing in a direction I like’.

The second sense is fundamentally dishonest, although I hasten to add that most people who use the word this way are most likely completely unaware of the dishonesty. They just picked it up from the way college-educated (“smart”) people talk, and would no doubt be baffled to discover educated people who object to that usage. What is dishonest is the replacing of ‘what I like’ with ‘what is obviously true’. Changes I don’t like are never said to be examples of evolution, but are instead given a pejorative label like ‘regressive’. This substitution takes place below the level of conscious thought almost all the time, I will generously believe for as long as I can.

Starting in the mid 19th century, Hegelians and their idiot children the Marxists met up with Darwin and his less clear-thinking offspring, the Darwinists, and discovered a happy (to them) marriage: the inevitable forward march of the Spirit/History was exactly like, nay, was perfectly embodied in, Darwinian evolution. Just look at how modern, more recent creatures are superior to ancient, outdated creatures! Why, it’s *just like* how modern, progressive ideas replace old, counter-revolutionary ideas by weight of their sheer luminous awesome superiority! It’s not a matter for argument, it’s a simple observation: just as dogs and elephants and canaries are obviously superior to velociraptors, diplodocuses and pterodactyls, democratic, scientific economics is superior to the primitive, competitive ‘free’ market.(1)

One remarkable thing in the history of ideas is how much effort, sometimes, the father or champion of a particular idea puts into saying exactly what he does and does not mean, while later champions steamroll any subtlety in their hurry to use what they see as the gist of the idea for their pet projects. Thus, Hegel is careful to say that the forward march of the Spirit as revealed in History does not by its very nature admit of its use as a crystal ball – that the whole point of this gradual revelation is that we *don’t* know the future. We require Revelation, which doesn’t depend on and is not subject to human reason. Marx, finding Hegel’s disposal of logic useful but having no use for the divine revelation in History that takes its place, immediately claims to know the future by virtue of his understanding of the Dialectic. It’s turtles all the way down, sure, but Marx has thrown out the top few layers of turtles and stands in midair. Charles Sanders Peirce, the father of Pragmatism, goes to great lengths to say Pragmatism is not merely the idea that the ends justify the means, only to have his great pragmatic successor, John Dewey, say exactly that.

Darwin himself does not use the word ‘evolution’ once in the 1st edition of the Origin of Species, and uses ‘evolve’ exactly once, as the last word in the last sentence of the work. (2) In the 12 years between publication of the Origin of Species and publication of the Descent of Man, followers of Darwin got labeled ‘Evolutionists’, so evolution does show up 30 times in the later volume. Darwin claims that the ideas he presents in Descent will no doubt result in the establishment of a scientific footing for psychology, since it’s clear (!) that consciousness and all other human mental characteristics and capabilities evolved from more primitive precursors in the lower animals from which man evolved. Somewhere in there, evolution, which is at its roots akin to a simple observation, just one small inferential step removed from looking at related living species and the bones of what might be their ancestors, became the fundamental characteristic of EVERYTHING.(3)

And Darwin was more restrained than his followers. We end up with the second meaning of evolution as describing ‘change I like’ as little more than a Hegel-light or Marxist/materialist clarification of what Descent is talking about.

Development is something much more organic and even ancient, having philosophical roots in Aristotle’s idea of Nature. A natural thing has within its nature principles of motion distinct from the accidental causes that might move it or, more generally, change it. An oak tree grows from an acorn. The principles of growth from acorn to oak tree are contained in – are the nature of – the acorn. The acorn might grow to be a majestic valley oak or a stunted oak among rocks or, indeed, get eaten by a squirrel. Those outcomes are at least partly the result of accidents. Growth from acorn to oak is by nature.

That gigantic digression out of the way, we now get back to theology. To understand that theology and church teaching in general might develop from what is already there should cause no one any heartburn. Any new understanding must point back to and be consistent with older understandings. An eternal God is impossible for us limited humans to fully understand, but as He is unchanging and internally consistent, so too must be our theology. People who want to contradict previous teachings must hope theology can evolve, meaning, as explained above, change in a direction they like, never mind logic or consistency. They hope, however unclear they are about it, for Hegelian revelations in history that are not subject to human reason and have no need to be consistent with what came before.

God is a God of Being – “I AM” – not a god of becoming.

  1. Unless we’re social Darwinists, in which case the same argument is made to support the opposite outcome of Übermenschen perhaps wiping a tear of passing weakness from their superior eyes as they witness the inevitable suffering and death of the less fit, before returning to their world-conquering ways. Beware theories that can be easily used to explain contradictory outcomes.
  2. “There is grandeur in this view of life, with its several powers, having been originally breathed by the Creator into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.”
  3. In writing this, it occurred to me that my love of The Origin of Species has blinded me to the mess that is much of Descent: the overly-cautious Darwin of Origin, fresh, no doubt, from lying in a field watching bees pollinate clover, is always willing to acknowledge criticisms and admit of lacunae. The more mature Darwin of Descent will talk about consciousness as being of the same species, as it were, as a bird’s colorful feathers. Both exist in the natural world (he assumes) and thus are subject to the same set of evolutionary explanations. It’s like I turn to the baby pictures of a beloved child who is now doing hard time, and pretend my baby is still innocent.

The State of the News

Not anything in particular except by way of illustration.

When did the news become The News? I don’t know and don’t have time at the moment to research it, but it’s good to remember that, somehow, for centuries, people made do with gossip and hearsay about almost everything we consider news today, delivered by word of mouth. So things haven’t really changed much, except at some point the gossip mongers and rumor mills got professionalized. They also added some research capabilities, and have greatly taken advantage of technological advances. However, based on personal experience – on what five decades of looking at the news has revealed – the content only marginally and occasionally reflects supposed improvements in research (“investigative reporting”) – it’s still mostly of the quality of what I’d imagine the women discussed around the fountain in the village square.

Instead, professional and technological improvements have mostly merely expanded the scope of what is to be gossiped about, without much improving the quality. Our poor benighted ancestors would only gossip about the foibles of the people they knew, maybe encompassing a few neighboring villages. Maybe the local aristocracy might come in for a few whispers. Now, we can hear gossip about ‘celebrities’ and politicians (insofar as those differ) round the clock and around the world. Based on what’s in the news and who and how many are paying attention to it, the research from the Moscow Bureau or whatever serves a tiny audience – at least, until the “investigative reporting” by “senior correspondents” is reduced to gossip, in cases where such a reduction is necessary. Thus, what exactly is going on in, say, Venezuela or Palestine is unlikely to see the light of day in the Press, and will be simplified beyond legitimate meaning before it sticks in anybody’s brain. The facts as revealed in conversations with just about anybody in almost any media sadly seem to bear this out.

We did go through a period in my lifetime where certain news anchors were canonized if not deified. Walter Cronkite springs to mind. They were trusted dispensers of the Truth. So was, I suppose, Walter Duranty a few generations earlier. But a harder look shows that such news anchors and senior correspondents had only augmented their rumor mongering with a bit of propaganda. The assumption that they were any smarter or better informed, let alone more moral and truthful, than your average garage mechanic or numbers racketeer is hard to maintain in the light of objective evidence.

But we trusted them. And, frankly, adored them. Modern reporters are now faced with an increasingly hostile environment in which only the Home Team even listens to them, and only if they say what the fans want to hear (not that there seems to be much of a risk of anything else happening). It has got to be hard when your idols are attacked, and even harder when what you, the cub reporter, had aspired to goes up in smoke: if you do a great job and get a few breaks, you still won’t be respected and loved like Uncle Walter. People will probably do worse than hate you – they will dismiss you without a thought.

Maybe. Some reporters still seem to think News Media are the secular clergy, and that those who oppose them are thus heretics in need of a good burning at the stake.

I wish I were kidding.

A couple of years ago, I wrote a blog post which included a discussion of one Zach Carter, a “Senior political economy reporter” at the Huffington Post. Over the course of an interview with a wizened political economist, Mr. Carter, it seems, was revealed to be both utterly uninformed on the topic of his supposed expertise – political economy – and, more worrisome, utterly unconcerned with his ignorance. In his defense, I’ll point out that the problem is really the Huffington Post’s, which gave him the job and title, and the University of Virginia’s, which gave him a degree supposedly as evidence that he knew some stuff. Also, he can’t be much older than 35, and the pictures on his bios would make Zuckerberg appear an ancient sage by comparison. He’s still getting carded, I’d bet.

I take this as evidence of a general trend, that of inflating titles far beyond the demonstrable qualifications of the position-holder, merely because a) the company needs that position filled, and b) the person filling it must appear to have a certain gravitas that it is hoped a ponderous title might give him. Moreover, as prestige and money in the news media continue to evaporate, papers are forced to recruit among people who will take reduced compensation in exchange for perceived prestige – that’s how you get Senior Vice Presidents and Senior Associate Directors.

Next, I’ve lost track of some item I thought I’d saved somewhere, wherein it was claimed that journalism is more and more becoming a profession for those who don’t need a job. I Googled around, and here’s what I could find: journalism in general pays a solid middle-class wage, in the neighborhood of $50k a year on average. Put another way, your average journalist makes around $25/hr, which is a bit more than an auto union member makes and, indeed, considerably less than what your plumber likely makes.

Yet another source (not going to provide links here – these items were on the 1st page of a Google search on journalist salaries, if you’re interested) mentions that journalists tend to be highly educated, and that the name publications, such as the New York Times and Wall Street Journal, have high representation of people with degrees from elite schools, like 40-50% among editors and reporters.

Now, putting two and two together: if I get a degree from Harvard or Yale, I’m not looking at taking a few years to work up to a $50k salary. We can safely assume the NYT and WSJ, prestigious and located in New York, pay way more. But even twice the average – $100K – is not providing the kind of life a Manhattan sophisticate is expected to live.  I’m not paying off any of the couple hundred grand of school debt I may have incurred by going Ivy League on $100k/yr if I’m also paying Manhattan level living expenses. Further, if this is true, that cub reporter in Des Moines is going to be lucky to make $25K, or about $12.50 an hour, if the averages are going to work out.

So, the allegation that people who don’t need the money are overrepresented among journalists is at least not contradicted by the tiny amount of data I was willing to dig up for a blog post. Speculating a little more broadly, it would not be surprising if the wannabe Zach Carters of the world, with sterling degrees, no significant school debt, and delusions or at least aspirations of relevancy, might end up in journalism, while people with school debt to repay and objectively valuable knowledge and skills would have less of that tendency. Who knows? But in the immortal words of Don Henley:

You don’t really need to find out what’s going on
You don’t really want to know just how far it’s gone
Just leave well enough alone
Eat your dirty laundry

Finally, year before last, when the Oroville Dam was having some serious issues due to a rainy season with near 200% of average precipitation, I wanted to keep up on the goings-on. Evacuations, the risks, if any, to the dam itself, mitigation and repair steps taken, that sort of thing. If you consume any mainstream news, you will not be surprised to learn that I found the information on offer from these sources sorely lacking.

So I surfed around. I discovered a YouTube channel run by Juan Brown, a gentleman who lives in the general area of the dam, flies his own airplane, and likes to make videos. Turns out that a number of people put together videos on the failure of the main and emergency spillways and on the California DWR’s efforts to manage the situation. (The CA-DWR is the manager of our reservoir system). The DWR P.R. department even hired some people with drones to put out dramatic videos every week or so of the damage and, later, the repair efforts. Very pretty stuff. But Mr. Brown was the only source that stayed on top of it and, most importantly, seemed to actually understand what was going on. When something crazy was said on the news – and, shocking, I know, but any scary-sounding thing got immediately picked up by all the news ‘sources’ – Juan would address it in his videos. No, he would patiently explain, cracks or leaks in the underlying roller compacted concrete are not an issue, as Phase II entails installation of drainage and placing of a hardened concrete cap on top, for example. Cost overruns were not due (this one time, at least) to bureaucratic incompetence, but to the inability to get a good estimate due to the need to do a lot of work to understand the underlying geology before being able to size the project. And so on.

He attended the DWR news briefings, and seemed to be the only guy there asking intelligent questions or, indeed, understanding the answers. As you can imagine, the PR people with the DWR and Kiewit, the project management firm, started to get to know and appreciate Brown. Last week, he published part II of a guided tour of the site, not something the general public is getting, led by a DWR and a Kiewit P.R. person.

At one point, off-camera, the lady from the DWR asked him a question: why are you doing this? Why are you so interested in this project? He gave the obvious answers: it’s in his backyard, it’s the biggest engineering project going on at the moment in the entire US, and he finds it fascinating.

Now, I have no way to independently verify the accuracy of Brown’s understanding and analysis of the Oroville Dam spillway projects, but I have a lot more confidence in him than I do any of the young, pretty people I’ve seen report on this in the ‘real’ media. Why? He asks the questions I would ask, and explains the answers in a way that makes sense.

Juan Brown is in some sense exactly the reporter who doesn’t need the money. He just doesn’t work for the media.

The appearance that needs to be saved here is the readily-observable ignorance and clear lack of worry over such ignorance by just about any news reporter or writer. The theory on the table is that careers in journalism appeal to a certain type of person: one who doesn’t need to make a lot of money, and who is attracted to an inner circle of sorts. The sort who can be paid in prestige, and who is not worried by, or perhaps fails to notice, their own manifest incompetence in the face of confusing facts.

In other words, the reason journalists are in general not any more reliable or informative than the women gossiping while drawing water in the village square is that, for many people involved, it’s not a passion for accuracy or truth that drives them, but in fact something much more akin to that feeling a gossip gets when she has something particularly juicy to share.

Maybe? Hey, it’s a theory, I’m sure there are others.

Perhaps next I should think about what news even is, really, and how much, if at all, we need it. I suspect not very much.

Voting is Like Taking Out the Garbage

Yes, over-the-top clickbait style title. Just thinking out loud here…


In order to have civilization, you have to take out the garbage.  When people are few and far between, you can dispose of your refuse any way you want, partly because you’ll likely produce little refuse, and that refuse will be biodegradable or at least ‘natural’, partly because you don’t have many neighbors to complain about it.

But once you get civilized – the root meaning of the word being ‘living in cities’ – garbage disposal becomes an active concern. Your neighbors very likely will care where you dump your garbage. Your own home will become a dump by default if you don’t make the effort to get rid of that stuff.

No one mistakes taking out the garbage for the purpose of civilized life, even though proper waste disposal is essential to it. Instead, if we think about it at all, we think of proper waste disposal as something we all do in order to make and keep space for doing what is more important to us. A comfortable, non-smelly home with places for meals, conversations, sleeping and so forth is the goal on a personal level; on a community level, we want similar standards applied to public places for similar reasons. Therefore, we take steps both for our personal garbage disposal and for methods and places to deal with our collected garbage.

Thus, every city, town and village has its garbage men and dumps. Public piles of trash outside of dumps are a sign that civilization is slipping away or has never completely arrived. Privately, hoarders, cat ladies and people who never seem to clean up their own messes are a tolerable nuisance, usually, but could become a public issue if their personal garbage gets too far out of hand.

Oh, the huge manatee!!

Few imagine that their success or excellence in dealing with garbage is a defining characteristic of their personhood. True, out here in California, you will meet the Prius-driving composters who would never use a plastic straw nor fail to recycle a soda can, and who think anyone who fails at these steps is Destroying the Planet and therefore probably irredeemably eeeeeevil. But even out here, people tend to be more sane than that, and take into consideration other personal factors, such as friends, family, hobbies, and achievements before marking a person for future culling once the right-thinking people achieve their peaceful, righteous totalitarian paradise.

Not so with voting! In two different senses, voting seems to be popularly considered an indispensable sign of full personhood. First, not having the right to vote makes one less than fully human in the minds of many. Second, to some, voting *wrong* makes one an unperson, as evil, stupid and suitable for extermination as people who consciously put plastic straws in the San Francisco Bay.

I contend, rather, that voting is much more like our duty to take out the garbage than it is a defining aspect of full personhood. Voting is something we do for the sake of other, much more important things. It is those important things – family, friends, possessions and the freedom to enjoy them – that give voting its meaning.

Historically, in America, we had a revolution to a large extent over the colonists chafing at the very idea that a government an entire ocean away could make and enforce rules and taxes without so much as a how do you do to the people to be ruled and taxed. Coming from Britain, the colonists had inherited a belief in a commonwealth reflected in common law – the idea that certain rights and duties had been established by centuries of precedent, and that the day to day laws were to reflect and reinforce those precedents. More simply, the English in Britain had one commonwealth, which included peculiarly English laws and traditions, royalty, parliament and so forth, while the English colonists in America had developed, over the centuries prior to the Revolution, a different commonwealth, which included, among other things, the practice of self-government. That the Crown would attempt to unilaterally impose its will with no regard to the colonists’ long-established practices shined a stark light on the fact that America was not the same naturally-constituted Nation as England.

In such an environment, the simple act of voting, of having a say in your own government, took on the sacramental quality of religious dogma. “All men are created equal, and endowed by their Creator with certain unalienable rights…” – this is a religious dogma in its very formulation. Compare this to the early English practice of holding local votes on local issues, such as when each man who bore arms got a vote on (local) questions of war: because it was my life I was putting on the line, I got a say. In medieval practice, a woman, or a teenager we might consider a child, might get a vote in local decisions if they were the ranking representative of their family. Voting was more or less tightly bound to personal duties and obligations the voter would be expected to be personally responsible for.

Having a farmer or miller vote on ‘national’ issues or ‘candidates’ made no sense, not least because the modern ideas of a nation or a candidate are complete anachronisms when applied to the Middle Ages. Instead, I, the local farmer, owed allegiance to a local lord, who in turn vowed to protect me and mine and to honor our rights. That lord owed allegiance to a greater lord in a similar way. Such allegiances might or might not roll up to a king or emperor someplace, but even such nested loyalties were built upon local, often face to face, loyalties, duties and rights.

The English systems grew out of these medieval roots, and, at the time of the Revolution, weren’t all that far from them. Indeed, the new Republic’s voting arrangements reflected those English roots to some extent: state legislatures selected Senators and decided how Electoral College members would be chosen; the President therefore worked for the States and only indirectly for the people. The federal judiciary was yet one step further removed from popular vote. Only the House of Representatives was directly elected by the people.

But this removal of most of the Federal Government from direct election by the People contradicted the dogma that government gains its legitimacy from the consent of the governed, and, even more important, the inescapable corollary that the individual is the sole sacred locus of all legitimate political power. It is clear from the Federalist Papers that insulating the bulk of the government from the whims of voters was an active goal, reflecting the republican idea that we share an inherited commonwealth that is not open to revision by vote. Such a commonwealth included the notion of individual rights, and government of, by and for the People.

The idea of the sovereign individual who reigns supreme via his consent given at the ballot box conflicts not only with the idea of an inherited commonwealth that his vote cannot overrule, but with reality in general. It seems the Founders assumed voters would be like them – men for the most part thoroughly invested in family, children and usually land. Those families, especially those children (“our posterity”), are the direct embodiment of the commonwealth. A voter could only legitimately exercise his franchise to support the commonwealth! A voter votes as a son, father, and husband, or his vote is not legitimate. Those of us who are sons, fathers and husbands get this instinctually.

This conflict between the sovereign individual and the family man produced by and protecting a commonwealth can go one of two ways: either individual sovereignty becomes THE measure of worth in society, such that not having it means being relegated to non-person standing, or voting becomes a secondary or tertiary thing that only has value insofar as it promotes and protects the commonwealth, which is the place where individual rights reside.

Further, if we go the sovereign individual route, the commonwealth itself cannot be off limits. We must be able to vote away our rights, for example, or we are not truly sovereign individuals – something completely contrary to what the Founders stated, but an inevitable result of the logic’s gravity.

In the hoary American tradition, we’ve mostly whistled past this issue for 200+ years while sliding with greater alacrity toward sovereign individualism. In a final twist, a large number of the latter-day recipients of the franchise – women, blacks, 18-year-olds – choose to vote for various flavors of the idea that the individual is nothing, the masses everything. Inheritances such as free speech and due process are attacked daily – by popularly elected officials. The gravitational pull of sovereign individualism toward destruction of the commonwealth is not just a theory.

Under a republican understanding, where a Republic consists of a common wealth held by all to the benefit of all, a citizen does not need to be defined as a voter. Citizens are all those who share fully in the benefits of the commonwealth. Voting becomes the means to an end: the protection and promotion of the commonwealth for the sake of family, and, particularly, our posterity. It would be absurd from this view to pit the right to vote against duty to family and Republic, since voting exists for the sake of those things. Under this view, voters should be those who are best situated to defend the Republic. The idea that voting could be allowed to drive a wedge between members of the same family would be a horror, or at least wildly counterproductive.

Rather than the ultimate expression of our full adult personhood, voting is more like taking out the trash. It needs to be done in order to have a civilization, but it is not that which defines us as full adults.

Finally, sovereign individualism flies in the face of reality in another sense: we Americans, with few exceptions, spend tiny amounts of time and effort on voting. If we really believed voting is the highest expression of our human dignity, maybe we’d hold votes more often than once every year or two? Maybe get the week before election day off to allow proper study of the issues and candidates? Perhaps have quarterly or monthly holidays on which to hold local meetings to discuss politics and try to understand our neighbors? In other words, shouldn’t we ACT a little more like voting is all-important if we claim to believe it is?

(Just realized I almost went full Starship Troopers here…)