Actual versus Potential: Aristotle and Quantum Probability

I only understand maybe 50% (and that may be optimistic) of the esteemed William Briggs’ latest post, but must share: Quantum Potency & Probability.

Here’s my take on the issue: I’ve heard most of my life about how, at a quantum level, reality is probabilistic. What this seems to mean to people propounding it is that reality, viewed on a fine enough level, is not governed by the laws of cause and effect, nor even by the law of noncontradiction. Things can come into being and pass out of being for no reason; and some things can truly be said to both be and not be at the same time in the same way.

To be fair, it’s not often put exactly like that, but it sometimes seems to be. (As is almost always the case, the better the scientist, the more careful they are about how they express themselves. Heisenberg was a great scientist, and so he was generally careful. His acolytes, and especially those who use him as a club with which to beat their enemies, not so much.) And to be honest, as mentioned above, it’s not like I understand the math or even the finer points of the experimentation that is claimed to lead to these assertions. What I do understand is that math is not reality, however useful and even indispensable math may be to our understanding and use of the world.

In his book on the philosophy of statistical analysis Uncertainty: the Soul of Modeling, Probability and Statistics (which I still need to reread and review here! Time eats life, as some French dude once said) Dr. Briggs takes great care to distinguish between epistemology – how we understand things – and ontology – how things are. Applied mathematics belongs to the world of epistemology. I am reminded of a section of the Feynman lectures where he pauses, after having filled a couple of large blackboards with equations, to note that it sure took a lot of math to describe what was, essentially, a simple motion, and that nature, in doing what it does, certainly isn’t doing all that math.

And, for me, that is the point. Just because quanta are nigh impossible to see and measure and appear to behave in incomprehensible ways doesn’t mean that their states are not caused, nor that they are anything other than what they are regardless of what we are able to deduce about what they are. It is a radical and unnecessary step, and contradicts the minimalist approach embodied in Occam’s Razor, to assume a new principle: that there are classes of uncaused phenomena, not just phenomena the causes of which we don’t yet understand.

The discussion on Dr. Briggs’ blog is far more nuanced and deep than my feeble understanding. One part I do understand, and which is commonly discussed on this blog: Insofar as science actually advances, it follows Aristotle and not any of the post-1630 philosophers. (1) Hylomorphism – the understanding that any object in the real world that we can consider is made up of form and matter – is, of course, how science routinely understands the world, even if the terminology has been beaten out of it. Modern science desperately wants there to be material and efficient causes only, and so does its best to pretend that there are no formal or final causes. This results in the absurdity of saying, for example, that a bird’s wings are not *for* flying, that it is not possible to describe them in terms of how they are to be used.

Of course, nobody talks this way except when pushed to the wall. But our analytic philosopher comrades, living on the cutting edge of the early Enlightenment, must insist that we don’t know and can’t meaningfully talk about formal and final causes lest we fall into the trap of *gasp* metaphysics. Can’t have that. Can’t live without it, either, but that just makes them mad.

Anyway, the most fascinating idea:

Additionally, hylomorphism entails a gradual spectrum of material beings with greater degrees of potentiality to greater degrees of actuality. Something has greater actuality if it has more determinate form (or qualities) and something has higher potency if it is more indeterminate with respect to being more receptacle to various forms. For example, a piece of clay has higher potency insofar as it is more malleable than a rock and thus more receptacle to various forms. A rock can likewise be modified to receive various forms, but it requires a physical entity with greater actuality or power to do so because it has more determinate form as a solid object… [H]ylomorphism predicts that you will find higher levels of potency because you are getting closer to prime matter. This is precisely what we find in QM. The macroscopic world has more actuality, which is why we experience it as more definite or determinate, whereas the microscopic world has far less actuality, thereby creating far less determinate behavioral patterns.

Briggs quoting Gil Sanders, “An Aristotelian Approach to Quantum Mechanics” (which I haven’t read yet, but will). My paraphrase: the higher up a thing is on the scale of being – the more ensouled, the more natural in the sense of having a fuller nature – the more primary is form. The lower one goes, the less primary is form. Thus I am a human animal, among the most natural objects in the universe, one who over my 60 years has had pretty much all the matter in my body swapped out one or more times. Yet no one sane doubts that my form – human animal – has persisted through all those changes. Once we get down to barely perceptible objects, we are barely able to perceive their form at all – all we can see are the mysterious undulations of prime matter as various forms subsume it. And this is what an Aristotelian would expect: less or lower forms, less nature, less definition.

Mind blown. I’m going to need to think this over a lot.

  1. 1630, more or less, is the year Descartes retreated to his room, drew the curtains, contemplated his navel and started producing the anti-Thomist philosophy that spawned all the crap since. I wouldn’t object to using 1517 as the real start date, but it’s Easter Week! We’re playing nice!

Evolution & Society

Speaking of getting more circumspect the more I learn: I’m treading more carefully on evolutionary topics these days than I used to. Thanks to those who have brought more depth to my understanding here and at other blogs and such, especially Mike Flynn. Of course, my continuing lack of understanding is nobody’s fault but mine. Onward:

There are here both a basic idea and a basic problem that war, or at least are made to war, with each other. First is the grand idea, not quite so grand as many imagine but grand nonetheless, of species arising under the pressure of natural selection. Darwin spends the first part of the Origin of Species (note: origin of species) discussing how farmers have always, more or less consciously, artificially selected the most desirable plants and animals for breeding and thus perpetuation. That’s the model to be kept in mind always when considering Darwin: natural selection is to be understood as analogous to what a farmer does.

Time & Eternity

It seems that a lot of modern arguments over the existence and nature of immaterial reality hinge on a misunderstanding of what classic philosophers mean by ‘eternal’. Fool rushing in, I, the least among philosophers, will try to explain in one blog post what Aristotle and Thomas and hundreds of vastly better thinkers have filled libraries discussing. But, hey, never stopped me before! And maybe it will prove helpful to somebody. Weirder things have happened.

Here’s how I understand the relationship between time and eternity in a Thomist/baptized Aristotelian scheme:

Da Man.  St. Thomas says so. 

Time is, as Aristotle says, the measure of motion. By motion, philosophers mean a change of any kind, not just changes in location. This definition may seem a little weird, but upon reflection it is what any other meaningful definition must boil down to. For something to be one way or in one place and then get to another way or place, time must pass. A moment ago, the ball was orange and at my feet; now it is green and over there. Some things have changed – time has passed.

Oddly enough, the key here is the verb ‘to be’ in its various forms. A mutable thing *is* at a given point of time; it *becomes* something else – green and over there – over time.

The funny thing: a man or a dog or tree or a river is what it is over the course of its life or existence, even though the material it is made of – meat or wood or water – changes over time. A man is the same man in some fundamental way over the course of his life, even if, as is the case, most of the material his body is made of gets swapped out, often many times, over the course of that life. Something persists over time that makes that man who he is, and it can’t be material. If it were matter, then a man would not be the same man after each meal or breath.

This fact, without which we could talk of no thing, has inspired much philosophizing and is at the roots of the Perennial Philosophy.  It is the recognition that some things are not matter and that talking and thinking about things requires a type of presence and persistence that matter alone does not offer.

Further, there are certain fundamental ideas to which no matter at all corresponds, that have no place in time whatsoever. No physical thing is a triangle or a rule of logic. Yet we are more certain of what a triangle is and what the law of noncontradiction means than we are of any of the ‘blended’ being we encounter in the physical world. These pure ideas are not mutable – it is of their nature that, if we understand them at all, we understand that they cannot change.

Some understanding of the nature of being falls out of this necessarily. Unchanging things belong to eternity. Eternity is not lots of time, or even infinite time, but rather is – something else. When we say that triangles, laws of logic, our souls or God are eternal, we don’t mean they last a long time, even an infinite (unbounded) amount of time. We mean they are of a different order of being.

Too humble to claim to be Da Man. But, really – he’s Da Man. 

Over the course of the Physics and the Metaphysics, Aristotle fills hundreds of pages with arguments teasing out what reality is like. The Philosopher concludes that things in time – all the common things we experience – are the way they are because of immaterial things. Ultimately, through however long a chain of causes (or ‘becauses’ if you want), everything is caused – is and does what makes it the thing it is – by an eternal, unchanging Unmoved Mover. This, as Thomas pointed out 1500 years later, is what everyone understands to be ‘God’.

In De Anima, Aristotle discusses the ‘soul’, by which he means the animating principle of all living things. Plants have souls which cause them to grow and reproduce; animals have souls that, in addition to growth and reproduction, allow them to sense and move about.  Men, as animals, have a soul that shares these powers. But men do one thing animals and plants don’t do – they understand.

Aristotle saw no reason animal and vegetable souls would be any less mortal than the material bodies they informed. Your dog dies – its soul is gone. The remains are no longer a dog in any coherent sense – dead means ‘its soul is gone’, and that soul is what made that dog a dog. A dog, or a petunia, or a person does not have a soul; a living thing IS a soul and a body – an immaterial form informing matter. For plants and animals, the distinction between body and soul is purely intellectual or even theoretical. In practice, every plant and animal is both, or it is not a living thing.

Aristotle puts a surprising number of mental activities within the realm of the animal soul, because he, unlike most of us modern men, lived intimately with animals. He could see that a horse or dog figured things out, imagined some sorts of things in the course of acting (like where the rabbit was likely to be hiding), and even, in the case of dogs at least, dreamed dreams. But men do some categorically different thinking. We are capable of knowing eternal things, of pondering triangles, moral law and God Himself. Aristotle saw that this kind of thinking is different in kind from anything animals do, and so recognized a third kind of soul, the rational soul or intellect.

Here’s the next logical step, one I can’t fully spell out in a blog post: souls capable of contemplating eternal things must themselves be eternal, at least in some sense. Aristotle isn’t clear that this sense is personal as we understand it – that each individual human being has a unique immortal soul. Thomas spells this out: each human being has a unique immortal human soul that is and must be a direct creation of God.

The human soul is a creature of eternity. When we speak of our eternal home, we don’t mean a place within time, except with way more time. We mean a state beyond human understanding, of which we have only the faintest ideas as if seen in a mirror darkly. Somehow, within the Eternity that is God Himself, all creation from beginning to end is loved into being. Somehow, we have been given the incomprehensible gift of Time, within which we get to act on our nature formed in the image of God by understanding and creating and especially procreating.

As mystical as this all sounds, Aristotle, no Christian and no respecter of gods, got almost all the way there as a result of pure, hard-headed reasoning. He asked the hard questions: how is it that we know anything at all? How do we know about things like math, logic and the moral law that don’t materially exist? How is it that the world is so rationally ordered? In modern times, we flinch, and instead ask sophomoric questions and smirk suicidally at our own cleverness as we assert that our better questions are unanswerable: do we know anything at all? Are math etc. knowledge at all? Is the world really rational, or is that just us projecting?

Then we answer them. It is not clever to saw off the branch you’re sitting on, especially considering how high off the ground you are. To say we know nothing, that only material things exist and that what appears as an orderly world is just a projection, wishful thinking or a construct, is to destroy any basis for understanding or even communicating.  It’s not more reasonable. It’s just another flavor of the impulse that drives teenagers who snap back at their parents: I didn’t *ask* to be born!

More “Progress”

Surfing for job-related reasons, I came across this article (which I link to be polite; life is too short to read such things unless you’re paid to do it). I was led to ponder: related to Chesterton’s point about classrooms – that students learn what the schools assume, even as they ignore what the teachers say – is the notion that it is the assumptions underlying an essay such as the article linked above that carry any message that might stick.

What message would that be?

Peter Drucker, the management guru, is often credited with the all-too-true saying that “culture eats strategy for breakfast.” In a later era, tech guru and investor Marc Andreessen famously said that “software is eating the world.” Now … there’s a growing realization that culture is eating software for breakfast, and perhaps lunch and dinner as well.

The challenge for IT executives and developers alike is addressing corporate culture and organizational issues that complicate even the best intentions.

There’s more in a similar vein. In fact, there really isn’t anything else in this essay.

This man has the world at his feet! Or by the feet. Something like that.

I suppose a Cobbler’s Guild, faced with the daunting challenge of filling blank electronic pages, might publish articles about how nobody’s going anywhere without shoes, and there must be a meeting of minds between the shoemaker and the shoe wearer. People wear shoes at breakfast, lunch and dinner! We have a shod culture! Imagine the solemn duty, the awesome dignity we, the shoemakers, have to lead the culture – in comfortable, stylish footwear – into a glorious future.

Note the relationships implied in these short sentences quoted above. Culture, which we might think of here as simply the conventions honored by people when they function together, eats strategy. The implication – Drucker is a *management* guru, after all – is that the culture should be *managed* in order to better facilitate acceptance of strategy. Andreessen, an alpha geek, stands Drucker on his head and says software is eating the world. (Software assumes the rhetorical position held by culture in the previous sentence – hmmm.) I suspect he might not see this world-consumption by bug-ridden and ephemeral tech as an entirely bad thing, or at least sees it as an opportunity of some sort. Sounds like a horror movie plot to a sane person.

An artist’s impression of Software.

IT people face ‘challenges’ in addressing corporate culture that complicate ‘even the best intentions’. Who, then, would be having these intentions? Would it not have to be the people in charge of the corporation, who have more or less intentionally shaped the culture?

IT people, who are legendarily among the least socially clued-in people on the planet, are to see trivia like other people’s intentions and culture as mere obstacles to their own intentions, which they summarily and conclusively presume are the *best* intentions. IT intentions contain, as one circle nested inside another in a Venn diagram, any *worthy* intentions of the customer.

I wish this were exaggeration. Instead, it’s not the half of it. Man-with-a-hammer style, IT people tend to more or less consciously believe that, always and everywhere, top-down, expert-driven, we-know-what’s-best-for-you solutions are not only the best solutions, but are, definitionally, the entire set of possible solutions.

And it gets worse! Because of various tech booms and consumer gadget-lust, technology leaders are often rich, insulated by money from those factors in the real world that stood a chance (however slight) of smoothing off the jagged edges of their hellish ideas. AND that money allows them to ACT on those unpolished ideas.

Woe unto us, and our children! Those ideas will fail in the long run, as all ideas untethered from reality eventually fail. But the damage inflicted as they thrash in their death throes would be something to behold – if we weren’t the folks getting thrashed.

Our heartfelt appreciation of a good, solid, comfortable pair of shoes does not, I should hope, incline us to appoint the cobbler God-Emperor. Our humble gratitude is what is due, and should be enough. IT is glorified cobbling, no more the fount of wisdom than any other rather narrow craft. But try telling that to the tech billionaires.

Let’s paraphrase Heinlein:

Throughout history, ignorance and hubris are the normal condition of man. Advances which permit this norm to be ameliorated — here and there, now and then — are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from enlivening the culture, or (as sometimes happens) is driven out of a society, the people then slip back into abject ignorance and hubris, and recommence killing each other with appalling gusto.

This is known as “bad luck.”


On Progress and The World as Grass

Two interesting posts from two of my favorite regular blog reads:

Mike Flynn says:

“We often hear that the rate of progress is accelerating. Change is coming faster and faster. Things that were once pooh-poohed as “slippery slope fallacies” only a few years ago are now spoken of as inevitable and well-established. We are building something new, we are told.

“Yet a building being constructed does not move faster and faster. A building collapsing does, as it accelerates under the force of gravity.”

Brian Niemeier says, among other things:

There’s another, more sinister aspect to this phenomenon that heightens the already disorienting experience of learning that the Weird Al single you’d meant to buy on release but kept putting off is now old enough to drive–like children born on September 11, 2001 are now. It’s an empirical fact that Western pop culture–and even Western technology itself–has remained largely static since the late 1980s.

Submitted for your consideration:

  • The last two generations of iPhones have had no new features.
  • The celebrated iPod performed the same essential function as a 1970s Walkman.
  • Movies and TV are dominated by sequels to film franchises and adaptations of comic book story arcs that first gained popularity in the 70s and 80s.
  • Nintendo is still the biggest name in video games, trading on IPs it established in the 80s.
  • In terms of ordinary street clothes, popular fashion hasn’t changed substantially since the 70s. You could zap the average American twentysomething dude back to 1988 right now, and no one would bat an eye, except perhaps to comment that he looked like a slob. There would be no Marty McFly-style gaffes, e.g.: “Hey kid, you jump ship?” “I’ve never seen purple underwear before!”
The issue is bigger than a generation of kids raised on Nickelodeon turning 40. As the 21st century lumbers out of its infancy, we find that the music-makers can only sample Vanilla Ice ripoffs of Queen songs; and the dreamers can only dream of the lifestyle their parents took for granted.

We’d better get some new dreams.

I commented on these thoughts, respectively:

Good image. Also, from working in the software industry: progress almost never means coding – or, more generally, the stuff you can see happens as a result of the real progress, but is not progress in itself. Almost all the progress happens before there’s anything to show for it.

Two wildly different examples: in my industry, meaningful progress happens during the ‘thought-smithing’ stage, where sharp people figure out what’s really going on, what’s really necessary. Ideas and processes crystalize. THEN, if you’re lucky and did a good job, coders code, and there’s software to look at. But coders code and produce stuff to look at all the time – it’s called ‘shelfware’, beautiful software nobody wants, so it sits on a shelf. Conclusion: the software itself isn’t where things got made better.

Second, in honor of the upcoming feast of St. Scholastica, a lot of real progress was made more or less unintentionally when the great Benedictine monasteries were built. The Rule of St. Benedict and the motto Ora et Labora ARE the progress – they ALLOWED the monasteries to spread, thrive, and change the world through being consistent pillars and sources of stability, civilization and technological development. It was almost like having a cultural mom and dad, who, just by being there and not budging, allowed the kids to grow up more confident and optimistic.

Corollary I: few people ever see where the real progress is made; they only see the results of real progress and imagine those results are causes rather than effects.

Corollary II: What people most tout as progress probably isn’t – which I suppose is your point.

And:

Your point about new gadgets is good. I suspect the number of ways people can be distracted is not all that flexible, so a cool gadget that really hits the spot has nowhere to go. Technologically speaking, phones, games, movies can only improve on the margins.

Look at the new gadgets people seem to be pining for: robots (especially sexbots!) don’t DO anything really different, they just free up more time – for what? New gadgets? Flying cars are called ‘airplanes’. Otherwise, we want *better* books, phones, games, movies – the same things, only better. Real progress, in most ways we spoiled consumers define it, has come to a halt.

Hegel and by extension all other believers in Progress as a sort of benevolent force at work in the world hang their faith on the very evident material progress made over the last 250 or so years. In his Logic, Hegel in fact asserts that it is obvious traditional logic needs to change (in the sense of be destroyed) as it alone among the arts and sciences has remained ‘unimproved’ since Aristotle. He sees Progress at work in the world, and anything not progressing as being, as the cool kids say, On the Wrong Side of History.

A story told by Feynman springs to mind: he was once on a scientific junket of some sort to, I believe, Brazil, and was asked about the problems of the poor and whether science had anything to offer. The specific example was how slum dwellers needed to march a long way down a hill to reach potable water, and then haul it back up to where they lived. Feynman pointed out that all the technology, all the science needed to solve this problem existed and had existed for decades or centuries: run a pipe up the hill and put in a faucet. Whatever the reasons for that simple solution not having been done, science wasn’t one of them.

The Antikythera Mechanism. A beautiful dead end. ‘Ahead of its time’ – whatever that’s supposed to mean!

In a similar way, most of what we see as progress day to day is application of technologies developed years earlier. And, worse, it’s almost all fluff – unless you need cutting edge medical care. Even then, chances are the cutting edge is built on ideas that have been around for decades. Our TV and phones and cars are marginally better than they were 10 or 20 or 50 or a hundred years ago – but they serve the same purposes, and the new improved versions have improved our lives little – unless we measure improvement in gadgets.

Real progress is messy, difficult and relies on changes of heart and mind more than on any mere material invention. The Greek philosophers legendarily considered caring (much) for practical improvements to day-to-day life to be beneath the dignity of a real man. Practical progress of a sort was made in some arts – Archimedes is a legend himself – and then there’s the Antikythera Mechanism. But the outcome was not airplanes and moon landings, or even better plows and printing presses – it was constant internal bickering, followed by conquest by the Macedonians and then the Romans, and jobs as tutors to their conquerors’ kids.

What the Greeks were missing was ‘why’. Certainly, they were brilliant, curious and ambitious enough to have accomplished so much – yet it made little material difference. It took the influence of Jerusalem and Christian Rome to provide a civilization with enough room, enough hope, to turn the random, intermittent ‘progress’ characteristic of men whenever and wherever we live into a program, a communal effort.

If we are made in the Image of God, and the Heavens proclaim His glory, and the world is His handiwork, then applying our minds to understanding the world is a worthy activity. We can use that understanding to better serve our brothers and sisters. We needn’t accept the way things are. Christians are the only people who as a culture were not indifferent to the lives and deaths of the poor. Romans and Greeks, Indians and Chinese would have considered it an affront for a poor man to have the temerity to die on their doorstep; a Christian would be expected to see it as his own personal failure. Look what I have done to the least of these!

Only if despair is considered cowardice and treason will we persevere in our efforts to help the needy. Only in a culture of hope and duty to one another can material progress become the norm. Such material progress is a side effect of a change of heart.

To the nihilist, relativist Progressive, technology is a tool of power, and science is a bother when it does anything but serve politics. True science, which is no respecter of men if it is science at all, is a threat to power. It follows where it will – and we can’t have that!

But we can have more and better gadgets, and live an ephemeral life. Until we don’t.

Personal Interlude

I’ll be 60 in 2 months. This is cause for self-indulgent navel-gazing self-reflection. Also, I’m feeling a bit better, let’s see if I can write anything.

The only things in my life I’m unequivocally happy about are my marriage and our children. Work? Nah. Grim necessity that is made worthwhile by the just mentioned wife and kids.  I’m a stone expert in certain arcane corners of equipment finance. Not a great conversation starter. I dread answering the question: what do you do for a living? I tend to say ‘sell software’ because it’s true, although not really the heart of the story – which no one wants to hear anyway.

Got a boatload of hobbies that have evolved over time. Love to make things out of wood – our house is full of bookcases, tables, shelves, and boxes I’ve made.

e.g., this triple bunk bed for the younger daughter’s room. Put in rails after this picture. 

For the last few years, it’s been bricks:

The woodworking I’ve been doing since I was 5. The first thing I remember trying to build was a boat, out of scraps of paneling left over from redoing the garage. Remember cutting a piece into a gothic arch sort of shape, and trying to attach sides with finish nails – yikes! Didn’t get real far, but kept at it for a good while, as my handsaw chops were, I imagine, only slightly better than your typical 5-year-old’s. Realized it would never work because I could never get the seams closed enough to hold water. I remember sitting in it and pretending, though.

My proudest childhood achievement was a total remake of a 4′ x 8′ playhouse my older brothers had built earlier, when I was 11:

  • Added a 2nd floor, which required reinforcing the ceiling/roof;
  • Repurposed a ladder from a bunk bed into a super-cool retractable ladder hinged to a board that fit into the ceiling – the whole thing was balanced by a series of pulleys, nylon cord and a coffee can full of rocks, so that when you lifted it, it just rose right up into the ceiling;
  • Added a door and windows that could be closed;
  • Added some railing around the top floor so kids wouldn’t fall 60″ to their deaths.

Ended up converting the playhouse into a workspace for balsa wood models, of which I made maybe 3-4.

Also, at age 5, my mother let me plant some pansies in a little spot by the front porch. I was fascinated by them, watched them grow. I have no green thumb, but do love growing things. Put in an orchard this past spring:

I’ve tried and mostly failed to grow stuff over the years, in the sense that, for example, the few tomatoes I’ve grown are very expensive even if I value my time at next to zero. I can’t even grow zucchini. But I keep trying.

Back to my wasted youth. Then we moved. At age 12, started working for my dad on Saturdays and eventually summers at his sheet metal fabrication shop, sweeping floors and cleaning up the scrap metal. Eventually learned to do most activities except welding (a failure I regret to this day) and setup of the fabricators and presses. (I was pretty good with a blowtorch – 35+ years ago!)

Dad had a heart attack at 59 that nearly killed him, and turned him from a high-energy maniac into a more plodding and easily-tired maniac. His doctor told him he had to sell the business. Neither of my older brothers was interested in working with my dad, I was all of 18 at the time – and so, after a 15 year run, Astro-Fab was sold, and my parents and youngest brother moved to Newport Beach.

Skipping over the boring basketball/drama/choir combo that occupied my time in high school (and made me the oddest of ducks even before you factor in my reading habits – V-II docs, Plato and Asimov’s non-fiction, for example. Fit right in!), we get to a possibly odd little fact: I grew up in a blue-collar household, where achievement meant making something you could see. There was no value placed on what might be called intellectual achievements.

This bias toward stuff you can, as Ted Nugent says, bite – and away from less concrete achievements – I absorbed with my mother’s milk. It just is. College was, in some sense, baffling to me: unlike high school, which was filled with students who could have hardly cared less (or were careful to project that image) about intellectual stuff, here were all these people my age who, for example, kept papers they’d written! Like the written word was some sort of achievement to be proud of!

I could not imagine. Intellectually, I get it, but even now there’s a part of me that whispers: writing is not work, it’s not worth anything. (This same voice tells me in the same way that I, likewise, am not worth anything. Package deal.)

I try to battle on. When I decided to write music (left out the part about taking piano at age 15 – bless them, the folks were cool with it), I developed a beautiful music script, even going so far as to get some calligraphy tools to make sure it was pretty. This, despite my handwriting being all but completely illegible. See, I think I needed to make it pretty to look at in order for me to think it was worth anything. Or something – all I know is that, when I wrote music, I compulsively wrote it out again at least once, to get the spacing right and clean it up. Pretty sure I spent as much or more time writing it out as I did composing the music in the first place.

Had one musical triumph: got a composition teacher in Santa Fe when I was maybe 23 who also directed the Santa Fe Women’s Ensemble. After a few lessons, she told me the Ensemble would perform a piece if I wrote one for them.

Wow. So I threw myself into writing something, decided to go ultra-traditional and set the Kyrie. The first part was very much inspired by traditional polyphony; she told me to make the Christe part contrasting – which I overdid, a little harmonically adventurous, let’s say. Anyway, it was OK – I spent hours writing out a beautiful copy, even got a calligrapher friend to do a cover page – and they sang it, people paid to go to that concert, it even got reviewed (favorably – the reviewer compared my piece to Victoria – I blush!).

And – can I find that review? Can I find that recording? I can lay my hands on the music, I think, because I made a bunch of copies for the Ensemble – in an accordion folder somewhere.

Was I thrilled? Did I go on to be a composer, at least as a hobby? No, and pretty much no. Have a small pile of pieces, almost all incomplete, almost all 35+ years old. They molder.

Around this time I decided I actually enjoyed writing. This was pre-word-processor, and I don’t know how to type (this self-indulgent dump is brought to you by fast hunt & peck). Don’t know why I liked it. But here we are: half a dozen years, 1200+ blog posts and a million words later. Got piles of mostly unfinished stories and parts of maybe 3 novels accumulated over the last 30 years, doing the electronic equivalent of moldering.

So: can I spend the years left to me overcoming a lifetime of failure to follow through and complete intellectual things, and get some stuff finished?

Stay tuned.

And pardon me for the self-indulgent nonsense.


Man Was Not Meant to Think Alone

I’ve long been struck by the philosophical and theological sundering of man from other men that began in the 16th century. Since ideas matter, as Sola anything and Cartesian navel-gazing replaced living tradition and the Question method and, indeed, the very notion of a ‘school’ of thought, these bad ideas have also resulted in the physical separation of people from each other.

You need people, lots of people, for there to be traditions. You need people, generally a good number of people, to have a school of thought. Neither traditions nor schools of thought are created and maintained through correspondence or Twitter. Real, often obnoxious, people rubbing elbows make them and keep them alive. In the case of Sacred Traditions, those people included the Person of Jesus and His apostles and disciples, and their disciples down to the present day; schools of thought, at least until that fateful 16th century, were formed, developed and reinforced by actual scholars, often in actual physical proximity to each other in actual physical schools, arguing, yelling and occasionally knifing each other (1). It may not have always been pretty, but, boy, you can’t get any more human than that!

In the early 1500s, Luther declares his ‘Alones’, shifting the standard of religious study from monasteries, which, despite the ‘mono’ in the name, were gatherings of men, to the lone plowboy reading the Bible all on his lonesome. Sure, that plowboy might benefit from talking with others, but in theory, all he needs for spiritual enlightenment is the Good Book and the ability to read it.

In 1630, Descartes goes to his room, pulls the curtains and writes his Meditations, shifting the process of philosophy from what men can figure out by interacting with the world around them – most particularly, interacting with the *people* around them – to what a man such as Descartes, Hume, Berkeley or Kant can figure out in the privacy of his own cranium. If that cranium can even be said to be known to exist.

A gaggle of philosophers. That’s old school! That’s how you do it!

If we hold being Alone in our theology and philosophy to be the highest court above which no appeal can be made, how long will it take for us to assert that being alone in our personal judgements about, say, culture, government and my true self are likewise beyond appeal?

About 500 years, evidently.

Three things this day bring this to mind. First, this excellent essay by David Mills: The Bible’s not enough, which discusses the pervasiveness of Sola Scriptura even among Catholics. Second, a Twitter thread (so shoot me. I mean, think less of me.) where Morgon Newquist tells of her father, in a wheelchair at Disney World, offering to let a little girl sit in front of him to have a better view of a parade – and the parents react like he’s a child molester. Finally, I’ve recently become part of the RCIA team at our parish, and was given the task (and 10 minutes!) of explaining how the Church reads Scripture.

We are so Alone. The ruins of go-it-alone theology and philosophy are everywhere. Rather than discovering ourselves in our relationships, we defiantly declare that only we alone can say who we are, depending solely on what we feel we are. We define *individual* rights, and deny they come from nature or nature’s God or even from our relationships to other people. Even the right to vote – especially the right to vote – is seen as definitive of *individual* worth, even if it is only practiced occasionally, and then as part of a large group for the purposes of the large group. It is an expression not of my role in society, but of my personal universe of truth. Thus, instead of seeing losing a vote as a worthy and acceptable outcome and as motivation to try to change people’s hearts and minds, each loser feels personally threatened, and sees the victors as evil people trying to destroy his world.

Many seem to both want rights and want to be able to define them away from others. You must bake me a cake or give up your guns even if neither has any real effect on me, but I get to tell you who I am (and woe if you mess it up) and what worldview you must adhere to so that I can feel good about my feelings. This trick is only possible for a more or less unconscious nihilist, who of course believes others’ worthiness depends on how well they support his view of himself, but it also betrays how meaningless he feels his own feelings are.

The antidote is religious by definition. We must believe we are all in this together, that nobody can go it alone, in order to understand why modernist nihilism won’t work. Or rather, why modernist nihilism should never be tried. We can try, doomed though the effort is, to believe in the unity of Mankind without believing in the God Who created that unity. But with or without God, the Brotherhood of Man is like the Equality of Man: nothing you can observe will support such beliefs unless you already believe them without evidence.

  1. Documents relate to “a student who attacked his professor with a sword” resulting in great damage being done to a lecture room – and to the lecturer himself. From Medieval Students. Violence in medieval university towns was not uncommon. I suspect there’s more than a bit of bias, both in the recording and interpretation of history – violent acts are memorable and judged noteworthy; a period of peace, not so much. Read somewhere somebody saying that, by modern standards, the violence of the past was psychopathic. Of course, modern standards tend to overlook violence like the firebombing of cities, nuclear weapons, and the slaughter of 100 million unarmed civilians by their own governments, so take that into consideration.