Starting next Friday, I’ll be teaching a class in American History at this school, to an uncertain number of kids – 2? 10? – of an uncertain mix of ages – probably middle and high school, but could be some younger, even much younger kids involved.
Although I’ve never taught history to anybody, I’m going to make up my own syllabus because I can. The kids are unlikely to know anything, but are taking the class because they want to, which means engagement is a given.
Here’s what I’m thinking:
– Lots of background;
– Lots of fun stories;
– 1 date to remember each class. Catch: You need to have a basic idea of when things happened to have any context at all, yet dates don’t mean anything out of context. So, we’ll start ’em slow.
I’m thinking ten 45–60 minute segments, as follows. Only expanding on the first couple for now:
1. Prolegomenon: World Geography, Food, how people got where they are, American Indians.
We’ll start by reviewing basic world geography. Think I might introduce a few of Diamond’s more credible observations about resources and migrations. Talk about food – getting it, keeping it, fighting over it – and how it figures into how big and sophisticated your cities can be. Arrival of first Americans. Mound Builder cultures. The migration of maize.
This week’s date: 13,000 BC – about when American Indians are thought to have first arrived.
2. European Discoveries: Vikings; Italian Navigators (Columbus, Zuan Chabotto, Amerigo Vespucci); conquest; diseases.
This week’s date: 1497 – when John Cabot found a Basque cod fishing fleet off Nova Scotia.
For example, look at some of the goofier species, frogs and toads. In general, their policy appears to be: if it’s moving and I can get it into my mouth, I will eat it.
Here, a cane toad, a species noted for its voracious appetite (cane toads are known to have eaten pounds of bees at a sitting), eats – a bat. Why? It was moving, and it fit in the toad’s mouth. This toad would gladly eat you, if you met the criteria.
Happy ending? The toad eventually determined that it couldn’t get the bat down, and spat it out, and it flew away. The bat, not the toad. Therefore, one utterly disrespected and somewhat repulsive (to many) species failed to eliminate a member of another, even more repulsive (to many), disrespected species even though it had ‘im right where he wanted ‘im.
Zombie movies are not my cup of tea, but they sure seem popular, almost as popular as ‘vampires are just misunderstood blood-sucking murderers yearning for love’ movies. So, clearly, we’re in the presence of Deeper Truth, wherein popularity among people who can’t follow an argument going 2 miles an hour accompanied by flagmen, lights and sirens is seen as conclusive evidence that Inner Truths are being explored, Existential Uncertainties being unfolded for our catharsis, and Yearning Hearts are being touched.
Do I have a theory? Why yes. Yes, I do. Does this theory bear any shadow of resemblance to reality? You’re never going to get an ‘A’ in this class with that sort of attitude, mister.
How do zombie movies both touch and reveal our Inner Selves and explain our Place in the Cosmos? Let me count the ways:
– We’re surrounded. Zombies are undead in some sense; at least, they are not as alive as my friends and me. Ain’t *that* the truth!
– Zombies want to eat our brains. Who among us has not felt our brains being slowly devoured by yet another sitcom, talent show or State of the Union address? Also, notice how you often can’t come up with the name of that guy, you know, the one you were just talking to yesterday? *Somebody* is eating our brains.
– If we’re not careful, we will become one of them. We shall start staggering about, lusting for brains. We will be unable to relate to normal people. Nancy Pelosi will start making sense.
But this is very superficial. Let’s dig deeper. Zombies are a metaphor for how your typical American experiences intellectuals. (Hey, you’re gonna have to work with me here a bit, OK?)
Normal people aren’t lusting after brains. In fact, in the words of B.O.B.: “Turns out, you don’t need one! Totally overrated!” Clearly, B.O.B. has been watching recent federal politics. Be that as it may, for anyone just getting by, doing their job, living their lives, it seems these socially clumsy and somewhat repulsive intellectuals are somehow after them. After their brains.
Why brains? Well, that’s part of the problem. Modern intellectuals, most often found in universities, are really after our minds, driven inexorably to make us all into mindless sycophants and slaves. But – and here’s some irony – the process that turned them into zombies in the first place did so by insisting that they deny that there even is a mind. Their chief tools – Science! and the various flavors of dialectical materialism that constitute the dogmatic foundation of all the soft sciences and fill_in_the_blank Studies departments – both assert that there is no mind – it’s turtles all the way down! What I mean is that what we think of as a mind is asserted to be an ’emergent property’ of brains. So, in their confusion, they want to eat the *brains* of all the undergrads, who, having already fallen into their clutches and been bribed by grades and the promise of acceptance into zombie society, lay their heads on the table right next to the cutlery. Figuratively speaking.
How? They start by assaulting our grasp on reality, attempting to loosen our grasps on our minds, which they think of as brains, by shrouding every idea in a debilitating verbal miasma:
– Orwellian euphemisms. Why say somebody’s poor when you can say they’re socioeconomically disadvantaged?
– Impenetrable circumlocutions. Whatever you do, don’t talk about reality using short, concrete nouns in simple declarative sentences.
– Smug dismissiveness. Make sure the students know when they’ve been stupid, but don’t ever explain how, exactly, what they said is wrong.
Once enough confusion has set in, you can start shoveling on the theory, confident that the teenagers in your charge completely lack the desire – let alone the mental tools – to refute you.
Here’s where things take two paths. On the science side, Materialism is conclusively assumed. In the words of the immortal Ted Nugent: if you can’t bite it, it doesn’t exist. Since the unbiteable includes just about everything that gives any meaning whatsoever to existence, we just dismissed any excuse for studying science – except for a pure act of Will, which happens to play nicely into the other path. Ever been unhappy about anything? If you’re not a white male, you are the oppressed victim of an oppressor class. Once this position has been granted – and you are unlikely to graduate unless you grant it – then everything else falls into place. Now, since the early steps in the process removed even the idea of objective reality from the realm of possible explanations, you’re now free to expound this theory or these theories (flavors vary to fit the need) unencumbered by any facts whatsoever. Were your mom and dad deeply in love? Your mom was the unwitting victim of male domination, no doubt suffering from Stockholm Syndrome.
And so on. The end products of this process are intellectual zombies, half-dead in their minds, prowling for more funding. Victims, I meant victims.
So, now, assume you’re a relatively normal guy or gal. Whether you went to college or not, how could you avoid the deep, chilling feeling that you are being stalked? And so as dread and foreboding grow in our minds, we conjure up some sense from all this – and we dream we’re living in a Zombie Apocalypse!
Never having watched a zombie movie in my life, here is my reckless prediction: the next great trend in Zombie stories will be two hideous tribes of zombies, to the living all but indistinguishable from each other, at war, each hell-bent on annihilating the other. Except they instinctively unite to pursue the living, only returning to devouring each other when there are no living available.
You know, because it kind of follows from the theory and all.
Where do I go to sacrifice a goat to the Scientific Consensus?
But wait – is a living goat a net sequesterer (is that a word?) of carbon? Does he emit (ahem) more carbon than is locked up in his carbon-based body? Seems I’d need to sacrifice him by freezing or burying him in a hermetically sealed container – burning is right out. And none of this addresses the net effect of a goat eating vegetation, which packs away at least some carbon, emits (!) some other – yet promotes the growth of more vegetation, which locks away more carbon….
Man, this is as complicated as deciding to get a Prius.
* Rainy season in this part of California is from October to May, with most of the rain in November through April. Summers are usually completely dry. So, we’re having a bit of what qualifies as a downpour around here – 1 day before the end of summer, 9 days before the ‘rainy season’. I’ll keep an eye out for someone in the press asserting that this is somehow related to global warming, and report back.
As is sometimes the case in otherwise bad flicks, there’s one very memorable character in the rather wretched movie ‘Sahara’: Matthew McConaughey’s/Dirk Pitt’s sidekick Al Giordino. Al acts as the Voice of Reason and reacts more or less as a sane person would in an otherwise insane movie, which makes him pretty funny – though not enough to overcome the dreariness and implausibility of the plot (hint: finding a Civil War ironclad in the middle of the Sahara is *not* the most implausible thing, not even close – go figure). Heck, even adding Penelope Cruz to the landscape couldn’t save this movie.
After the umpteenth unlikely escape/incoherent coincidence, Our Heroes find themselves wandering in the Sahara. Al speaks for all of us in the audience:
Al Giordino: Hey, you know how it is when you see someone that you haven’t seen since high school, and they got some dead-end job, and they’re married to some woman that hates them, they got, like, three kids who think he’s a joke? Wasn’t there some point where he stood back and said, “Bob, don’t take that job! Bob, don’t marry that harpy!” You know?
Dirk Pitt: Your point?
Al Giordino: Well, we’re in the desert, looking for the source of a river pollutant, using as our map a cave drawing of a Civil War gunship, which is also in the desert. So I was just wondering when we’re gonna have to sit down and re-evaluate our decision-making paradigm?
Dirk Pitt: [coming up on the fortress seen in the cave painting] I don’t know – it seems to be working so far.
You know where this is going, right? Dirk has somehow found completely implausible success through unreason – it doesn’t have to make sense, because it works like magic. Heck it IS magic. Al, our Sancho Panza, has noticed this, and asks if it wouldn’t be wiser not to rely on magic, but to instead apply a little common sense. How about a decision-making paradigm that doesn’t rely on timely bits of irrational magic to work?
Yet, because this is a movie, Al is shot down by – yet another timely bit of irrational magic.
Remind you of anything?
Here’s yet another reason I’m Catholic: I don’t have to wait around for timely bits of irrational magic to make sense of my beliefs. My decision-making paradigm goes something like this:
The Truth is One. Logic and reason, and history and science all point toward the same truth as Scripture.
Wisdom and holiness are inescapably accompanied by beauty, goodness and truth, which are conjoined triplets. If you are missing one, there will be problems with the others;
Reason is good. Logical thinking honestly applied points to the truth;
Yet even the wisest and holiest people are prone to error, and I’m not remotely the wisest or holiest guy around. Therefore, I will not rely solely on my own judgement – I know from bitter experience how wrong I can be;
Instead I will look to see where holiness and wisdom reside, and go there, and follow;
The signs I will look for are beauty, goodness and truth. Reason will never be contradicted by beauty, goodness and truth, but will be reinforced by them.
I, along with millions of Catholic intellectuals over the centuries, find comfort and peace in God through the Church which is His Body. There are mysteries, to be sure, but, as in any good story, the mysteries are introduced up front, not used ad hoc to annihilate logical problems. In other words, the story is meant to help us appreciate and understand the mysteries, in however limited a way we can understand them; the mysteries are not introduced willy-nilly to salvage the story.
I don’t have to shout down Al when he points out that my decision-making paradigm is doomed and is getting us into trouble. I don’t go into the desert hoping there’ll be an ironclad a thousand miles from navigable waters, trusting that it will all turn out OK because some miracle – like someone having scratched instructions on the wall of a cave that I’ll just happen to wander into – will take place just when I need it.
Miracles abound, to be sure, but not primarily to resolve – or rather destroy – logic and history. Scripture, written in the Church for the Church, preserved, honored and studied by the Church for 2,000 years, is asserted to, if properly understood, destroy the Church. The logical problems with this are resolved by a series of presumed miracles, or, rather, magic tricks: the Great Apostasy, an event presumed to have occurred at no particular date, for which – magically – there is no evidence apart from that which – magically – appears once the event is conclusively assumed to have occurred (the Church is asserted to be both diabolically good at destroying evidence – and incompetent at destroying evidence. It’s magic!). Then a ‘real church’, one that made no converts nor left any other evidence of its existence, is presumed to have existed underground, because the Spirit that guided it is all in favor of hiding one’s lamp under a bushel basket. Meanwhile, the apostate Catholic Church, which was sending missionaries all around the known world in accordance with Christ’s Great Commission (today is the feast of some 16th century Korean martyrs, who received the faith from Japanese missionaries, who received the faith from awesomely brave European missionaries, for example), evidently did so only to diabolically throw people off. What’s amazing is that these people were willing to die horrible deaths instead of renouncing, not Christ, but rather their horrible ruse. Now, that’s dedication.
And so on. Renee Lin is doing the heavy lifting on these topics over on Forget the Roads, so go over there if interested. My point is that the method we choose for deciding what to believe – our decision-making paradigm – makes a huge difference in what we assent to as truth. If we make it a bedrock test that Truth is One – that history, logic and science honestly understood all tend toward the same Truth as is revealed in Scripture – then we don’t wait around for magical explanations, like believing that Scripture is both luminously transparent and yet was fatally misunderstood for 15 centuries until some German monk figured it all out.
Metaphysics is not a dirty word. Neither does it consist of arbitrary choices that have no meaning or repercussions in the real world. Rather, getting your metaphysics right – and you do have them – is the key to understanding anything and living well.
The dictionary says:
the branch of philosophy that deals with the first principles of things, including abstract concepts such as being, knowing, substance, cause, identity, time, and space.
Metaphysics, as the name implies, is a study of the foundational beliefs that must NECESSARILY be true before any theory of physics can be proved true.
I tend toward the simplest functional definition:
Metaphysics is what must be true if anything is true.
Metaphysics requires the tacit acknowledgement of the goodness and desirability of Truth. If you don’t care about truth, as in: “Truth? What is that?” then perhaps you could truly claim to have no metaphysics – although simple physics, such as is evident in sitting, standing and emitting sound orally is pretty difficult without metaphysics, if indeed the impossible can be called difficult.
Here I will state, in completely nontechnical terms, what any metaphysics that is at all useful for a seeker of truth must hold:
An objective world exists;
I am not the only person in that world;
It is possible to learn about that world and the other people in it through perceptions;
Meaning and truth can be conveyed by words.
Notice a certain non-hierarchy here. In order to seek truth, it is not required that one make any metaphysical assumptions about the physical or mental world being in some sense the primary truth – there is no mind-body problem. Aristotle’s standard approach was to proceed from what is most knowable to us – immediate particulars – to what is most knowable in itself – more general ideas. But this does not mean the particulars are any less real or true than the general ideas, or vice versa.
Mike Flynn had a neat definition, too (surprise, surprise!), which 15 minutes of searching didn’t turn up. Something about the study of being as being, as opposed to physics, which studies particular beings – but that’s not quite it. Must cut and paste right when I see cool stuff.
Just Thomism points out that it’s wrong to think that modern philosophers mean the same thing by metaphysics (and other core terms) as the perennial philosophers do. I’d add that, in the case of Hegel’s use of ‘logic’, the meaning is opposite the traditional meaning.
Pretty ambitious title for a short essay. Mostly, it’s just numbers out of context and musings. Context would require at the very least wondering about who makes up the various bands – the top 5%, those under the poverty line (and using a consistent poverty line), the middle class – all over time. Ever wonder if, over forty years, the nature of the people the data is being collected on has changed? Personal example: my father ran his own small sheet metal fabrication shop in 1973, and probably made a top-5% income. Of course, the tax rates at the time were very high, so he would have been highly motivated not to take taxable income above certain threshold levels – thereby inadvertently swelling the ranks of the middle class by one family right at the top end, as opposed to swelling the ranks of the upper middle class at the low end. Forty years later, only two of his nine children make anything like the (inflation-adjusted) kind of money he made in 1973 as a 56-year-old highly talented and experienced sheet metal fabrication expert with an entrepreneurial streak. Several of my siblings are probably at or below the poverty line, and several more make good middle-class livings. None work in sheet metal. All this is to say that when we’re comparing rather arbitrary groups like ‘the top 5% of income earners’ 40 years apart, we’re making a bunch of radical assumptions about real people, the chief of which is that all differences make no difference as long as the aggregation is big enough – kind of like psycho-history, which, it is good to recall, was and remains fiction.
In this essay, the author points out that incomes are flat for middle-income people over the last 40 years, but not flat for the top 5%. Consider that the top 5% – the individual people, as well as the manner in which they make their money – has almost certainly changed for at least a large part of that crowd. In 1973, that group might have had stock brokers, lawyers and CEOs, but few high-tech entrepreneurs, not to mention college coaches and professional relief pitchers. Does the mix of jobs really make no difference to the analysis? Also, some people make their money and then stop – top professional athletes and early Google employees, for example – they may have little if any reported income, but live what we would call an affluent life off their previous earnings. I would suspect – but have no numbers – that the number of well-off and retired or semi-retired people has grown since 1973, at something like the rate of growth of retirement communities plus the growth rate in, say, sailboat sales. Who knows? But ignoring that population calls into question the rest of the numbers.
Cassidy doesn’t consider the entire top 5% in his analysis but rather, the 95th percentile. What this means is that we’re excluding the top 1% – the 99th percentile – and, especially the top 0.1% – the 99.9th percentile. I suspect two reasons for this: the first is that the very top wage earners can skew things wildly. A Derrick Rose, for example, just went from about $5 million a year to $17 million. Of course, he might make that number for the next 6 or 8 years if he’s lucky, then, as an old man of 35 or so, he’ll retire. He’ll have bought himself all the houses, cars and health care he’ll ever need by that time, and may therefore, if he’s lucky and smart, live like a king on little if any taxable income – maybe even below the poverty level! His accountant and tax attorney will work it out.
Second, his audience is readers of the New Yorker. We can back into the demographics of that readership by looking at the ads. I’d guess the 95th percentile is probably squarely in the middle of that target. (Just yesterday, overheard a conversation between two New York professionals, at least one of whom lives in Manhattan, in which the tribulations of the life of a friend were discussed – his options were sadly limited, as he only made “mid-six-figures”. Oh, the humanity!)
Comparing group A – the 95th percentile from 1973 – with group B – the 95th percentile in 2013 – would seem to require a lot of caution, research and caveats. Just assuming that normal generational changes, not to mention social upheavals, have no meaningful effect is reckless, and is among the many reasons I don’t give too much weight to these sorts of comparisons. Anyway, here’s a bullet point from the above:
At the top of the income distribution, things look very different. Forty years ago, a household in the ninety-fifth percentile of the income distribution—i.e., a family with nineteen families below it for every one above it—earned $133,725. In 2012, a household at the same spot in the income distribution earned $191,156. That’s an increase of forty-three per cent.
Just a little math: 43% over 40 years comes out, through the wonders of compounding, to an average annual growth rate in income of just under 0.9%, or about $9 per $1,000 per year of additional income. So, the mythical average well-compensated individual in the 95th percentile might see around a $1,500 increase in pay in an average year – less taxes, which run over 40% here in California. So, it’s not like the 95th percentile were running off to vacations in Bermuda with their extra $800 each year. But, through the wonders (and probably illusions) of compounding, after 40 years we’re talking real money. All one has to do is enter the 95th percentile at age 25 – not happening very often – and work 40 years to see one’s wages, on average, rise similarly. Or, what’s more likely yet not discussed: the 25-year-old enters the workforce a lot closer to the poverty line than the 95th percentile, and, after those 40 years of work, manages as an old man to crack the top 5%.
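For anyone who wants to check my envelope, the arithmetic above is a few lines of Python (a sketch using only the dollar figures from the quoted bullet; treating the 1973–2012 span as a round 40 years, as the article does):

```python
# Figures from the quoted Cassidy bullet: 95th-percentile household income.
start, end, years = 133_725, 191_156, 40

total_growth = end / start - 1                    # cumulative growth over the period
annual_rate = (end / start) ** (1 / years) - 1    # compound average annual growth

print(f"total growth: {total_growth:.1%}")        # ~43%, matching the article
print(f"annual rate:  {annual_rate:.2%}")         # just under 0.9% per year
print(f"per $1,000:   ${1000 * annual_rate:.2f} of extra income per year")
```

The point of compounding here is that the yearly raise looks trivial – under $10 per $1,000 of income – even though the 40-year total sounds dramatic.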
Of course, it’s better to get the extra cash than not, and the average middle-class wage earner was getting only a few dollars a year extra over the same period, but – are we really going to foment some outrage over $800 a year? Does that make the 95th percentile manipulative greedy bastards? Do we believe that this $800 is being used to buy senators and stack the economic and tax policies of the nation in order to favor people making $150k/year? Really?
My suspicions by now should be clear: what seems to be the case among the people I know is that young people tend strongly to get less well-paid jobs than older people with a ton of experience who have patiently worked their way up the ladder. In many occupations, that ladder has been destroyed or shrunk – the idea that a factory worker could make an ever more lucrative career out of factory work just by putting in the years is pretty ridiculous nowadays, if it ever existed outside of a few high-end union jobs. Just as the economy told people to get out of buggy-whip making, it’s now telling anyone with ears to hear that you don’t want to be unskilled or semi-skilled factory labor.
I’d want a much better breakdown of the data – as well as data on people moving in and out of various percentiles as they age or move, on how people enter and leave the workforce, and on how this has changed over time, and goodness knows what else – before I’d consider whether blanket, population-level assertions might be valid. Just because some data submits to high-school-level statistical analysis doesn’t mean it tells you anything.