College and the Big Evil Corporation Model

Here’s an idea to keep in mind when thinking about our wonderful universities and colleges: these ivy-infested institutions are, when you get right down to it, rich, evil corporations.

Super rich titan of industry, or major university president? Why not both?

Now, this notion, like most things this simple, doesn’t explain everything about ‘higher education,’ but, if judiciously applied, should serve to weed-whack some really stupid ideas and clear the ground for some actual thought. Plus, it’s factually true, at least about the name-brand institutions. Harvard, the big dog, has a $38.3 billion endowment, $44.6B in net assets, and an annual operating budget of $4.5B. For comparison, General Motors has net assets of $55.2B.

So, here goes:

Giant heartless corporations try to convince everyone they simply must have their products. You’ll never get ahead if you don’t have a college degree. You want to be a failure, like George Washington, Lincoln, or heck, Harry Truman? You want to live like that poor trade-school educated welder down the street, who owns his home, is debt-free and can get another job in about 15 minutes if he needs to? That’s what will happen to you if you don’t get a degree! Even though it’s patently nonsensical, doesn’t just about everyone you know think a college degree is all but essential to the good life?

To keep costs down and control high, evil corporations sow uncertainty and insecurity among their workers. You’ve all heard the stories about how evil corporations use the threat of replacing workers with fresh-off-the-boat immigrants to keep them in line and keep them from demanding more pay and better working conditions? Talk to a college professor lately? They all know that there are hundreds of people willing and able to take their job if anyone on campus finds anything at all lacking in them. Colleges used to offer tenure; now it’s rare, as most classes in most fields at most colleges are taught by adjuncts and grad students. Not only are those non-tenure-track people cheaper, they send a message to the tenured profs as well: we got backup plans if you screw up.

Giant, evil corporations willingly sell cheap, inferior products whenever they can, to maximize profits. To be admitted to Harvard 150 years ago, back when profs got tenure and under 10% of people went to college, you needed to pass a Greek and a Latin exam – and a calculus test. A college education *started* from a baseline that far exceeds the intellectual achievement of most PhDs today. (FYI: Most PhDs today are in education and social sciences.) Since only a tiny fraction of any population is likely to have the inclination and talent to learn Latin, Greek and calculus merely to get in to a good college, for the last century or so, colleges have been dumbing down their offerings to make sure they sell as much product as possible.

The first step was education schools, which generally date back to the second half of the 1800s. For the last 150 years, inferior students (of course, there are exceptions. I assume.) who could not make it in a traditional college (think: Liberal Arts/Great Books + math, science, music, art, where that Latin, Greek and calc would be put to use) could major in education, even get a PhD by doing ‘original’ research, and then get faculty positions teaching the next round of unqualified students. Over time – I’m estimating the other shoe fell around 1990 – the unqualified/dumb people with PhDs in participation-trophy fields came to outnumber professors who might have a real education in something, and began to call the shots and simply quash any opposition. You get stuff like this, for example (H/T to Rotten Chestnuts).

As a business strategy, as a way to maximize profits, this ‘create majors unqualified/dumb people can do’ has been a big winner! All studies fields, plus the non-RAD fields like English, History, Sociology, Psychology and so on, exist primarily to take the money from people who would not otherwise be able to hack college. Comparing such degrees to what a university degree used to be (and still is, in a few Great Books schools and the more RAD disciplines in some major schools) is like comparing finger painting to a Raphael portrait. Which is why the super-well-educated college grad is likely to say the finger painting is just as artistic as the Raphael…

Evil, rich corporations use their political influence to get the government to act in their best interests, despite what is good for or desired by people in general. It would be just like an evil corporation to get the government to all but require their product, create an elaborate tax-payer subsidized finance scheme to put people into debt to buy their product, and then try to get the government/tax-payers to take the bullet when the product doesn’t perform as advertised.

Student loans, anyone?

Enough. I’ve got an Academic VORP follow-up essay I’m working on, but it requires real thought. Plus, there were some very good comments I didn’t answer because I wanted to expand on them. Sorry about that. Anyway, it’s now 3 days since I’ve written about bricks. Count your blessings! I mean, um, thanks for reading this humble blog.

The Allure of Psychology

One thing a classic liberal education is supposed to do for you is make you suspicious of ideas you find emotionally attractive. Like the brutal honesty demanded by science, it is just assumed to rub off on students who work their way through all those tough classic texts. Just about every freshman finds Plato attractive. Like the young men who followed Socrates around just to see him lightly eviscerate some pompous fool, we thrilled to the discovery that pompous fools could be eviscerated, and craved more. Then we run into Aristotle, and don’t like it much, because he, effectively, says: enough with the fun and games, time to stand your ground and say what you mean. Perhaps some of us get the idea that Socrates would have met his match, or more, in Aristotle (although I suspect they would have gotten along pretty well while having some doozies of arguments. Socrates must have been bored out of his skull with the Ions and Menos of the world.)

Then, as you move on through the list, one precious idea after another gets beat up. You think that you’ve reached the pinnacle of sophistication as an 18 year old who has learned that the only thing he knows is that he knows nothing, only to have that self-refuting notion beat up by Aristotle’s moderate realism. Then, perhaps, you see how Aristotelian metaphysics and epistemology lead to places you might not want to go, making Descartes very appealing. But Descartes leads to Hume, Berkeley, and, eventually, Kant, while Thomas leads to science. So now, maybe, Descartes is less appealing, and you take another look at Aristotle…

Thus, by a million paths, the serious student learns to take extra care about accepting too readily ideas that he finds attractive, because he finds them attractive.

When I read Alice Miller‘s books 30+ years ago, I found her ideas very attractive, even though her Freudian approach was seriously off-putting. I like to say that Miller was a fallen-away Freudian, but had not fallen away nearly far enough. What made her assertions more acceptable to me was how well they fit with evolutionary theory. On the fly as I read her books, I would substitute arguments from natural selection for hers, the unholy offspring of Freud and Rousseau.

Brutal honesty moment: in other words, I back-filled psychological theories I found emotionally appealing with evolutionary just-so stories. I get it. I suppose my purpose in writing this out, apart from trying to make it as clear as possible to myself, is to invite criticism.

What are these theories? I’ve mentioned them before, but never in great detail. Here, I’m paraphrasing them based on 30 year old memories and replacing Freudian turns of phrase with Darwinian language. These start out as truisms (I should hope) but turn dark:

  • For their very survival, children need to be part of a family/tribe (Extended family – I’m just going to use ‘tribe’ from here on out). In our evolutionary environment, no children lived to reproduce outside of a tribe. Therefore, intense selection pressure has been applied to children in favor of group membership and against running off or doing anything that might get them excluded. (1)
  • As sophisticated social mammals, children by instinct incorporate whatever behaviors are required for tribal membership into their base understanding of the world as foundational assumptions. (This is nothing more than saying ‘tribalism’ is a base state for humans and is pre-rational). Kids don’t think about these requirements (much), they just are.
  • We see it in the ‘attachment-promoting behaviors’ of babies and toddlers before they are even aware of what they’re doing. As they grow, their behaviors become more complex and more specific to their particular environment. In this, people are only the most sophisticated among animals – your cat and dog do this as well.

All well and good, and I hope not too controversial. It should be noted that the reciprocal activity on the part of the adults – nurturing the tribe so that the child might survive – must also be a part of any environment of evolutionary adaptation. So parents and relatives – the tribe – can be expected to behave in such a way as to promote the survival and integration into the tribe of its children. That’s the model that seems to have been developed and to have worked over the last half a million years or so, at least. There’s nothing necessarily nice or pretty about it – it’s just what works.

But what happens when, as in the modern world for the last couple hundred years in many places, many people survive despite having no tribe in the evolutionary sense? What happens when the brutal culling mechanisms of Darwinian survival get put on hold? Whatever else may happen, it is now possible on a scale and to a degree never known before for children to be neglected, abused, and traumatized – and still live, and perhaps even still reproduce.

  • Children who are neglected, abused and otherwise traumatized will, through the all but inexorable drive of instinct, incorporate their neglect, abuse and trauma into their pre-rational view of the world. Miller, in her decades of work as a psychoanalyst, noted a remarkable ability of her patients to excuse, ignore and explain away the objectively horrible things done to them – which is what one would expect, under the evolutionary explanation above. Aside: this, at least, seems to be obviously true from just routine interactions with people.
  • So we have a world increasingly filled with damaged children of all ages who, for basic survival reasons, have accepted their mistreatment at the hands of those who were supposed to love them, rationalized it, and who are highly motivated to accept it as part of their tribal membership fees.
  • It gets worse: as part of the emotional mechanisms that ‘worked’ insofar as they did in fact survive into adulthood, their experiences and coping mechanisms now become the template for how to raise any children they might have. Thus, Miller observed the pattern where someone who had been sexually abused as a child, even if they were not themselves an abuser, would routinely put their children into situations where they were likely to be abused. To do otherwise would be to confront the careful structure that allowed the parent to survive in the first place. Very painful and disorienting.
  • This is expressed in the title of one of her books: Thou Shalt Not Be Aware. To acknowledge one’s own mistreatment enough to protect one’s own child requires reopening some deep and carefully scarred over wounds. Rather than do that, we readily subject our kids to what we experienced, no matter how horrible.

Miller says that a sympathetic witness, someone who understood the trauma and abuse on some level and could tell the child that it wasn’t right, was all but essential to having any hope for healing. That witness provided a counter to all the stories the kid would otherwise make up in order to keep his membership in the tribe: that daddy didn’t mean it, that momma does really care, that what uncle did wasn’t so bad, and so on – all the little myths one runs into whenever one is drawn into other people’s dramas. Lacking such a witness, it seemed to Miller all but impossible to get past all the barricades built up by the child.

So, there you have it: I see – I think, that’s the question – people reenacting in their child’s life whatever it was that traumatized them as children: people who were abandoned at 15 abandon their own kids as teens; children of divorce get divorced; sexually abused kids become libertines and expose their own kids to that life; and so on, in a million ways.

There’s more, but that’s the general outline. I’m not just saying that miserable childhoods tend to make for miserable adults. I’m saying that miserable childhoods tend to all but compel people to make their own children miserable in the same way.

Anyway, make any sense? I readily acknowledge that Miller is a loon – I read most if not all of her books, and she gets into speculation that’s little better than palm reading in many places. And, as mentioned, even though she became one of Freud’s harshest critics, she still thought and spoke like a Freudian. Am I just experiencing confirmation bias when I seem to see this inflicting of one’s childhood trauma on one’s own children everywhere I look, or is it real?

  1. And, of course, tribes can’t survive without children, either, so, at least by nature, tribes care about their children as passionately as children yearn to belong. Note that this doesn’t imply any sort of lovie-dovie niceness: the ever-popular Yanomami tribesmen raise their sons to be good little homicidal sociopaths, because that approach has been proven to work. Similarly, their daughters are raised to seek the most murderous sociopaths as mates.
  2. And then expanded, by design, to school, with its artificial and arbitrary tribes of classrooms and grades. But Miller doesn’t go there, as far as my memory can recall.

An Epidemic of Diagnosis Revisited, Sort Of

Walker Percy, Lost in the Cosmos

I was pretty sure I’d written about this 2007 essay in the NYT, but evidently not, according to a search of blog archives. In it, a doctor points out something that should be obvious: as we add more healthcare to our lives, we are going to get more of what healthcare produces. He notes that what healthcare produces is not health, but diagnoses. Healthcare professionals are in the business, first and foremost, of telling us what is wrong with us. Only once we’ve been diagnosed can the wheels of the machinery of the healthcare system turn.

Improvements in healthcare have helped us to generally have longer, healthier lives than at any point in history. Not as much as the real, if unromantic, big three of plenty of food, clean water and good sanitation. Get those three right – and avoid wars, but these things go together in practice – and people start generally living pretty long lives, long enough to die from old people causes like cancer and heart disease instead of dysentery and rickets.

We have more healthcare, the largest, most expensive healthcare system in history, because we’ve demanded, in the literal and economic senses, more healthcare. We’re loath to admit the human body is simply too complex and fragile, too prone to break down and suffer injury and disease, for lifelong health to be anything other than a stroke of luck or a blessing, and even more loath to come to grips with the inevitability of death. Nope, we ignore all experience, and come to think vigorous good health is what we deserve, and that something is horribly wrong if we don’t get it. We demand, pounding our little feet and throwing haystacks of cash, that Science cure cancer, prevent diseases and in general fix any and every problem we have.

We demand: Tell me what’s wrong, then FIX IT! We’ve learned, from our car mechanic if nowhere else, that sometimes it’s hard to figure out what’s wrong. We’ve internalized the reality that, sometimes, a diagnosis is just an opinion among many possible opinions. We’ve learned to seek a second or third or fourth opinion until we find one we like.

We have not learned, it seems, to distrust the diagnostic process itself.

If this is true of mundane things like cars and the human body, it is much more true of the difficult and subtle human mind and soul. We desperately seek a diagnosis, which, when we get one we like, becomes our identity. In English, we even say: I *am* on the spectrum; I *am* depressed, or gay, or transgender, or oppressed, or a million other things. To attack our diagnosis is to attack us. We are our diagnoses.

Listen for it: so often, these days, we will hear a person’s diagnosis within minutes of meeting them. “I’m a *blank*.” As if *blank* is the key information, as in, it unlocks and settles everything.

As the Percy quotation above shows, in his inimitable style, this need to have someone else tell us who we are is so pervasive and compelling that many millions of us turn to astrology (1) or worse just to get our diagnosis. What the diagnosis is, as he demonstrates, matters far less than that we have one. We are so lost in the cosmos that we’ll cling to anything to stay afloat, anything to distract us from facing the brutal reality of who we really are.

If only this self-delusion were the worst of it. The real killer diagnoses are projection and protection. In the first, rather than engaging with those we disagree with or otherwise find unpleasant or in any way disconcerting, we diagnose them as having a disorder that prescribes summary dismissal: they are only saying that or behaving that way because they are *blank* – religious, conservative, liberal, stupid, evil, etc. (Note that it’s the prescription that makes this a diagnostic ritual: if you simply observe someone is promoting progressive ideas, say, BUT do not use that as an excuse to not engage with them, that’s just basic intellectual awareness.)

Even worse, as mentioned under point 3 in the last post, is the diagnosis of others that lets us off the hook: my child *IS* ADHD, therefore drugs and your acceptance are the only answers. No fair looking at his home life or parental and school expectations that make demands upon the kid that he’s not interested in fulfilling and cause him massive stress – nope, it’s drugs and acceptance, AND THAT’S FINAL! (2)

But of course it’s the gender dysphoria diagnosis that’s the real killer – literally, given the suicide and self-harm rates among those confused about their sexual identity. It’s the parents who love that diagnosis, because now everything is known, the prescription is complete, and every future problem is accounted for in advance. And no finger is ever allowed to be pointed at me, no matter how chaotic and emotionally abusive my serial polygamy or just plain rutting around is for my kid. Nope, we must bless and worship his confusion, so that even he embraces the insanity. Just so long as I’m off the hook.

Marxist social criticism at its best, destroying lives, sowing unhappiness and using the weakest and most damaged among us for their political ends. Not to mention promulgating some of the stupidest ideas known to man.

  1. Have to tell this off-topic story: in my early 20s, I somehow got dragged into a panel discussion of astrology, as the token Thomist (I wish). So I did research – Thomas does address astrology in an interesting way – and gave my little spiel (and that’s yet another story). Afterwards, at a party at St. John’s (lived in the neighborhood), I mentioned that I’d done a bunch of research on astrology for this talk, noted that Thomas doesn’t summarily dismiss it, and an attractive young woman I barely knew said: really? What sign am I? Somehow, I managed to deadpan a reply using appropriate astrological bafflegab, after which I answered: therefore, you’re an Aquarius (or whatever). I guessed right. Her mind was blown. I managed to keep my straight face, knowing that I would be right about 8% of the time, and had gotten lucky. Not ‘lucky’ lucky – you know what I mean, get your mind out of the gutter!
  2. Mandatory disclaimer: there very well might be something to ADHD diagnoses, I don’t know, but there’s certainly something to ADHD over-diagnoses.

AI, AI, Oh.

Old MacBezos Had a Server Farm…

Free-associating there, a little. Pardon me.

Seems AI is on a lot of people’s minds these days. I, along with many, have my doubts:

My opinion: there are a lot of physical processes well suited to the very fancy automation that today is called AI. Such AI could put most underwriters, investment analysts, and hardware designers out of a job, like telegraph agents and buggy whip makers before them. I also think there’s an awful lot of the ‘we’re almost there!’ noise surrounding AI that has surrounded commercial nuclear fusion for my entire life – it’s always just around the corner, it’s always just a few technical details that need working out.

But it’s still not here. Both commercial nuclear fusion and AI, in the manner I am talking about, may come, and may even come soon. But I’m not holding my breath.

And this is not the sort of strong AI – you know, the Commander Data kind of AI – that gets human rights for robots discussions going. For philosophical reasons, I have my doubts human beings can create intellect (other than in the old fashioned baby-making way), no matter how much emergent properties handwavium is applied. Onward:

Here is the esteemed William Briggs, Statistician to the Stars, taking a shot at the “burgeoning digital afterlife industry”. Some geniuses have decided to one-up the standard Las Vegas psychic lounge routine, where by a combination of research (“hot readings”) and clever dialogue (“cold readings”), a performer can give the gullible the impression he is a mind reader, by training computers to do it.

Hot readings are cheating. Cons peek in wallets, purses, and now on the Internet, and note relevant facts, such as addresses, birthdays, and various other bits of personal information. Cold readings are when the con probes the mark, trying many different lines of inquiry—“I see the letter ‘M’”—which rely on the mark providing relevant feedback. “I had a pet duck when I was four named Missy?” “That’s it! Missy misses you from Duck Heaven.” “You can see!”

You might not believe it, but cold reading is shockingly effective. I have used it many times in practicing mentalism (mental magic), all under the guise of “scientific psychological theory.” People want to believe in psychics, and they want to believe in science maybe even more.

Briggs notes that this is a form of the Turing Test, and points to a wonderful 1990 interview of Mortimer Adler by William F. Buckley, wherein they discuss the notions of intellect, brain, and human thought. Well worth the 10 minutes to watch.

In Machine Learning Disability, esteemed writer and theologian Brian Niemeier recounts, first, a story much like the one I reference in my tweet pasted in above: how an algorithm trained to do one thing – identify hit songs across many media in near real time – generates a hilarious false positive when an old pirated and memed clip goes viral.

Then it gets all serious. All this Big Data science you’ve been hearing of, and upon which the Google, Facebook and Amazon fortunes are built, is very, very iffy – no better than the Billboard algorithms that generated the false positive. Less obviously, people are now using Big Data science to prove all sorts of things. In my gimlet-eyed take, doing research on giant datasets is a great way to bury your assumptions and biases so that they’re very hard to find. This is on top of the errors built into the sampling, the methodology and the algorithms themselves – errors upon errors upon errors.

As Niemeier points out, just having huge amounts of data is no guarantee you are doing good science, and in fact multiplies the opportunities to get it wrong. Briggs points out in his essay how easily people are fooled, and how doggedly they’ll stick to their beliefs even in the face of contrary evidence. You put these things together, and it’s pretty scary out there.
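A toy example of how the errors multiply: sift through enough columns of pure noise and some pair will look impressively correlated. Nothing below models any real Big Data pipeline; it’s just the arithmetic of making many comparisons:

```python
# Generate 50 columns of pure random noise, then go hunting for the
# strongest pairwise correlation among them. With ~1,200 pairs to
# search, chance alone produces an impressive-looking "finding."
import random

def corr(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(42)
cols = [[rng.random() for _ in range(30)] for _ in range(50)]  # 50 noise columns
best = max(abs(corr(a, b)) for i, a in enumerate(cols) for b in cols[i + 1:])
print(f"strongest 'finding' among pure noise: r = {best:.2f}")
```

The more columns you have, the stronger the best spurious correlation gets – which is exactly the problem with fishing expeditions over giant datasets.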

I’m always amazed that people who have worked around computers fall for any of this. Every geek with a shred of self-awareness (not a given by any means) has multiple stories about programs and hardware doing stupid things, how no one could have possibly imagined a user doing X, and so (best case) X crashes the system or (worst case) X propagates and goes unnoticed for years until the error is subtle, ingrained and permanent. Depending on the error, this could be bad. Big Data is a perfect environment for this latter result.

John C. Wright also gets in on the AI kerfuffle, referencing the Briggs post and adding his own inimitable comments.

Finally, Dust, a Youtube channel featuring science fiction short films, recently had an “AI Week” where the shorts were all based on AI themes. One film took a machine learning tool, fed it a bunch of Sci Fi classics and not so classics, and had it write a script, following the procedure used by short film competitions. And then shot the film. The results are always painful, but occasionally painfully funny. The actors should get Oscar nominations in the new Lucas Memorial Best Straight Faces When Saying Really Stupid Dialogue category:

Wet Enough for You? Philip Marlowe Edition

From the L.A. Times: Why L.A. is having such a wet winter after years of drought conditions. (Warning: they’ll let you look at their site for a while, then cut you off like a barkeep when closing time approaches.) Haven’t looked at the article yet, but I’ll fall off my chair if the answer doesn’t contain global warming/climate change.

But I have some ideas of my own. Historical data on seasonal rainfall totals for Los Angeles over the last 140+ years is readily available on the web. I took that data, and did a little light analysis.

Average seasonal rainfall in L.A. is 14.07″. 60% of the time, rainfall is below average; 40% above. Percentage of seasons with:

  • less than 75% of average rain: 32.62%
  • between 75% and 125%: 39.01%
  • over 125%: 28.37%
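If you want to reproduce this sort of light analysis yourself, here’s a minimal Python sketch of the band calculation, run on a short list of invented seasonal totals (not the actual 140-year record, which is available on the web):

```python
# Given a list of seasonal rainfall totals (inches), compute what share
# of seasons fall below 75% of average, near average, and above 125%.
# The sample totals below are invented for illustration only.

def rainfall_bands(totals):
    avg = sum(totals) / len(totals)
    n = len(totals)
    low = sum(1 for t in totals if t < 0.75 * avg)
    high = sum(1 for t in totals if t > 1.25 * avg)
    mid = n - low - high  # everything between 75% and 125% of average
    return {
        "average": avg,
        "below_75_pct": 100 * low / n,
        "75_to_125_pct": 100 * mid / n,
        "above_125_pct": 100 * high / n,
    }

seasons = [4.4, 12.1, 22.6, 9.0, 14.9, 6.1, 18.8, 13.7, 7.2, 31.0]
print(rainfall_bands(seasons))
```

Swap in the real seasonal totals and the same few lines give you the percentages quoted above.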

“Normal” rainfall covers a pretty wide range, one would reasonably suppose. Getting a lot or a little seems somewhat more likely than getting somewhere around average. This fits with my experience growing up in L.A. (18 year sample size, use with caution.)

The last 20 years look like:

[Table: the last 20 seasons – Season (July 1–June 30), Total Rainfall (inches), Variance from Average]

14 years out of 20 (70%) are under average; 6 above. Those 5 under-average years in a row stand out, as does the 9 out of 11 years under from 2005-2006 to 2015-2016. (That 22.55 inches in 2004-2005 also stands out – a very wet year by L.A. standards.)

Wow, that does look bad. So does this stretch, with 7 out of 8 under:


And this one, with 10 out of 11:


Or this, with 6 out of 7:


This last cherry-picked selection is also like the most recent years in that annual rainfall is not just under, but way under: more than 6″ under in 5 out of 6 years. In the recent sample, 5 of the last 7 years prior to this year were more than 6″ under, and one was over 5″ under.

How often does L.A. get rainfall 6″ or more under average? About 22% of the time. So, hardly unusual, and, given a big enough sample (evidently not very big), you would expect to find the sorts of patterns we see here even if every year’s rainfall were a completely independent event from the preceding year or years – which it would be foolish to assume. It would make at least as much sense to think there are big, multi-year, multi-decade, multi-century and so on cycles – cycles that would take much larger samples of seasonal rainfall to detect. And those cycles could very well interact – cycles within cycles.
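As a sanity check on that intuition, here’s a rough back-of-the-envelope simulation of my own: even if each season were an independent coin flip weighted 60/40 toward below-average (roughly what the long-term record shows), long dry streaks would still show up in almost every simulated 141-year record:

```python
# Simulate 141-year rainfall records where each season is independently
# "dry" (below average) with probability 0.6, and ask how often a streak
# of 5+ consecutive dry years appears anyway. The 60/40 split mirrors
# the L.A. record; everything else is a deliberately crude toy model.
import random

def longest_dry_run(n_years, p_dry, rng):
    longest = run = 0
    for _ in range(n_years):
        if rng.random() < p_dry:
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    return longest

rng = random.Random(0)
trials = 10_000
hits = sum(1 for _ in range(trials) if longest_dry_run(141, 0.6, rng) >= 5)
print(f"{100 * hits / trials:.0f}% of simulated 141-year records "
      "contain a dry streak of 5+ years")
```

In other words: streaks like the ones cherry-picked above prove nothing by themselves – you’d expect them even from pure chance, let alone from real multi-year cycles.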

Problem is, I’ve got 141 years of data, so I can’t say. I suspect nobody can. Given the poorly understood cycles in the oceans and sun, and the effect of the moon on the oceans and atmosphere, which it would be reasonable to assume affect weather and rainfall, we’re far from discovering the causes of the little patterns cherry picking the data might present to us. They only tell us that rainfall seems to fall into patterns, where one dry year is often followed by one or two or even four or five more dry years. And sometimes not.

L.A. also gets stretches such as this:


Not only are 7 out of 10 years wetter than average, the 3 years under average are only a little short. This would help explain why it is so often raining in Raymond Chandler stories set in L.A. – this sample of years overlaps most of his masterpieces.

It could be raining outside – hard to tell, and I don’t remember. Just work with me here, OK?

The L.A. Times sees something in this data-based Rorschach test; I see nothing much. Let’s see what the article says:

Nothing. The headline writer, editor and writer evidently don’t talk to each other, as the article as published makes no attempt to answer or even address the question implied in the headline. It’s just a glorified weather report cobbled together from interviews from over the last several months. Conclusion: things seem OK, water system wise, for now, but keep some panic on slow simmer, just in case. Something like that.

Oh, well. You win some, you lose some. That *thunk* you hear is me falling out of my chair.

A Cultivated Mind

Just kidding! I think!

Here I wrote about how I’m trying to help this admirably curious young man for whom I am RCIA sponsor on his intellectual journey. I’m no Socrates, but I do know a thing or two that this young man is not going to pick up at school, that would be helpful to him and, frankly, to the world. Any efforts to get a little educated and shine a little light into the surrounding darkness seems a good thing to me.

I figure I’ll give him a single page every week or so when I see him, with the offer to talk it over whenever he’s available. Below is the content of the second page; you can see the first in the post linked above. We started off with a description of Truth and Knowledge. I figure the idea of a cultivated mind might be good next. We’ll wrap it up with a page on the Good and one on the Beautiful, and see where it goes from there.

Any thoughts/corrections appreciated.

A Cultivated Mind

A cultivated mind can consider an idea without accepting it.

What is meant by a “cultivated mind”?

Like a cultivated field:

  • Meant for things to be planted and grown in it
  • Weeded of bad habits and bad ideas
  • Is cared for daily

A cultivated mind

  • is what a civilized and educated man strives to have.
  • is not snobby or elitist.
  • Is what is required to honestly face the world.
  • Is open to new ideas, but considers them rationally before accepting them.

How do you cultivate your mind?

Reexamine the ideas you find most attractive:

  • Have you accepted them because you like them, or because you examined them and believe them true?

Carefully review all popular ideas:

  • Have you accepted them because to reject them might make you unpopular?
  • Have you really examined them before accepting them?

Double your efforts to be fair when considering ideas you do not like:

  • Can you restate the idea in terms that people who accept it would recognize and agree with? If not, you are not able to truly consider the idea.

NOTE 1: To engage ideas, listen to and read what people who hold those ideas say, especially when you don’t like them or already disagree. Hear and understand what the idea really is before you can consider it. This takes discipline and time.

NOTE 2: This is a life-long project, always subject to revision. Guard against over-certainty; avoid exaggeration. Do not pretend to know what you do not know. Acknowledge that some things are difficult, and can only be known partially.

Follow the Dominican maxim: “Seldom affirm, never deny, always distinguish.”


“Forgive him, but as you can see, he has no brain.” “Turns out you don’t need one. Totally overrated!”


How’s the Weather? 2018/2019 Update

In a recent post here you could almost hear the disappointment in the climate scientists’ words as they recounted the terrible truth: that, despite what the models were saying would happen, snowpack in the mountains of the western U.S. had not declined at all over the last 35 years. This got me thinking about the weather, as weather over time equals climate. So I looked into the history of the Sierra snowpack. Interesting stuff.

From a September 2015 article from the LA Times

This chart accompanies a September 14th, 2015 article in the LA Times: Sierra Nevada snowpack is much worse than thought: a 500-year low.

When California Gov. Jerry Brown stood in a snowless Sierra Nevada meadow on April 1 and ordered unprecedented drought restrictions, it was the first time in 75 years that the area had lacked any sign of spring snow.

Now researchers say this year’s record-low snowpack may be far more historic — and ominous — than previously realized.

A couple of things stand out from this chart, and I would like to commend them: first, it is a very pleasant surprise to see the data sources acknowledged. From 1930 on, people took direct measurements of the snowpack. The manual method is two-fold: surveyors stick a long, hollow, calibrated pole into the snow until they hit dirt, and simply read the depth off the numbers on the side of the pole. The snow tends to stick inside the pole, which they can then weigh to see how much water is in the snow. They take these measurements in the same places on the same dates over the years, to get as close to an apples-to-apples comparison as they can. Very elegant and scientifilicious.

They also have many automated stations that measure such things in a fancy automatic way. I assume they did it the first way back in 1930, and added the fancy way over time as the tech became available. Either way, we’re looking at actual snow more or less directly.
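The weigh-the-core step above is just a density calculation. Here is a minimal sketch of that arithmetic; the function and constant names are mine, and the tube diameter in the example is hypothetical, not taken from the actual surveys:

```python
import math

WATER_DENSITY_KG_M3 = 1000.0  # density of liquid water

def snow_water_equivalent_m(core_mass_kg: float, tube_inner_diameter_m: float) -> float:
    """Depth of liquid water the snow core would melt down to.

    Weighing the snow stuck inside the tube gives its mass; dividing by
    water's density and the tube's cross-sectional area turns that mass
    into a depth of water, which is what the surveys report.
    """
    area_m2 = math.pi * (tube_inner_diameter_m / 2.0) ** 2  # tube cross-section
    water_volume_m3 = core_mass_kg / WATER_DENSITY_KG_M3    # melted volume
    return water_volume_m3 / area_m2
```

So a 1 kg core pulled through a (hypothetical) 10 cm tube works out to about 0.127 m of water, regardless of how deep or fluffy the snow itself was — which is why weighing beats measuring depth alone.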

Today’s results from the automated system. From the California Data Exchange Center.

Prior to 1930, there was no standard way of doing this, and I’d suppose that, prior to the early 1800s at the earliest, nobody really thought much about doing it. Instead, modern researchers looked at tree rings to get a ballpark idea.

I have some confidence in their proxy method simply because it passes the eye test: in that first chart, the patterns and extremes in the proxies look pretty much exactly like the patterns and extremes measured more directly over the past 85 years. But that’s just a gut feel; maybe there’s some unconscious forcing going on, some understatement of uncertainty, or some other factor making the pre-1930 estimates less like the post-1930 measurements. But it’s good solid science to own up to the different nature of the numbers. We’re not doing an apples-to-apples comparison, even if it looks pretty reasonable.

The second thing to commend the Times on: they included this chart, even though it in fact does not support the panic-mongering in the headline. It would have been very easy to leave it out, and avoid the admittedly small chance readers might notice that, while the claim that the 2015 snowpack was the lowest in 500 years might conceivably be true, similarly low snowpacks have been a pretty regular occurrence over those same 500 years. Further, they might notice that those very low years have been soon followed by some really high years, without exception.

Ominous, we are told. What did happen? The 2015-2016 snowpack was around the average, 2016-2017 was near record deep, and 2017-2018 was also around average. So far, the 2018-2019 season, as the chart from the automated system shows, is at 128% of the season-to-date average. What the chart doesn’t show: a huge storm is rolling in later this week, forecast to drop 5 to 8 feet of additional snow. This should put us well above the April 1 average — and April 1 is around the usual date of maximum snowpack — with 7 more weeks still to go. Even without additional snow, this will be a good year. If we get a few more storms between now and April 1, it could be a very good year.

And I will predict, with high confidence, that, over the next 10 years, we’ll have one or two or maybe even 3 years well below average. Because, lacking a cause to change it, that’s been the pattern for centuries.

Just as the climate researchers mentioned in the previous post were disappointed that Nature failed to comply with their models, the panic-mongering of the Times 3.5 years ago has also proven inaccurate. In both cases, without even looking it up, we know what kind of answer we will be given: this is an inexplicable aberration! It will get hotter and drier! Eventually! Or it won’t, for reasons, none of which shall entail admitting our models are wrong.

It’s a truism in weather forecasting that simply predicting tomorrow’s weather will be a lot like today’s is a really accurate method. If those researchers from the last post and the Times had simply looked at their own data and predicted future snowpacks would be a lot like past ones, they’d have been pretty accurate, too.

Still waiting for the next mega-storm season, like 1861-1862. I should hope it never happens, as it would wipe out much of California’s water infrastructure and flood out millions of people. But, if it’s going to happen anyway, I’d just as soon get to see it. Or is that too morbid?

Great Flood of 1862. Via Wikipedia.