From Clarissa’s blog. She has been posting helpful links regarding the current unpleasantness. I merely note that one does not have to have the sterling credentials of this Swedish doctor to notice, upon little more than inspection, that most of his points are valid.
UK policy on lockdown, and that of other European countries, is not evidence-based. The correct policy is to protect the old and the frail only. This will …
Every death is a tragedy, at least on some level. And everybody dies.
Here, for this analysis, we will recognize upfront that tragic deaths have occurred due to COVID 19. Nobody sane is happy about that, and I and everyone of good will have nothing but sympathy for those suffering such losses.
Having acknowledged that, we will set aside our perfectly healthy and human emotional reactions to take a look at what should reasonably be done.
Death is a petty, fickle thing. The most innocent activities can get you killed. For example, every year:
about 3,500 Americans drown in their backyard pools.
about 2,700 Americans die in motorcycle accidents.
about 36,000 Americans die of falls.
And on and on. About 40,000 Americans die every year in automobile accidents, not counting pedestrians who get run over; commercial fishing and raising cattle are two other comparatively dangerous occupations, with disproportionately high accidental death rates.
We all know this, or at least used to. Used to be, we accepted reasonable risk and the inevitable tragedies as part of the price of freedom. What a ‘reasonable’ level of risk is varies, but, generally speaking, if we felt (it’s mostly feelings – we’re people, after all, not computers) the risk was either unavoidable or at least partly something we can control, we were generally good with it. Thus, backyard pools can be fenced, children can be watched; motorcyclists can be very careful and wear helmets; we throw a non-slip rubber mat in the shower.
Something else we used to understand: bad stuff happens. The neighbor kid will climb pool fences, and sometimes drown; the careful, law-abiding motorcyclist will die when somebody runs a red light or makes a crazy lane change; and grandma is going to die of something, and falling and whacking her head is a possibility no matter what we do.
In moments of clarity and humility, we accept that all we can do is take prudent precautions, and that bad stuff will happen no matter what steps we take. We recognize we’d end up living lives of paranoia and slavery if we couldn’t accept a reasonable level of risk.
Now? Check this statement out:
“The truth is, there is no such thing as an accident. We know what to do to save lives, but as a nation, we have not consistently prioritized safety at work, at home and on the road.”
“…no such thing as an accident.” I don’t know whether to prescribe some basic philosophy, psychological counseling, or serotonin reuptake inhibitors to treat such an insane misapprehension of reality. On a logical basis, that claim is sheer fantasy. As propaganda, however, it’s genius. Under this rubric, LITERALLY EVERY BAD THING THAT HAPPENS IS SOMEBODY’S FAULT. “We” know how to prevent every bad thing. If every bad thing has not been prevented, some other “we” – certainly not us! – has failed! Implied: “we” – first “we” again – need MORE POWER in order to properly “prioritize safety”.
The National Safety Council is a hoary institution, dating back to the Wilson administration, a time when our betters were pretty public about their conviction that they could solve all problems if we little people would just let them, so it would be tempting to think this view represents some sort of outdated extreme. Alas, this is exactly the view expressed by Governor Cuomo and others to justify anti-COVID 19 measures that would have been considered crazy a generation (or, perhaps, one administration) ago.
And measures not remotely justified by the information we had available at each point of this fiasco.
How did we get here? A big part was played by models that predicted millions of dead and a collapse of health care systems under the strain of tens of millions of critically sick. Those claims were made not just on extremely unlikely worst-case scenario assumptions, but in the face of known contradictory evidence. Yet our well-schooled, by which I mean functionally innumerate and alogical, population continues to fall for it. Let’s take a look at models at a high level.
The first thing a pro, even so meager a one as myself, wants to know when constructing a model is: what does the data look like? Is it clean and consistent? Is it reported in an orderly, reliable manner? Is it clearly defined? Because your model – your predictions – will never be any more accurate or reliable than the data they are built upon. This is a truism: garbage in, garbage out.
So, something clear to anyone who cared to look from Day 1 of this mess: the data is garbage. Teasing anything meaningful out of this inconsistent, incoherent pile of junk was always going to be a challenge, and would always be highly uncertain.
Simple questions a pro would ask:
What constitutes a ‘death’?
What constitutes a ‘case’?
Are these defined and reported consistently from place to place and over time?
And slightly – and only slightly – more sophisticated:
Anybody have an interest in the numbers going one way or the other?
Meaning: is it in anyone’s interest to over or under report any of this?
Let’s take it one by one:
Death: Completely inconsistent or undefined from place to place and over time. In some places, such as Britain and New York, a ‘death’ for COVID 19 purposes means: person was determined to have COVID 19 at the time of death. ‘Determined’ means that the person who fills out the death certificate puts COVID 19 anywhere on it. It does not mean: the deceased tested positive. It could mean: the deceased had symptoms consistent with COVID 19, for example, a fever and a cough. Sure, that’s also consistent with the flu or a common cold, but whatever. ‘Death’ as used in the reports, at least from some of the places reporting, literally means that an asymptomatic person who tests positive and then gets run over by a lorry on his way home counts as a COVID 19 death.
That may seem far-fetched, but this next example certainly isn’t: yesterday on Twitter, people were mourning the death of a man well known in certain geeky circles. His death was classified, and people were referring to it, as a COVID 19 death – that’s what the official report said.
He was 79, almost 80, and had been in a nursing home for the last few months after having suffered a massive stroke. Now, I gather few healthy people spend much time in nursing homes. Unfortunately, I have, because my father and two older sisters spent time in nursing homes prior to their deaths, and I have had a number of other occasions over the years to visit a variety of such facilities.
The majority of people in these places were put there to die. If they are very lucky, some family member will get them released into hospice care so that they don’t have to die alone in a place stinking of urine. But few are there to get better; at least, if they are there to recover, they are not generally there for long.
I’d be happy to be contradicted by medical pros or somebody with experience working in nursing homes, but here’s what I suspect: if you’re pushing 80 and have a massive stroke and don’t bounce back within 2 months, the only way you’re leaving that nursing home is feet first. Barring a miracle, of course.
No rational person would count that death as due to COVID 19. He died in the aftermath of a massive stroke. The stroke killed him. The most COVID 19 might have done was speed up his death some. But the flu or a cold would have done as much, most likely.
Now put this anecdote together with some other bits of information that have been generally available from the beginning: that people who were old, or with preexisting medical conditions, or both, were disproportionately killed by COVID 19. That COVID 19 broke (a small portion of) the Italian medical system when it ripped through nursing homes in Lombardy. That the average age of death was 81 years old.
The ability of even the most vigorous 80 year old to fight off and recover from illness is seriously impaired in almost every case. Now, start with what in almost every other case is a very minor illness – very few people who aren’t old or already sick so much as develop serious symptoms – and you’re in for a lot of deaths, provided you count every old or sick person who catches COVID 19 and dies from whatever causes as a death FROM COVID 19.
If I were building this model, as a first-pass data clean-up, I’d exclude ALL deaths of people over, say, 65 and ALL deaths of people with preexisting conditions. THEN, if I could figure out a reasonable way to do it, add back in some small fraction of those deaths on the assumption that some, at least, would have survived, say, a year and a day if they had not caught COVID 19.
That would be the professional, responsible thing to do. The number of deaths, after such a merely prudent clean-up, would be much smaller than currently reported. Using the raw data would be irresponsible, as it clearly misrepresents reality.
Let’s take a look at that. Back in March, when we were losing our minds:
That likely explains why although older Americans represented 31% of the cases, they accounted for 45% of hospitalizations, 53% of ICU admissions, and 80% of deaths, the CDC reported.
Ninety-five percent of New York City’s almost 200 deaths from the new coronavirus had underlying health conditions, though almost half were under the age of 75, according to data published by the city’s health department on Tuesday.
So, we’ve known for weeks now that only 5% of the deaths associated with COVID 19 were otherwise healthy people, and that 80% were people over 65. But rather than take prudent precautions around the sick and elderly, we shut down the economy.
If I were building this model, I’d use that 5% – that’s 1,200 deaths at the moment – as a baseline number, then add back to it some percentage – half? I’d need to do some research, but that seems generous – of the other deaths, under the assumption that those were people who, despite being abandoned to die in a nursing home, might still make it a year and a day if they hadn’t caught the virus. I’m being only slightly flippant here – to have sane numbers, you’d need some approach that filters out people who died of other causes while infected (or suspected of being infected) with COVID 19, to better reflect the real risks. If you’re dying anyway, COVID 19 isn’t really a risk, is it?
So, maybe I start with around 12,600 deaths in my model – the 1,200 baseline plus half of the remaining 22,800. Seems high.
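The back-of-the-envelope adjustment above can be sketched in a few lines. To be clear, everything here is illustrative: the 5% “otherwise healthy” share and the one-half add-back are the post’s guesses, not official figures, and the function name is my own.

```python
def adjusted_deaths(reported_deaths: float,
                    otherwise_healthy_share: float = 0.05,
                    add_back_fraction: float = 0.5) -> float:
    """First-pass clean-up of a raw COVID 19 death count.

    Start from deaths of otherwise healthy people (the baseline), then
    add back a fraction of the remaining deaths (the old and those with
    preexisting conditions) on the assumption that some of them would
    otherwise have lived at least a year and a day.
    """
    baseline = reported_deaths * otherwise_healthy_share
    add_back = reported_deaths * (1 - otherwise_healthy_share) * add_back_fraction
    return baseline + add_back

# With ~24,000 reported deaths (so the 5% baseline is the 1,200 above):
print(adjusted_deaths(24_000))  # 1,200 + 11,400 = 12,600.0
```

Note that the answer is driven almost entirely by the two assumed parameters, which is exactly the point: with data this dirty, the model’s output is a restatement of its assumptions.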
Next, let’s look at the Chinese. I don’t know how they define a COVID 19 death, and wouldn’t trust them to tell the truth anyway, but: if the Chinese, for their own selfish reasons, decided to count only deaths where COVID 19 seriously contributed to an untimely demise, excluding from the count people who were already seriously sick before they caught the virus, their reported numbers, which seem so low, might actually be accurate. At least, more accurate than the ridiculous numbers coming out of the West.
And – here’s the main point – if, and it’s certainly a big ‘if,’ the Chinese were in fact reporting in this manner, they would be reporting COVID 19 deaths in a way that’s much more like what a reasonable, normal person would consider a COVID 19 death.
Now consider the ‘consistent over time’ issue. New York changed to the ‘count every death where COVID 19 shows up anywhere on the death cert’ method a few days ago. They got what must have been a gratifying uptick in deaths – gratifying, because why else would you do such a ridiculous thing? In this current panic, was there any reason to think deaths in New York were being underreported? Really?
Imagine if you fanned a panic and things didn’t turn out too bad, so that, come election time, your opponents could point out that your panic mongering was directly responsible for millions of lost jobs and falling incomes and destroyed businesses. Better do something about that. Today, New York ‘found’ some more deaths:
New York City today has reported 3,778 additional deaths that have occurred since March 11 and have been classified as “probable,” defined as follows: “decedent […] had no known positive laboratory test for SARS-CoV-2 (COVID-19) but the death certificate lists as a cause of death “COVID-19” or an equivalent” [source]. We will add these to the New York State total as soon as it is determined whether the historical distribution can be obtained
Wow, nick of time. First, you change the requirement so that every death with COVID 19 anywhere on the death certificate is classified as a COVID 19 death, then you mine the old death certificates to find any where they listed COVID 19 but didn’t list it as a significant cause, so you can then throw them into the count. That this happened just as the totals for New York and the US were turning downward is one of those amazing coincidences that just keep popping up.
The US in general changed its method of counting COVID 19 deaths a week or so ago, to be more ‘generous.’ US counts are therefore inconsistent over time. Even this ignores the enormous amount of discretion local officials – doctors and coroners – have in what they put on a death certificate, especially in cases where cause of death is complicated, as it always is when preexisting conditions are involved. Even under the same rules, what counts as a COVID 19 death in Peoria may not be what counts as one in Santa Barbara, nor one on Tuesday versus one on Saturday, even in the same location. Lots of uncertainty here.
Finally, add in the political, professional, and social pressures. We’ve just burned a couple of trillion dollars in economic activity, gotten millions laid off, and many millions more are seeing income reductions – you want to be the guy who calls ‘Ooopsie!’ on all this? Or do you double down? Do you want to be the one doctor who refuses to play games, or are you just going to follow orders and put what they tell you to put on the death cert? New York and the recent redefinition of COVID 19 deaths shows us exactly how that’s going to work.
Similar problems exist everywhere. France, for example, had a spike in deaths when they ‘discovered’ a whole bunch of people who died in nursing homes had not been counted. France, by the way, is experiencing a ton of social unrest at the moment; Macron would like it to go away. Maybe, just maybe, he might like this whole thing to drag on a bit?
As of this moment, 125K deaths from COVID 19 have been reported worldwide. There is no reason to believe this number is remotely accurate by any reasonable definition of ‘death caused by COVID 19’ and every reason to think it wildly overstates the number of such deaths. This problem has been obvious from Day 1.
Next, we have similar issues with Cases, which I will not belabor again here. The real issue with cases is that people still insist on thinking cases = number of infected people. With rare exceptions, case counts will NOT INCLUDE asymptomatic people, for the simple reason that, so far, almost all tests are of people suspected of having the virus. Somebody with no symptoms is most likely not getting tested. In the few general semi-random tests done so far, around 50% of infections are asymptomatic. Further, 96% of cases show mild symptoms. Again, many people with mild symptoms are likely not getting tested.
Again, all I’ve done here is ask the basic questions any competent model builder would ask about the data, and discovered that cases must significantly understate the number of infections in the world. Yet people who should know better keep insisting cases and people infected are the same. For example:
Must harp on this one more time:
Case Fatality Rate (CFR) = number of people identified as COVID 19 deaths on their death certificates / number of cases.
Fatality Rate = number of people who died from COVID 19 (however defined) / number of people infected with COVID 19.
Unless we test every human being on the planet, we will never know the true number of people infected. It will always be an educated guess. All we can reasonably know is that it must be more, and probably a lot more, than the number of cases.
Now let’s put these two things together:
Since the number of deaths has been demonstrably overstated, AND the number of people infected is demonstrably understated by the number of cases, THEN: the CFR OVERSTATES how deadly COVID 19 is. Based on my semi-educated proposed adjustments, outlined here but explained in more detail in earlier posts, I’d guess that factor is at least 4. If your CFR is 2.5%, then the logic here shows the real fatality rate – the real risk someone who catches COVID 19 will die of it – can’t be more than 2.5%/4, or 0.625%, EVEN IF we make the completely insane assumption that ALL deaths WITH COVID 19 are in fact deaths FROM COVID 19. If my guesses are in the ballpark, cut those numbers in half again.
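The arithmetic in that paragraph is simple enough to write down explicitly. Again, this is a sketch under the post’s own assumptions: the 2.5% CFR and the factor of at least 4 are illustrative guesses, and the function name is mine.

```python
def implied_fatality_rate(cfr: float, overstatement_factor: float) -> float:
    """Upper bound on the true fatality rate.

    cfr: reported case fatality rate (deaths-on-certificates / cases).
    overstatement_factor: the combined factor by which deaths are
    overstated and infections understated by cases. Dividing the CFR
    by it gives a ceiling on the real risk of dying if infected.
    """
    return cfr / overstatement_factor

ceiling = implied_fatality_rate(0.025, 4)   # 2.5% CFR, factor of 4
print(f"{ceiling:.3%}")                     # prints "0.625%"
print(f"{ceiling / 2:.4%}")                 # halved again, per the last sentence
```

The point of writing it out is to show how few moving parts there are: the ceiling scales linearly with the CFR and inversely with the overstatement factor, so any argument about the real risk is really an argument about that factor.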
These numbers ignore the age/health component to risk. If you’re healthy and not old, your risk is effectively 0 – too small to mean anything in practice. Effectively, you’d be better off getting some exercise and maybe cutting some calories than doing anything at all to avoid COVID 19. Even if you’re old, if you are otherwise healthy, your risks are tiny.
It also needs to be said that this sudden concern for the sick and elderly is ironic, by which I mean, rank hypocrisy. We routinely warehouse the sick and elderly under the care of minimum wage workers so that they are no bother to us while we wait for them to die. Again, ever been to a nursing home? For every resident who gets daily visits from loved ones, there are probably 4 who rarely, if ever, get a visitor. Ask somebody who works in one of these places. We abandon them to die, and now we’re all concerned about them dying? Give me a break. (Note: we have my 82 year old MIL living with us. She’d still be in a nursing home if we hadn’t sprung her. There are cases where this is not possible, where more care is needed than a normal family can provide. But there are plenty of cases where it would be possible, if anybody cared to do it. And nothing prevents frequent visits or at least phone calls.)
So, what would simple prudence dictate as proper actions to take in order to protect the vulnerable from COVID 19?
If you are sick or have been exposed to COVID 19, stay away from the sick and elderly (duh).
Similarly, if you are sick or elderly, you might want to consider avoiding crowds for a while. I say ‘consider’ because lack of human contact can damage people too, and loneliness is a big problem for many elderly people. If gramps wants to accept the risk and go to the church potluck to be with his friends, who are we to say no?
Wash your hands.
I leave the proper steps to best care of hospitalized and institutionalized people to the pros. The rest of us should get back to work.
Start with building a smallish but beautiful chapel within the Church’s traditional architectural language.
Assign 2 canons to make sure that the liturgy is celebrated beautifully and consistently, as it has been best celebrated for centuries:
Then build a multipurpose building nearby, but not too nearby. (Building A should be clearly distinguishable from Building B.) This building will serve as a base for all the Church’s activities that flow from and are directed toward the Eucharist those two guys up above are making sure gets reverently celebrated.
Then staff this facility. Note the order is very important, as it carries and communicates the truth that the Church is commissioned with spreading: that the Jesus Christ we commemorate, Who is among us in the most powerful and direct way in the Eucharist, has died and risen that we might be saved. AND that we therefore must love one another as we love ourselves.
Of course I’m ignoring a bunch of stuff here, such as how much more effort it takes to set up something like this instead of just sending missionaries out to celebrate Mass on colorful native blankets spread on Mother Earth, and that the people being proselytized will not (at first) understand what is going on, and that if this were put into practice, many fewer (at first) missionary churches would be established. And I’m rejecting outright the idea that the message – of God’s sacrificial Love and our need for salvation – must be in any material way shaped for the particular audience. I’m making the radical assumption that, people being people, and all of us needing saving, this message doesn’t really need to be dressed up in local garb in order to be digestible. Instead, I’m recognizing up front that the Way is weird and foreign and potentially off-putting no matter who you are. I’m rejecting the idea we’re going to go easy on it at first, make it seem just like what people are used to, only to spring the full horror/brilliance/dazzling Love on those same people at some future point. (Right? We’re planning on doing that at some point?)
Anyway, I’m just a nobody who hasn’t done anything, let alone been a missionary. I don’t even really know what is being done (except that mass on a blanket thing – got that from a missionary order’s magazine). Nonetheless, I can’t get away from what I expect would be the reaction to all this of Paul, the greatest missionary of all time and the man who tried to cut through the nonsense of the 1st century by declaring: “I preach Christ, and Him crucified!”
This week, 7 years ago, my parish informed us that a boy walking on Crossroads was hit by a car and passed away early that morning. I was 12 and had known what Crossroads was for as long as I could remember because my parish in Northern Virginia invites walkers to come through and speak at the end of their walks. I remember reading about Andrew and being completely shocked.
This year, I decided to walk with Crossroads, and wow, God’s timing is incredible.
On the first weekend of our walk, way back in San Francisco, at the very first Mass I spoke at, this woman came up to me after and told me she knew someone who did the walk once. I talked to her for a while, and she turned out to be Andrew’s aunt. She said her family was very at peace with what had happened and talking with her was really inspiring and encouraging to me.
On Sunday, I was randomly assigned to speak at St Margaret Mary Alacoque Parish in St Louis, MO. Many parishioners came up to me after and told me Andrew had spoken at this same parish just a couple days before his incident several years ago. They were exceedingly sorrowful, and I had many fascinating conversations with them about Andrew. I told them about how his uncle had walked the rest of the summer in his honor and how the rest of the team that year did end up finishing the walk together. It was really moving, and I was struck by how in 2 days or 2 weeks, or at any moment, something like that could happen to us, but are we really prepared? Are we spiritually prepared?
On Thursday, I was on shift walking, and while finishing the final prayers of a rosary with a teammate, we happened to walk right up to the site of the incident, finding the cross planted for Andrew. It was a chilling experience, kneeling and praying in the middle of the road on the median, while cars drove past all around us. My heart was pounding, and I had goosebumps just thinking about Andrew and how he was killed at that very spot while praying the rosary for the unborn, who do not get a chance to live at all. 7 years later, we walk in his same footsteps, and steadfastly continue to pray for an end to abortion.
I am so grateful I was given a chance to live, and for this chance now to witness the gospel of life to others.
Thomas More College of Liberal Arts has a lovely campus in Merrimack, New Hampshire, next to the comparatively larger Nashua. The weather cooperated, as the sunny 70F low humidity day is about the best they ever get in New England.
They have a lovely but tiny chapel, so they set up a tent for Mass:
The college requires graduating seniors to make a 5-minute presentation on their thesis before the parents, in a ceremony held at the Mansion, a large 114-year-old building a couple miles from campus:
With 28 graduates divided into 2 groups, this didn’t take too long, and we were able to turn to socializing and refreshments.
Bragging break: our son got honors for his thesis defense at TAC; the college president at TMC, unsolicited, told us our daughter’s paper was one of only 2 he’d really liked in his decade-long tenure as president. They did well.
Sunday, we attended Mass at St. Patrick’s – pics in the last post – before heading off to the kids’ uncle’s house (complete with aunt and 4 cousins). On the way, we stopped in Northfield, MA, to visit the new TAC East campus. Wow.
The college is renovating some of the buildings, especially the chapel, which, having been built by Protestant Evangelicals, had no center aisle for processions. Overall, most of the buildings are beautiful, the grounds are very striking, just a lovely place. What a blessing!
Our son will be a prefect there next year, meaning he lives in the dorms and hangs with the students, in an effort to help seed the culture which TAC has spent almost 50 years developing on the west coast. He also will be a manager in the kitchen, which means supervising students, mostly, but also doing some cooking. He’s excited. He starts in 6 days.
Daughter soon heads off to Israel for a visit, then back home for a few weeks – then off to Africa as a lay missionary for a year! Yikes! On the plus side, older daughter is moving back to northern California from L.A., so we may see more of her, which is very nice. Down to one 15 year old child, and he’s making noise about doing college early. Kids these days.
So packed house at the moment – we also have another guest – soon to be largely empty. Prayers for the safety and success of our kids would be much appreciated.
One thing a classic liberal education is supposed to do for you is make you suspicious of ideas you find emotionally attractive. Like the brutal honesty demanded by science, it is just assumed to rub off on students who work their way through all those tough classic texts. Just about every freshman finds Plato attractive. Like the young men who followed Socrates around just to see him lightly eviscerate some pompous fool, we thrilled to the discovery that pompous fools could be eviscerated, and craved more. Then we run into Aristotle, and don’t like it much, because he, effectively, says: enough with the fun and games, time to stand your ground and say what you mean. Perhaps some of us get the idea that Socrates would have met his match, or more, in Aristotle (although I suspect they would have gotten along pretty well while having some doozies of arguments. Socrates must have been bored out of his skull with the Ions and Menos of the world.)
Then, as you move on through the list, one precious idea after another gets beat up. You think that you’ve reached the pinnacle of sophistication as an 18 year old who has learned that the only thing he knows is that he knows nothing, only to have that self-refuting notion beat up by Aristotle’s moderate realism. Then, perhaps, you see how Aristotelian metaphysics and epistemology lead to places you might not want to go, making Descartes very appealing. But Descartes leads to Hume, Berkeley, and, eventually, Kant, while Thomas leads to science. So now, maybe, Descartes is less appealing, and you take another look at Aristotle…
Thus, by a million paths, the serious student learns to take extra care about accepting too readily ideas that he finds attractive, because he finds them attractive.
When I read Alice Miller‘s books 30+ years ago, I found her ideas very attractive, even though her Freudian approach was seriously off-putting. I like to say that Miller was a fallen-away Freudian, but had not fallen away nearly far enough. What made her assertions more acceptable to me was how well they fit with evolutionary theory. On the fly as I read her books, I would substitute arguments from natural selection for hers, the unholy offspring of Freud and Rousseau.
Brutal honesty moment: in other words, I back-filled psychological theories I found emotionally appealing with evolutionary just-so stories. I get it. I suppose my purpose in writing this out, apart from trying to make it as clear as possible to myself, is to invite criticism.
What are these theories? I’ve mentioned them before, but never in great detail. Here, I’m paraphrasing them based on 30 year old memories and replacing Freudian turns of phrase with Darwinian language. These start out as truisms (I should hope) but turn dark:
For their very survival, children need to be part of a family/tribe (Extended family – I’m just going to use ‘tribe’ from here on out). In our evolutionary environment, no children lived to reproduce outside of a tribe. Therefore, intense selection pressure has been applied to children in favor of group membership and against running off or doing anything that might get them excluded. (1)
As sophisticated social mammals, children by instinct incorporate whatever behaviors are required for tribal membership into their base understanding of the world as foundational assumptions. (This is nothing more than saying ‘tribalism’ is a base state for humans and is pre-rational). Kids don’t think about these requirements (much), they just are.
We see it in the ‘attachment-promoting behaviors’ of babies and toddlers before they are even aware of what they’re doing. As they grow, their behaviors become more complex and more specific to their particular environment. In this, people are only the most sophisticated among animals – your cat and dog do this as well.
All well and good, and I hope not too controversial. It should be noted that the reciprocal activity on the part of the adults – nurturing the tribe so that the child might survive – must also be a part of any environment of evolutionary adaptation. So parents and relatives – the tribe – can be expected to behave in such a way as to promote the survival and integration into the tribe of its children. That’s the model that seems to have been developed and to have worked over the last half a million years or so, at least. There’s nothing necessarily nice or pretty about it – it’s just what works.
But what happens when, as in the modern world for the last couple hundred years in many places, many people survive despite having no tribe in the evolutionary sense? What happens when the brutal culling mechanisms of Darwinian survival get put on hold? Whatever else may happen, it is now possible on a scale and to a degree never known before for children to be neglected, abused, and traumatized – and still live, and perhaps even still reproduce.
Children who are neglected, abused and otherwise traumatized will, through the all but inexorable drive of instinct, incorporate their neglect, abuse and trauma into their pre-rational view of the world. Miller, in her decades of work as a psychoanalyst, noted a remarkable ability of her patients to excuse, ignore and explain away the objectively horrible things done to them – which is what one would expect, under the evolutionary explanation above. Aside: this, at least, seems to be obviously true from just routine interactions with people.
So we have a world increasingly filled with damaged children of all ages who, for basic survival reasons, have accepted their mistreatment at the hands of those who were supposed to love them, rationalized it, and who are highly motivated to accept it as part of their tribal membership fees.
It gets worse: as part of the emotional mechanisms that ‘worked’ insofar as they did in fact survive into adulthood, their experiences and coping mechanisms now become the template for how to raise any children they might have. Thus, Miller observed the pattern where someone who had been sexually abused as a child, even if they were not themselves an abuser, would routinely put their children into situations where they were likely to be abused. To do otherwise would be to confront the careful structure that allowed the parent to survive in the first place. Very painful and disorienting.
This is expressed in the title of one of her books: Thou Shalt Not Be Aware. To acknowledge one’s own mistreatment enough to protect one’s own child requires reopening some deep and carefully scarred over wounds. Rather than do that, we readily subject our kids to what we experienced, no matter how horrible.
Miller says that a sympathetic witness, someone who understood the trauma and abuse on some level and could tell the child that it wasn’t right, was all but essential to having any hope for healing. That witness provided a counter to all the stories the kid would otherwise make up in order to keep his membership in the tribe: that daddy didn’t mean it, that momma does really care, that what uncle did wasn’t so bad, and so on – all the little myths one runs into whenever one is drawn into other people’s dramas. Lacking such a witness, it seemed to Miller all but impossible to get past all the barricades built up by the child.
So, there you have it: I see – I think, that’s the question – people reenacting in their child’s life whatever it was that traumatized them as children: people who were abandoned at 15 abandon their own kids as teens; children of divorce get divorced; sexually abused kids become libertines and expose their own kids to that life; and so on in a million ways.
There’s more, but that’s the general outline. I’m not just saying that miserable childhoods tend to make for miserable adults. I’m saying that miserable childhoods tend to all but compel people to make their own children miserable in the same way.
Anyway, make any sense? I readily acknowledge that Miller is a loon – I read most if not all of her books, and she gets into speculation that’s little better than palm reading in many places. And, as mentioned, even though she became one of Freud’s harshest critics, she still thought and spoke like a Freudian. Am I just experiencing confirmation bias when I seem to see this inflicting of one’s childhood trauma on one’s own children everywhere I look, or is it real?
And, of course, tribes can’t survive without children, either, so, at least by nature, tribes care about their children as passionately as children yearn to belong. Note that this doesn’t imply any sort of lovey-dovey niceness: the ever-popular Yanomami tribesmen raise their sons to be good little homicidal sociopaths, because that approach has been proven to work. Similarly, their daughters are raised to seek the most murderous sociopaths as mates.
And then expanded, by design, to school, with its artificial and arbitrary tribes of classrooms and grades. But Miller doesn’t go there, as far as I can recall.
The Resurrection, by Piero della Francesca, a fresco from the 1460s found in the Palazzo della Residenza, the City Hall, as it were, in the town of Sansepolcro, Tuscany, Italy. As is so often the case, in person it is far more impressive and moving than any reproduction. This fresco has made a strong impression on many people, including many non-Catholics and even atheists. Huxley wrote about it. This is Christ in triumph, but also Christ in judgement, which makes it an image well-suited to our current crazy years. It was commissioned not for a church or chapel, but for the place where city government was conducted. The village elders would pray before it prior to conducting business, to remind themselves that they would be judged by Christ, who died and rose that they might be saved. Look at the face della Francesca gave to Christ: a merciful yet just judge.
This fresco shows an amazing degree of sophistication: two vanishing points, one for the soldiers, one for Christ, so that the eye can contemplate them both separately and together. The near-hyperrealism of the guards on the one hand stands against the utter disregard for gravity and anatomy on the other. There are three legs between four guards; the guard in the front right is leaning on air; more subtly, the guards in the middle have assumed anatomically and physically impossible positions. While there are technical accounts of why this is so, the simple reason is that della Francesca was painting the Resurrection, not a bunch of mercenary guards. Stuffing in the right number of legs and giving them all proper postures and things to lean on just didn’t figure into it.
This masterpiece narrowly survived destruction in World War II when British artillery officer Tony Clarke defied orders to shell the town. He had never seen the fresco, but had read Huxley’s description, and had seen the destruction of Monte Cassino. He didn’t want to go down in history as the dude who wantonly and needlessly destroyed a priceless work of art. The grateful villagers (Sansepolcro is hardly more than a village even today) named a street after Clarke. (1)
The della Francesca brought the image below to mind: Gerard David’s image of the Judgement of Cambyses, commissioned in 1488 for the City Hall of Bruges. In this diptych, we see on the left Cyrus’s son Cambyses condemning the corrupt judge Sisamnes, who on the right is shown suffering his sentence: being flayed alive. His skin was then used to cover the judgement seat, occupied next by his successor: his own son.
Dante puts traitors to benefactors in the lowest circle of Hell. This would include those who are entrusted with the public good and abuse that trust. On this most holy of days, we rejoice that a good, merciful and just judge awaits us, but are warned to not presume on his mercy. We pray for those who reject His authority, and fervently throw ourselves on his mercy and beg mercy on everyone we know.
Mercy is there for the asking, but God is too polite to force His mercy on us if we won’t ask for it.
One of the weird things that came out of the two world wars: in my (very light – I welcome correction here) reading, wanton destruction for the sake of revenge was just as much, if not more, prevalent on the British side than the Nazi side (one gets the impression the Americans were merely clueless, but I’m hardly an authority). Understandable, since the Germans bombed London to terrorize the English. Given the horror of those attacks and the British character, that did not go over well. I think many in England would have reduced all of Germany to ash if they could. They came close in some places. Meanwhile, there are numerous stories about Nazi officers doing what they could to prevent wanton destruction: not burning Paris against orders, not putting anything important enough to destroy in the ancient heart of Florence, for two examples. Not defending Nazism (isn’t it insane that I think it necessary to state that?), but individual Nazis were just human beings like us, and could behave as evilly or beautifully as anybody else. We prevent ourselves from learning from this cautionary tale by blanket vilification: those people were not like us! They were evil! Nope, they were for the most part just regular folks who fell to social pressures, a misplaced sense of duty and an eagerness to believe a story whereby their troubles were all somebody else’s fault. Kind of exactly like most people today.
Who gets to say what’s in textbooks? First, let’s consider a fairly recent and, I think, representative example. Richard Feynman was once on a textbook committee here in California. (Aside: the link above came up when I googled for the Feynman essay. The commentary at that site is also worth perusing.) While his experiences date back 50 years, the situation has only become worse. So, who decides what goes in them?
You see, the state had a law that all of the schoolbooks used by all of the kids in all of the public schools have to be chosen by the State Board of Education, so they have a committee to look over the books and to give them advice on which books to take.
Feynman getting a say in what’s in science and math books? Famous, brilliant, Nobel-winning teacher? Sounds about right. Buuuut:
Immediately I began getting letters and telephone calls from schoolbook publishers. They said things like, “We’re very glad to hear you’re on the committee because we really wanted a scientific guy . . .” and “It’s wonderful to have a scientist on the committee, because our books are scientifically oriented . . .” But they also said things like, “We’d like to explain to you what our book is about . . .” and “We’ll be very glad to help you in any way we can to judge our books . . .” That seemed to me kind of crazy.
A nice lady who’d been on the committee before told him how it worked:
They would get a relatively large number of copies of each book and would give them to various teachers and administrators in their district. Then they would get reports back on what these people thought about the books.
But this is Feynman we’re talking about! So:
Since I didn’t know a lot of teachers or administrators, and since I felt that I could, by reading the books myself, make up my mind as to how they looked to me, I chose to read all the books myself. . . .
If you know anything about government committees, you may be able to guess what happens. Feynman is the ONLY person on the committee who read any of the books. In one case, there was a book being rated even though it was blank:
We came to a certain book, part of a set of three supplementary books published by the same company, and they asked me what I thought about it.
I said, “The book depository didn’t send me that book, but the other two were nice.”
Someone tried repeating the question: “What do you think about that book?”
“I said they didn’t send me that one, so I don’t have any judgment on it.”
The man from the book depository was there, and he said, “Excuse me; I can explain that. I didn’t send it to you because that book hadn’t been completed yet. There’s a rule that you have to have every entry in by a certain time, and the publisher was a few days late with it. So it was sent to us with just the covers, and it’s blank in between. The company sent a note excusing themselves and hoping they could have their set of three books considered, even though the third one would be late.”
It turned out that the blank book had a rating by some of the other members! They couldn’t believe it was blank, because [the book] had a rating. In fact, the rating for the missing book was a little bit higher than for the two others. The fact that there was nothing in the book had nothing to do with the rating.
Read the whole thing, if you have the stomach for it. Feynman noted many egregious errors and obvious failings in the books that did have stuff in them, so much so that one is led to wonder if the blank book would not have been an improvement (hint: yes). The rest of the essay is about the corruption of the selection process, where the publishers wine and dine the committee members to get their support, but given the nature and quality of the books, that qualifies as a secondary scandal.
So, to answer the question: who gets to decide what goes in textbooks? It’s ‘educators’ with the ‘help’ of politicians. In the above essay, the recommendations of the committee are largely overturned by the politicians allocating the state budget. The committee was instructed not to look at cost, so they couldn’t recommend a set of books within any budget, or have a hierarchy of which books to cut first if the money wasn’t there. Didn’t matter anyway, as the Education Department simply did what they wanted once a budget was determined. Feynman, a legendary teacher himself, is there just for cover – it’s not like he gets to decide, or even have much of a say, despite his expertise.
Two things should be obvious from this story: first, educators, a class of people that did not exist until about 200 years ago (people were teachers, back then), decide what goes into the books. State education departments are and have always been staffed by educators and political hacks.
But the second thing is perhaps more shocking: it doesn’t matter what goes into the textbooks, so long as it fails to teach! It is not like there are not many people out there who can teach algebra, say, and who could write a good, usable textbook on the subject. I ran across one such book many years ago, and it was night and day. After taking the usual high school algebra courses, I could sort of do the math, but my understanding was limited. Then, in my mid-20s, I stumbled across a book in a library written by a guy who understood and loved his subject, and within a few pages I had a mini Eureka! moment. Algebra wasn’t a series of tricks and rules, but rather a complete logical system. The fragments made no sense; the whole was beautiful.
So, I know they’re out there. Textbooks written by people who understand and love their subjects are never used in the public schools, at least K-12. This is no accident. It’s not just the experts whose opinions are ignored. What parents might wish were in them is worse than irrelevant – it is to be actively shunned.
The compulsory, graded public school system was never envisioned as a means to educate. Fichte, Mann and their spawn hardly cared if the students learned traditional subjects. The system they dreamed up, realized and imposed with the police power of the state was intended from the beginning to form an obedient and docile population. Textbooks that teach real knowledge do not help toward this end, and might hinder it. Better to not run that risk.
Just some quick links. Amusing stuff gleaned from Twitter, where Raw Data gets some comeuppance:
The above seems to be in response to this. Tiny changes either way in tiny states; large changes either way in big states. Per capita numbers might be more interesting, maybe not. Raw data is just that – raw.
Raw data also tells you that Chinese American and Japanese American women, in general, make more money than white men, in general. Adding geography – Are Asian women more likely to live in urban centers? Or do many live among the white males of Appalachia? – or education level – Do Asian women in general get more education than white men in general? – might recalibrate the numbers. Would be interesting. Inquiring minds would want to know.
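The worry about confounders here is the classic Simpson’s paradox setup, and a toy calculation – every number below is invented purely for illustration, not taken from any real data – shows how a raw comparison can flip once you stratify by something like geography:

```python
# Toy illustration (all figures invented) of how a raw comparison can
# reverse once you stratify by a confounder such as urban vs. rural.
# Suppose urban jobs pay more, and Group A is concentrated in cities.

def overall_avg(urban_n, urban_pay, rural_n, rural_pay):
    """Headcount-weighted average pay across the two strata."""
    return (urban_n * urban_pay + rural_n * rural_pay) / (urban_n + rural_n)

group_a = overall_avg(90, 80_000, 10, 50_000)  # mostly urban
group_b = overall_avg(30, 85_000, 70, 55_000)  # mostly rural

print(group_a, group_b)  # 77000.0 64000.0
# Group A wins the raw comparison, even though Group B earns more
# within BOTH strata (85k vs 80k urban, 55k vs 50k rural).
```

The raw numbers say Group A out-earns Group B; the stratified numbers say the opposite. Which story is true depends entirely on whether geography belongs in the comparison – which is exactly the question raw data can’t answer.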
Wow. If you squint a little, this looks pretty damaging to the argument that innate sexual differences are a factor in the career choices men and women make:
What this purports to show: in the 1960s, a small percentage of women pursued majors in Medical School, Law School, Physical Sciences and Computer Science. Starting around 1970, the percentage of women majoring in these four fields started to increase and, with the exception of computer science, leveled off around 2000. The percentage of women majoring in computer science peaked around 1985 and began to fall, then leveled out around 2007.
Proposed conclusion: differences in career choices can’t be based on sex, because the sexes were as different in 1970 as they remain in 2000, yet the percentage of college women studying computer science fell even as the percentages in the other listed majors continued to increase or at least held steady. Or, as a Twitterer put it:
Here’s a problem for those who say “biological differences” or “innate interest” explain why women hold fewer coding jobs than men.
Well, that’s a big ‘maybe’. What’s wrong with this picture? Let me count the ways:
First, why those four fields and no others? As shown here, it’s not too hard to cherry-pick examples to show whatever you want to show. Also, aren’t we talking about very broad fields? Nobody majors in ‘Physical Science’ – they major in geology or chemistry or physics or some such. A better, if harder to read, chart or charts would show the male/female differences across many majors at a level of granularity that means something. Does it make a difference to lump them all together? We don’t know, but it is something we would want to know.
Next, and this is the kind of reality check anybody paying attention needs to do: What kind of source data have we here? What do we mean by college students and majors? Med schools and Law schools are dedicated graduate schools attended by students who have presumably prepared and competed to get into them in their undergrad years, while the physical sciences and computer science are studied both in undergrad and grad environments.
Are we talking about majors these students were awarded their degrees in? Or just the ones they declared as pimply 18-year-olds? Or something else? STEM fields, for example, have infamously high dropout rates: those 18-year-olds, who have been assured for the previous decade and a half that they are the best-educated people ever and often have all those Advanced Placement credits from high school to prove it, discover to their chagrin (with possible collateral damage to their self-esteem!) that they cannot in fact hack the math an electrical engineering degree requires – and that it’s a lot of work to catch up to the level of the typical college-bound high school graduate from 75 years ago. It seems the professors in these fields didn’t get the memo:
“A substantial grading differential exists between science and nonscience courses,” said presenter Ben Ost, a third-year Cornell economics Ph.D. student. “Even students who eventually become science majors receive much higher grades in their nonscience courses than their major field courses. This gap in grading standards discourages students from pursuing and completing a science degree.”
(The linked article also mentions that white males stick to it a lot better than women and ‘people of color’ – except for those pesky Asian people of color, both male and female, who do even better than white men but are not mentioned because mumble mumble…)
What difference does this make? As is so often the case, the correct answer is: we don’t know. Such information would be required before we could make much of anything out of this graph; it’s also possible that, given the required information, the point that the graph was concocted to make might become obscured or vanish entirely. Again, we don’t know.
What we do know is that plenty of students do not take ‘pre-law’ (if such a major even exists – didn’t in my day) as undergrads except informally – people with undergrad degrees in English, history, philosophy and maybe even computer science get into law schools all the time. Med schools tend to be more demanding, expecting a solid undergrad degree in some related major such as biology, with chemistry, biochemistry and the like, but still – there’s Group A – graduate students with some undefined undergraduate degrees, and Group B – undergrads and maybe grads in computer sciences and the physical sciences.
Double counting? We don’t know! Seems inevitable, since there’s no way of knowing what the statistics do with a Women Studies/Lawyer or Physics/Doctor or any of the other possible undergrad/grad combos. This would be good information to know.
It’s possible it makes no difference – the graph’s author could have developed a sophisticated and well-defined method that cleared this up and showed the graph to be a perfectly reasonable piece of information. But they didn’t show that information – or, at best, they did and the people pushing this on the web decided to leave it off.
What this does say: the presenter of the graph is much more interested in the particular message he wants to convey than in clear, well understood information.
Next, did anything else interesting happen to the college student population around the 1980s? Why yes, yes it did:
First: more and more people went to college. Second, a disproportionate number of those people were women (or are projected to be women – graph commits the sin of not distinguishing projections from statistics – the far right goes to 2023, which hasn’t happened yet.)
So, at least, we’ve spotted one big issue: the graph at the top shows percentages; those percentages are of ever-increasing numbers of women. Thus, for example, the raw number of women in these fields could very well be increasing, just not as fast as the total number of women college students – looking at total numbers instead of percentages might flatten out the apocalyptic-looking post-1985 drop-off.
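The percentage-vs-raw-number point can be made concrete with a toy calculation – the enrollment figures and shares below are invented for illustration only, not drawn from the graph:

```python
# Invented figures, purely to illustrate the arithmetic: the SHARE of
# women majoring in CS falls even while the raw COUNT rises, because
# the base (total women enrolled) is growing faster.
women_1985, cs_share_1985 = 500_000, 0.037   # 3.7% of a smaller base
women_2005, cs_share_2005 = 900_000, 0.022   # 2.2% of a much larger base

cs_women_1985 = women_1985 * cs_share_1985   # ~18,500
cs_women_2005 = women_2005 * cs_share_2005   # ~19,800

# The share dropped by over a third, yet the raw count of women in CS rose.
print(cs_women_1985, cs_women_2005)
```

A chart of the shares alone would show a steep decline; a chart of the counts would show modest growth. Same data, opposite impressions.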
Another thing that happened in the 1980s, or at least fully blossomed: the idea of college as a Holy Grail/meal ticket for everyone, especially for women. Instead of a college education being something people with certain specialized career ambitions would pursue, or even – *gasp* – something one would do to prepare one’s self for the duty of understanding and protecting one’s culture and Western Civilization in general (colleges being a Western Civ thing, after all), college became more and more exclusively a stepping stone to financial success. A generation of women came of college age who had heard from every direction that to be financially dependent on a man – you know, by marrying him – was demeaning and made a woman less than fully human. Therefore, if a woman was foolish enough to marry, she at least should get a career going first, so that she could walk out on him and maybe the kids if for any reason that whole ‘marriage’ arrangement proved unsatisfactory.
I’m just old enough to remember jokes about women college students – co-eds, they used to call them – attending school to get their Mrs. degrees. That is not a joke one could safely make on campuses today, nor any joke using the ‘ball and chain’ analogy for wives – because in all popular discussions of the evils of marriage, all husbands are assumed to be evil and all wives innocent; at least, no husband is acknowledged to have taken on the burden of responsibility, and no wife is acknowledged to receive any financial benefits from marriage. Because mumble mumble.
But I digress.
If you think you need to have a career to take care of yourself, might you not pick a field that is not too hard and yet promises good job prospects? In my experience, people who study computer science are, well, geeks and nerds. It’s a bit of an obsession, not something someone indifferent to the actual work would choose just because the job prospects are good. Why hang out with obsessives with whom you do not share the interest? Why compete with people who are passionate about the work if you are not?
Next, look at this from the college’s point of view: the number of young people who want a STEM-like career and are willing to pay the academic price – actually studying hard, skipping a few parties, preparing themselves in high school – is never going to be too large. So, if all you offer are hard classes – see the quotation above – then you’re going to lose all those students who can’t or don’t want to hack it UNLESS you have easier classes they can take instead. If they leave school, you lose the money those kids bring in, while if they transfer to an Applied Marxism(1) major – a studies major of some sort – or just to any other easier major, you keep them and the money they bring in.
Yes, yes, I know that the motives of college administrators are pure and high, and that it only appears that they are money-grubbing vermin indistinguishable in action from the snake oil salesmen they generally assume all business people to be. (Except you, donor with a building named after you! You are not like other men!) Whatever the motives, the effect can be observed: many majors have been dumbed down and a number of easy new majors have appeared over the last few decades – and women dominate those majors.
And why not? I myself got a graduate degree in business because A) I had a growing family to support and B) business is REALLY REALLY EASY, at least compared to ‘real’ majors. Once you decide you’re going to college to further your career, why not do it in as pain-free a manner as possible, and leave the specialties to the specialists? I also once signed up for some programming classes at UC Irvine (elite tech school I just happened to be living near) in the mid-80s – and promptly dropped out. I was merely curious – the younger whippersnappers were playing for keeps. No way was I keeping up, given the amount of time I would have had to invest in it. These dudes (almost all dudes) lived in the computer labs. I imagine my experience isn’t all that unusual.
Potential career paths: management, sales, consulting, finance
Here’s another. Wow, a lucrative growing field dominated by women!
No. 2: Health Professions and Related Clinical Sciences
Degrees awarded to women in 2008: 94,192
Women in the major: 85.4% of total
Men in the major: 14.6% of total
Potential career paths: nursing, physical therapy
Here’s another, a really hard one for brainiacs:
No. 8: Biological and Biomedical Sciences
Degrees awarded to women in 2008: 46,217
Women in the major: 59.4% of total
Men in the major: 40.6% of total
Potential career paths: research, teaching, medical technology
One might imagine a sane person choosing a major based on both personal interest and career prospects; one also can imagine an 18-year-old choosing a major because it’s cool (raises hand. Great Books, baby!). Would one have any reason to expect any other behavior from the women who have come to dominate the college student population? Apart from the religious dogmas of Critical Theory, that is?
Conclusion: If there are nefarious forces keeping women out of Computer Science majors, this chart isn’t showing it. Cherry-picked majors, poor or no definitions of key terms, murky data. In fact, it’s misleading to the point of propaganda.
Bonus: here is a fun chart from NPR, which is not an official tool of oppression yet as far as I know, that gives both percentages and raw numbers (if you hover correctly) and the relationships between majors. It could have been improved if the graph were of raw numbers so as to reveal the upward slope of total enrollment, and the percentages showed when you hover. Big caveat: as is almost always the case in these things, the chart assumes the set of categories that existed in 1970 maps meaningfully onto the categories in use in 2011. That’s plausible enough for math or chemistry – but is what they meant by sociology or psychology or cultural studies (?) in 1970 the same as what they meant by it in 2011? Bears thinking about.
Nothing is easier than Applied Marxism Studies, known as Critical Theory: pick somebody unhappy; ‘discover’ who is oppressing them, because all unhappiness results from oppression; let your imagination run wild as to how awful those mean oppressors are, and how we need to exclude them from even opening their mouths and probably need to kill them. Doesn’t have to make sense or even be internally consistent. Easy-peasy A+