Service announcement
A few weeks ago I told you I was changing the day this newsletter comes out to Sunday. Now I’m changing the policy again. If I have more than enough to send out by Friday night, I’ll send it out scheduled for Saturday at 10.15 am, and if not it’ll go out on Saturday night, scheduled for 10.15 Sunday. This kind of unpredictability — will he or won’t he? — may remind you of a supervillain. Then again, Adolf Hitler used to deliberately start his speeches late. I would never do that.
Blessed are the morally flexible, for they shall inherit leadership
These are my principles. If you don’t like them, I’ve got others.
Groucho Marx
Alasdair MacIntyre argues that our civilisation has lost its ethical bearings, which he attributes to (at least) two features of modernity.
First, modernity saw a unitary (Christian) moral code, or family of codes, displaced by other (secular) families of codes — utilitarianism and Kantianism. This created lots of work for serious thinkers. But in the wider world, it mostly multiplied the options available to rationalise any position — and indeed to get on one’s high horse about it. As MacIntyre put it, “the characteristic skills of those who are socially and politically successful are exhibited … in negotiating their way” through the possibilities that present themselves. Blessed are those who can clothe expediency in the garb of principle. For they shall deliver lectures on leadership.
Second, there’s the way we’ve built modern life on relentless competition. If one person just won’t do something, someone else will. This is true not just in the market, but also within private and government bureaucracies. There, what matters is getting things done — all while one stays out of trouble. And for MacIntyre that tells us what modern dystopia looks like.
In any society which recognized only [effectiveness — as opposed to intrinsic excellence], competitiveness would be the dominant and even exclusive feature … We should therefore expect that, if in a particular society the pursuit of [such] goods were to become dominant, the concept of the virtues might suffer first attrition and then perhaps something near total effacement, although simulacra might abound.
I started writing this piece reflecting on the Quillette piece about the moral mayhem unleashed by trans maximalism that I extract below. But as I was writing it I became aware of the Australian Public Service Commissioner’s statement on Robodebt. It makes little distinction between Kathryn Campbell and Renée Leon, both of whom were found to have committed numerous breaches of the code of conduct. From the little personal contact I’d had with each, I was not surprised at Kathryn’s role as general bad guy for the duration of the inquiry. But Renée seemed like a competent and decent person to me — though I had far too little contact to be sure. Still, the inquiry was exhaustive, and so I assume she was guilty of the breaches she’s accused of.
But here’s the thing. In practical life one is endlessly trading off means and ends. And in a situation as toxic as Robodebt, that’s all anyone’s doing — working out when, if ever, to pull the plug and say “thus far, no further.” So it’s important to understand these things in context. And this is the context as Renée Leon sees it.
“I am disappointed with the way the Australian Public Service Commission has come to its decision, and I stand by the actions I took to get definitive legal advice and bring the robodebt program to an end,” she said.
“Robodebt had already been in operation for two years when I became secretary of human services. When legal doubts were raised, I sought definitive advice from the solicitor-general. I acted as expeditiously as possible to convince a government that was wedded to the robodebt program that it had to be ceased.
“When ministers delayed, I directed it be stopped. Two weeks later, my role as secretary was terminated by a government that did not welcome frank and fearless advice.”
There aren’t many public servants who got sacked for doing the right thing — Paul Grimes and Paul Barrett, for instance. But I’ve always thought we should honour them. I can’t see anything about this in the APSC statement.
None of the other people were named, no matter how dishonest or disastrous their conduct was for individuals caught up in the nightmare. As the Commissioner assures us:
Four factors are relevant in the decision not to name current or former public servants.
Anonymity helps ensure proportionality in the application of sanctions. …
[I]ndividuals [should be able] to restore themselves and have some closure …
[D]eterring … improper conduct and … public confidence can be achieved without disclosure of personal details. …
Naming of individuals in the Robodebt case may create expectations that public servants in other or future Code of Conduct inquiries will also be named, and this could undermine the effective operation of these (and other) investigations. There are several hundred investigations across the Australian Public Service each year, and fear of being named may undermine engagement, diminish the opportunity for restoration and increase litigiousness.
So it’s all good. I’d have felt a little better if the Commissioner had chosen to reveal the identities of those doing the most egregious things. There were some shocking things done — some victims killed themselves — but we shouldn’t lose our sense of proportion, should we?
And why were the departed secretaries named? The Commissioner says a bunch of things — their seniority, high expectations yada yada — but also adds that it was impracticable to report on the investigation without people figuring out who they were.
Postscript
Renée Leon has posted a statement on LinkedIn, and some people who did know her have taken her side — people like Martin Parkinson, and people who were rooting for her when she started trying to clean up the Department of Human Services.
Blessed are the comedians, for they illuminate ambiguity
Nicola Sturgeon on Tony Blair on leadership
A good piece from Nicola Sturgeon, who feels a new lightness in her life having been relieved of her role as a political leader and Chief Moral Ambiguity Officer, a point we’ll get back to below.
Ironically, however, I suspect it will be those of us with actual leadership experience who find this book the least satisfying. It is not that his advice is wrong; on the contrary. But most leaders know that it is important to have a plan and stick to it, to manage time effectively, to prioritise, to understand the difference between tactics and strategy, to favour policy over politics, to be prepared to take unpopular decisions, and to follow through to delivery.
For a politician who claims to have been a radical leader, he comes across as being very much in thrall to vested interests
The problem is that much of this falls into the category of “easier said than done”. And while he pays lip service to this, Blair fails to address the myriad factors that, on a daily basis, conspire to throw a leader off course, or to offer any practical advice on how to overcome those challenges.
On Leadership would have been enriched immeasurably had he included a couple of case studies from his time in office, occasions when he struggled to follow his own advice, with some insight into how he managed – or failed – to get back on track. Indeed, one of the curiosities of this book is that there is no real reflection on his own strengths and weaknesses.
Getting tribal about trans
Nothing better illuminates Alasdair MacIntyre’s comments about modernity’s moral maze than the trans issue. You’d think that, ahead of everything else, these issues would be best handled with some sensitivity and preparedness to accept ambiguity. After all, they are kind of about ambiguity. But no. We arrange ourselves into ignorant armies clashing by night. So the trans-activists have a slogan — trans women are women.
And the anti-trans-activists yell back that men can’t become women. If you’re a left politician (or a young star who owes your fame to J.K. Rowling) your base is sympathetic to the former position. If you’re a right politician the base goes the other way. (And if you’re J.K. Rowling you’re on that side.) But these issues can mostly be dealt with as well as they’re ever going to be dealt with in context. In the vast bulk of situations if someone born male wants to be thought of as a woman we can all cooperate. But if their birth sex gives them an advantage in sport, then lines have to be drawn and some people will feel unfairly disadvantaged. Draw the line as best you can — not with slogans. Ditto if you’re sentencing someone to jail for violence, or a trans-woman wants to participate in a rape crisis centre — much less be CEO. But no. And as we know, Nicola Sturgeon got sucked into this vortex. From Quillette.
Few countries have surrendered so completely to the maximalist demands of transgender activists as Scotland, where two former First Ministers, Nicola Sturgeon and Humza Yousaf, both recently had their careers cut short after they supported unpopular legislation permitting men to self-identify as women. And yet, even in Scotland, the latest example of men taking over women’s spaces has left jaws hanging open.
You might think that one of the United Kingdom’s most venerable rape-crisis centres, founded almost half a century ago, would be safe from the incursions of trans-identified men. But the social-service sector in Scotland has been captured by gender activists just as thoroughly as the country’s political class—as illustrated by a bizarre sequence of events that ended in court last month, when someone who accessed support services at the Edinburgh Rape Crisis Centre (ERCC) was exposed as a sexual predator.
Naturally, both principal characters here are biological men. One is Cameron Downing, a 24-year-old “non-binary” former drama student and onetime darling of the ruling Scottish National Party (SNP), who was able to attend the ERCC for several months even as he was abusing half a dozen men and women.
The other is Mridul Wadhwa, a trans-identified man who was hired as the centre’s CEO in 2021, and who then went on a campaign to punish a female counsellor on his staff named Roz Adams, who’d suggested that perhaps rape victims visiting the centre were entitled to know the biological sex of the staff members they were talking to. In May, an employment tribunal ruled in Adams’ favour, denouncing Wadhwa and the rest of the ERCC’s management for conducting a “heresy hunt” aimed at anyone who questioned trans-activist shibboleths. This would be a scandal even if the ERCC didn’t receive generous funding from a long list of public entities.
My favourite dog-eating meme — yours welcome in comments
Until I saw this one
Sent by a subscriber
Good, unusual politician delivers good, unusual speech
Onlookers remain sceptical
David French from the NYT
On July 15, 1979, President Jimmy Carter emerged from days of isolation to deliver the most important and memorable address of his life. Carter had canceled vacation plans and spent more than a week cloistered at Camp David, where he met with a “steady stream of visitors” who shared their hopes and fears about a nation in distress, most immediately thanks to another in a series of energy crises.
Carter, however, discerned a deeper problem. America had a wounded heart. The president believed it suffered from a “crisis of the spirit.” The speech was among the most unusual in presidential history. The word that has clung to it, “malaise,” was a word that didn’t even appear in the text. It was offered by his critics and has since become something close to official history. Everyone above a certain age knows immediately and precisely the meaning of the phrase “the malaise speech.”
I believe, by contrast, the best word to describe the speech would have been “pastoral.” A faithful Christian president applied the lessons he’d so plainly learned from years of Bible study and countless hours in church. Don’t look at the surface of a problem. Don’t be afraid to tell hard truths. Be humble, but also call the people to a higher purpose.
The resulting address was heartfelt. It was eloquent. Yet it helped sink his presidency. …
When he addressed the nation, Carter took a step back. … He’d taken the time to listen to others, he shared what he heard, and then he spoke words that resonate today. “The symptoms of this crisis of the American spirit are all around us,” he said, and he described symptoms that mirror our current reality.
“For the first time in the history of our country a majority of our people believe that the next five years will be worse than the past five years,” Carter said. (Meanwhile, last year a record 58 percent of Americans told NBC News pollsters that our nation’s best years are behind it.)
There was more. “As you know,” he told viewers, “there is a growing disrespect for government and for churches and for schools, the news media, and other institutions.” He was right, but compared to now, Americans were far more respectful of virtually every major institution, from the government, to the news media, to the private sector. Only the military fares better now in the eyes of the public.
Then there was this gut-punch paragraph:
We were sure that ours was a nation of the ballot, not the bullet, until the murders of John Kennedy and Robert Kennedy and Martin Luther King Jr. We were taught that our armies were always invincible and our causes were always just, only to suffer the agony of Vietnam. We respected the presidency as a place of honor until the shock of Watergate. …
We’re familiar with political speeches that recite the litany of American challenges, but we’re not familiar with speeches that ask the American people to reflect on their own role in a national crisis. Carter called for his audience to look in the mirror:
In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God, too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. …
For all the scorn heaped on Carter later, the speech was successful, at first. His approval rating shot up a remarkable 11 points. … Then the world erupted. In November, Iranian militants stormed the U.S. Embassy and took dozens of Americans hostage. In December, the Soviet Union invaded Afghanistan and at least appeared to secure the country quickly and easily. Contrary to popular remembrance, Carter did not respond with weakness. The defense buildup for which Ronald Reagan is remembered actually began under Carter. And in April 1980, he greenlit a daring attempt to fly into the heart of Iran and rescue American hostages by force.
It was not to be. Mechanical problems scrubbed the mission far from Tehran, and in the confusion of the withdrawal, two aircraft collided, and eight American service members died. American gloom deepened. The nation seemed to be moving from defeat to defeat.
The failed rescue was a hinge moment in history. … But just as presidents own military victories, they also own defeats. Carter’s fate was sealed. Reagan carried 44 states, and on Inauguration Day — in a final insult by Tehran — the hostages came home.
The story of the next 10 years, moreover, cast Carter’s address in a different light. The nation went from defeat to victory: Inflation broke, the economy roared, and in 1991 the same military that was humiliated in the sands of Iran triumphed, with assistance from its allies, over an immense Iraqi Army in a 100-hour land war that astonished the world.
The history was written. Carter was wrong. There wasn’t a crisis of confidence. There was no malaise. There was instead a failure of leadership. Better, or at least luckier, leaders revived a broken nation.
Yet with every passing year, the deeper truths of Carter’s speech become more apparent. His insights become more salient. … The trends he saw emerging two generations ago now bear their poisonous fruit in our body politic.
Carter’s central insight was that even if the country’s political branches could deliver peace and prosperity, they could not deliver community and belonging. Our nation depends on pre-political commitments to each other, and in the absence of those pre-political commitments, the American experiment is ultimately in jeopardy. … As Jimmy Carter spends his last days on this earth, we should remember his call for community, and thank a very good man for living his values, serving his neighbors, and reminding us of the true source of strength for the nation he loved.
Seizing the future: being yourself
Always good to find a new voice saying something compelling. It’s in the modern idiom and titled ‘being agentic’, which isn’t quite my cup of tea, but then, pretty obviously, she really has been seriously maximising her agency. For me agency is downstream of being yourself. But it’s only downstream conceptually. Maximising your agency is probably a better practical way of building a life around being yourself than most ways.
Anyway, the way she describes working on “your edge” is what I’d say about whatever achievements I’ve managed in the area of thinking about stuff. It astonishes me how much modern culture has people clinging to consensus rather than trying to really grapple with the challenge of pondering something difficult until they think they have a powerful and robust way in. For the record, here’s a couple of examples where I think I’ve done this, and another where I critique the approach of following the herd — where one ends up saying lots of stuff (and getting one’s work published) while just spinning one’s wheels in the quicksand of the slipshod frameworks everyone else is applying. Anyway, here’s Cate being ‘agentic’.
In my way of thinking, radical agency is about finding real edges: things you are willing to do that others aren’t, often because they’re annoying or unpleasant. These don’t always surface in awareness to the point one is actually choosing -- often they live in a cloud of aversion that strategically obscures the tradeoff.
The idea of finding real edges, as contrasted with “eking out wins by grinding harder than everyone,” first clicked for me when I started playing poker. Poker in the modern era is an extraordinarily competitive game, and even 8 years ago pros were spending nearly as much time studying as they were playing, using solver models to seek out tiny mathematical advantages. At the same time, a massive edge was available in the form of physical reads, but almost entirely ignored. (I know an example would make this more compelling, but I’m sorry, it’s like explaining a magic trick.)
Two friends and I maniacally studied reads together, and we all had out-of-distribution results. But when we’d tell other pros what we were doing, the response from most was “nuh-uh, that’s not a thing.” They weren’t willing to consider the possibility that reads were valuable, maybe because they didn’t want to feel obligated to study them.
All of my agency hacks are kind of like this, in my opinion -- big, glaring edges that people might rather ignore.
Court rejection
Ask for things. Ask for things that feel unreasonable, to make sure your intuitions about what’s reasonable are accurate (of course, try not to be a jerk in the process). If you’re only asking for things you get, you’re not aiming high enough. …
Seek real feedback
In many contexts, the way to get good feedback is to give people a way to provide it anonymously. Anything else creates friction by layering on social dynamics. … You also want to make it easy to find -- I have a link to my feedback form in my Twitter bio, and get a few comments a week through it.
I imagine resistance from some people on the grounds that anonymity frees people to be assholes, but in my experience they rarely are. 90% of what I get in my inbox is either nonsense or nice -- I get lots of “keep up the good work!” type messages. …
Increase your surface area for luck
The last couple times I was looking for a project, I made a point of meeting as many people doing related work as I could, even if there was no obvious benefit to doing so. …
What I discovered by casting a wide net was that I have very little ability to predict how useful a call will be in advance. Relevance is easier to predict, but it’s not a very good proxy for usefulness …. To some extent, the more confident I am that a conversation is relevant, the less likely I am to discover something exciting during it. …
Assume everything is learnable
Most subject matter is learnable, even stuff that seems really hard. But beyond that, many (most?) traits that people treat as fixed are actually quite malleable if you (1) believe they are and (2) put the same kind of work into learning them as you would anything else.
As you might gather, I think agency itself is a good example. I learned agency late. In my teens and 20s, I occasionally made agent-y moves (like taking a job in a new city to be with someone I hadn’t spoken to yet; we married a few months later). But I still managed to pick a career I really disliked, for no reason other than its obviousness, and only after a decade stopped to ask what I was hoping to accomplish. …
Learn to love the moat of low status
The moat of low status is one of my favorite concepts, courtesy of my husband Sasha. The idea is that making changes in your life, especially when learning new skill sets, requires you to cross a moat of low status, a period of time where you are actually bad at the thing or fail to know things that are obvious to other people.
It’s called a moat both because you can’t just leap to the other side and because it gives anyone who can cross it a real advantage. It’s possible to cross the moat quietly, by not asking questions and not collaborating, but those tradeoffs really nerf learning. “Learn by doing” is standard advice, but you can’t do that unless you splash around in the moat for a bit.
If you can learn to thrive in the moat, it’s incredibly liberating. I once played a hand in a big poker tournament so badly there were news stories about it. I’ll never entirely get over my embarrassment about the hand, but I still look back on it with great fondness, because it’s when I realized I’d crossed some threshold of unflappability. With cameras and reporters crowding around, I could have safely folded and no one would have paid attention; I chose to call knowing it would mean certain ridicule even if I won. The call was in fact quite bad, but I made it for the right wrong reason.
Don’t work too hard
This might be the most important item on the list. It took me almost 40 years to learn it, because my instinct is to think more hours mean more productivity as long as you’re really trying to be productive -- that’s just multiplication, right? No. The reality is that grinding, even if it temporarily increases output, kills creativity and big picture thinking.
Burnout is the ultimate agency-killer. … These days I set boundaries that would have made me ashamed at earlier points in my life: I’m offline at 6 p.m. almost every night, and rigorously observe a Sunday Sabbath where nothing with the flavor of effort is tolerated. These will seem like small things to some people, but like a mortal sin to others in the communities I run in.…
Agency is the skill that built the world around you, an all-purpose life intensifier that lets you make your corner of it more like what you want it to be, whether that’s professional, relational, aesthetic, whatever. Build a better mousetrap. Have an enviable marriage. Start a country. No one is born with it, everyone can learn it, and it’s never too late.
Living in modern Britain
The latest triumph of Australian bipartisanship
Further evidence that our democracy is functioning — at least better than some. For which we can be thankful, again, to Peter Dutton. It’s usually a mistake to attribute too much altruism to a politician, but, if that’s not what it is, I for one am grateful whenever it looks like an Australian politician has looked to their own political interests with a horizon that stretches beyond the next election. Bipartisanship has now made substantial improvements to a number of major long-term fiscal liabilities — most notably, before this, the NDIS. (It’s also given us AUKUS, and which red-blooded Australian wouldn’t want to be first out of the gate for the next world war — to give us a perfect record in world wars over more than a century?)
Bernard Keane takes up the story:
Yesterday’s big announcement of bipartisan agreement on aged care funding is, in a way, nothing new. Twelve years ago, Julia Gillard and Mark Butler announced a major aged care reform package aimed at helping Australian seniors remain at home longer and ramping up the user-pays element of the overall system for retirees with greater financial means.
There was no overt bipartisanship in 2012 like there was yesterday, with Aged Care Minister Anika Wells actually thanking her Coalition counterpart Anne Ruston. But in 2012 Tony Abbott — AKA Dr No — declined to exploit the issue. He could have whipped up an almighty scare campaign about user-pays in aged care. Back then — like today — self-funded retirees (a natural Liberal constituency) were the main targets of the increases in charges. Abbott would have known that a scare campaign probably wouldn’t have attracted too many additional votes from Labor and cruelled the chances of reform in aged care that had to be delivered no matter who was in government.
Instead, the Coalition in government quietly continued Labor’s focus on improving the aged care workforce (let them import workers instead, seemed to be the Coalition’s approach) and focused on growing home care. The numbers tell the story. Gillard and Butler boasted about expanding home care from around 60,000 places to 100,000 by 2015, and 140,000 prospective places by 2022. By 2020, the Morrison government was proudly claiming it had added another 50,000 home care packages on its watch to bring the number of places to 196,000. …
It’s possible — perhaps likely — that future governments of both stripes will try to push user-pays further for self-funded retirees and part-pensioners. And the issue of quality of care will be in the lap of bodies like the Aged Care Quality and Safety Commission. But Labor has fundamentally re-engineered aged care, with help from the other side. Unusually, there’s credit to be had all round for good policy.
Audrey Hepburn — Anne Frank: Soul Sisters
Audrey Hepburn made fewer than 20 films during her legendary career, but they were so beloved — Roman Holiday, Breakfast at Tiffany’s, Sabrina, to name a few — that she became one of Hollywood’s most enduring stars.
But there was one role she was never able to play: that of Anne Frank.
While Hepburn never met Frank, they lived parallel lives. They were the same age, lived just 60 miles apart, and suffered the horror of the German occupation of Holland, notes Robert Matzen in his new book Dutch Girl: Audrey Hepburn and World War II, excerpted exclusively in this week’s PEOPLE. But with one life-and-death difference: Anne was Jewish.
The actress grew up in Holland during Germany’s five-year occupation of the country. She rarely spoke about the darkness of those years, when she was forced to live in a cellar due to the bombing, nearly starved to death due to food shortages, and lost her beloved uncle, Otto van Limburg Stirum. A magistrate who did not support the Nazi regime, he was executed on August 15, 1942.
According to Matzen, when the star read Frank’s The Diary of a Young Girl, Hepburn was devastated. “I’ve marked where she said ‘Five hostages shot today,’” said Hepburn years later. “That was the day my uncle was shot. And in this child’s words, I was reading what was inside me and still there. This child who was locked up . . . had written a full report of everything I’d experienced and felt.”
When producer and director George Stevens turned Frank’s diary into a movie in 1959, Anne’s father Otto Frank — the family’s sole survivor — asked Hepburn to play his late daughter, who died of typhus fever at Bergen-Belsen concentration camp in 1945.
But Hepburn was so traumatized that she was unable to. “I was so destroyed by it again, that I said I couldn’t deal with it,” Hepburn later said. “It’s a little bit as if this had happened to my sister . . . in a way she was my soul sister.”
Good LNL on the Dreyfus Affair
Guy Rundle on the Democrats
From Crikey
The republic was founded on the electoral college system to give small and distant states equal sway. Instead, it has, in the presidential race, de facto disenfranchised tens of millions of voters. Who cares if you’re a California Republican or an Oklahoma Democrat?
This new division is based on what is the rock-solid division in Western societies now, between the college-educated — and those who live in cities dominated by them, and their economic production — and those outside of it. In interests, ideologies and attitudes, it now supersedes old industrial class division and struggle and is a new form of class struggle.
Were class, in the old sense, to be the dominant factor, Harris would be killing it, cruising to 400 electoral college votes. But the white and Black working- and middle-class want to stop the flood of immigration, and the Democrats do not. These voters want a trade war with China and a revival of a national economic plan. They are not particularly concerned if that involves lower business taxes, smaller government — which barely serves them in any case — and privatisation boondoggles.
Should Kamala Harris lose in November, with an overall majority but an electoral college loss, maybe, maybe, finally, progressives, the political class, the media class, the whatever, will get it through their thick skulls that the old progressive-working/middle-class alliance is dead, gone, over. These classes have to be listened to, a conscious alliance has to be made, progressive interests have to be ceded to the groups that still have the numbers.
But it will be an expensive lesson to learn, for America and the world.
The middle income trap
Good column from Martin Wolf.
“Middle-income countries are home to three out of every four people — and nearly two-thirds of those who struggle in extreme poverty. They are responsible for 40 per cent of the world’s total economic output — and nearly two-thirds of global carbon emissions. In short, the global effort to end extreme poverty and spread prosperity and livability will largely be won or lost in these countries.” These words by Indermit Gill, the World Bank’s chief economist, appear in the World Development Report 2024, entitled “The Middle-Income Trap”. The trap is the idea that economies tend to get stuck on the road to the high incomes of the US, Canada, Europe, Japan, South Korea, Australia and quite a few others.
Is there really such a trap? A … 2021 paper … “The New Era of Unconditional Convergence” concluded … that “debates about a ‘middle-income trap’ . . . appear anachronistic: middle-income countries have exhibited higher growth rates than all others since the mid-1980s”.
Nonetheless, closing gaps in average prosperity between rich and poorer countries is painfully slow and hard. The likely persistence of these gaps matters for human welfare, political stability and our ability to tackle global challenges, notably climate change. Not least, they make the idea that the latter will be managed by “degrowth” absurd. Which of these middle-income countries will accept such stagnation? Will India?
As the WDR stresses, the “ambition of the 108 middle-income countries with incomes per capita of between US$1,136 and US$13,845 is to reach high-income status within the next two or three decades. When assessed against this goal, the record is dismal: the total population of the 34 middle-income economies that transitioned to high-income status since 1990 is less than 250 million, the population of Pakistan.”
The most populous country to have become a high-income country since 1990 is South Korea. Meanwhile, important countries have failed to converge. Brazil is an example. Once successful, Chile has also stumbled. Above all, average incomes per head of middle-income countries have stayed below 10 per cent of US levels since 1970. …
The WDR argues that countries need to internalise Joseph Schumpeter’s celebrated concept of “creative destruction”, as updated by the work of Philippe Aghion and Peter Howitt. The essential step is to force incumbents to compete, encourage entrants and open the economy to those who were historically outsiders. … Social mobility is about 40 per cent lower in middle-income countries than in high-income ones. That must change.
Creative destruction is also necessary if the energy transition is to accelerate. Middle-income countries tend to waste energy and have shifted too slowly towards renewables, even though many have exceptional potential. Part of the problem is the high cost of capital, itself the result of high levels of uncertainty. Improvements in institutions, with the aim of increasing predictability and security, will help. Above all, societies and economies need to become more open and meritocratic.
None of this is easy anywhere, not least in developing countries. Alas, the rise of protectionism and consequent fragmentation of the world economy are likely to make their prospects worse. Yes, there will be opportunities, too, as some importers shift from their present reliance on China. But integration has unquestionably been a dominant force behind the development successes of the recent past. …
Growth prospects are worsening. Hopes for a better world fade with them.
For those suffering from a lack of stupid things to do
But want the stupid things they do do to be cool — extremely cool.
Is the idea of ‘mind viruses’ a mind virus?
Dan Williams goes to great lengths here, which I didn’t need to read because (to paraphrase comedian Stewart Lee) I agreed the fuck out of what he was saying from the get-go. The notion of ideas you don’t like being ‘mind viruses’ is complacently self-serving, and profoundly ignorant of what ideas are. It’s out and proud intellectual philistinism.
In any event, this short extract ably summarises his three unarguable points. If you don’t find them unarguable, read the whole thing.
Ideas, including bad ones, are not infectious mind viruses. This metaphor rests on an inaccurate picture of human psychology and social behaviour that functions to demonise, not understand. Because of this, it poisons public debate, increases polarisation, and hinders our collective capacities to understand the world and each other.
I will make three general points:
The "mind virus" metaphor assumes the truth is self-evident, so false beliefs must stem from irrationality. This neglects how people form beliefs based on different information, trusted sources, and interpretive frameworks, which means rational individuals can easily develop radically divergent worldviews.
People often embrace and spread ideas because they serve practical goals beyond truth-seeking. For example, religious, ideological, and conspiratorial narratives often serve propagandistic functions or promote people’s social interests. Such motivated reasoning looks nothing like the passive infection by “mind viruses”.
Belief systems do not spread via simple contagion. They are maintained through complex social dynamics and incentives in which members of belief-based tribes win status by enforcing, rationalising, and spreading bespoke realities.
If you think that, having come across them, I’m not going to show you these pics, you’re very much mistaken
Quite apart from the pun for a heading, I like Eyke!
Reminds me of a Leunig cartoon
How Do Holistic Wrap-Around Anti-Poverty Programs Affect Employment and Individualized Outcomes?
Javier Espinosa, William N. Evans, David C. Phillips, and Tim Spilde, NBER Working Paper #32911

A new wave of social service programs aims to build a pathway out of poverty by helping clients define their own goals and then supporting them flexibly and intensively over multiple years to meet those goals. We conduct a randomized controlled trial of one such program. Participants randomly assigned to intensive, holistic, wrap-around services have 10 percentage points higher employment rates after one year compared with a control group offered only help with an immediate need. Most of this effect appears to persist after programming ends. However, we find limited evidence that intensive, holistic services affect areas beyond employment, even when other areas of life are participants’ primary goals. We find some evidence that the program works by increasing hopefulness and agency among participants, which may be more useful in supporting labor force participation than in meeting other goals.
Jake Buckley’s instant, life-saving care reflex
An astounding AFL mark. But watch closely as Jake Buckley saves his opponent Isaac Heeney from possible quadriplegia. I am deeply moved. When will you be called upon? Isaac’s anxious mum wants to say ‘thank you’.
Van Gogh — animated!
Potsdam in 1930
If you’re on a good thing, why spoil it?
The academy rolls on
I’m not up on the literature, but my sympathies are with Cameron Murray as he gets his academic paper rejected, not on the grounds that it’s wrong, but on the grounds that he needs to dress rather more like a sound chap. There’s something valuable in the advice — it’s always good to remember you’re writing for an audience. But the endless, fully upholstered ways in which the academy perpetuates the sound chaps’ work long after it’s been largely debunked make for a deeply depressing experience — one from which I’ve largely insulated myself. By staying away.
Academia is often a social status game more than a knowledge production game.
Everybody knows this. But few say it.
Everybody knows that there are clueless experts. But few say it.
So today, I’m going to say it.
If what Harvard Professor Ed Glaeser writes about housing markets reflects his understanding of them, he is a clueless expert.
This is a guy with hundreds of academic articles on property markets. His body is a machine that converts money into academic papers. But what about knowledge?
I want to change the social status game and be the first mover to call out this poor economics. I hope others will feel more comfortable doing the same.
Today I’m going to explain the ridiculous problems with the economic analysis in his 2018 paper in the Journal of Economic Perspectives with Joseph Gyourko, entitled The Economic Implications of Housing Supply.
It’s not the first time I’ve called out his dodgy analysis.
I spent many years trying to get people to see the problems with one of Glaeser’s other famous but useless methods for analysing property.
But social status games dominate. This was made clear by the comments I received from an anonymous reviewer about my paper critiquing Glaeser’s analysis.
The main point of this paper is both correct and important: the popular hedonic price method of calculating a "regulatory tax" initiated by Glaeser and Gyourko (2003) (henceforth G&G) has little or no scientific merit, and should not be used.
...So the G&G method is ripe for criticism. The early critiques by Somerville (2005) and O'Flaherty (2003) were massively ignored, and their authors probably didn't notice because they thought that the G&G method was too ditzy to go anywhere. But as Murray points out, the method has become popular and its results have become influential. So Murray's critique is timely.
But it has to be rewritten. G&G have both done a lot of good work; this strand of literature is really an aberration. So the profession's prior is that G&G are right and Murray is not. To move the prior, Murray has to be both succinct and serious: succinct because nobody is going to start reading a long paper by someone they think is a kook, and serious because Murray must prove himself part of the brotherhood, not an outsider or rabble-rouser.
If ignoring status games because you want to understand the world and gain knowledge is rabble-rousing, then so be it.
And I strongly disagree with the reviewer that Glaeser’s previous dodgy analysis was an “aberration”.
The unlikely miracle of Venice
Even its name is beautiful. So too, in its own way, was its constitution. But you’ll hear more from me on that in the future. In the meantime, watch the video.
Review of Gideon Haigh’s memoir of his brother
On the publisher’s website this achingly honest book is tagged under “Memoirs,” a genre its author despises. Gideon Haigh, one of our most versatile and prolific writers, sees memoir as a form of attention seeking, a spotlight for “humblebraggarts.” His new book is deliberately “pruned of memoir’s usual self-pardonings and self-protections.”
My Brother Jaz examines the death of Haigh’s younger brother Jasper in a car crash and its long aftermath. Jasper had a difficult early adolescence: he acted up at school, was often depressed, abused drugs and alcohol, and had a troubled relationship with his divorced parents. Then, one night in 1987, he steals his mother’s Telstar and his life comes to a violent end. How much his death was reckless accident or intentional self-destruction remains ambiguous.
Haigh presents his immediate responses in a cold, almost dissociated light: receiving the early morning phone calls, walking to the train station, viewing the body. He understands that a deep rupture has emerged between past and future and that he must remember the traumatic present vividly. But memory proves difficult. He gives a eulogy that he later almost completely forgets as an internal process of shutting down gathers force: “I have begun closing those windows into my soul that events had thrown open.”
The complexities of memory are beautifully explored in this book. Haigh shows how it can be erased by strong emotion but also how it persists despite his best efforts at suppression, as when he tries to confuse himself about the death date. “The sense of Jasper is always there, out of sight, but bulking darkly like a submerged continent.” He tries to hold this sense of loss at bay for more than three decades before feeling compelled to avoid it no longer. Requesting and reading the coronial inquest triggers his own memories and makes further evasion unbearable.
Most of all, Haigh channels his loss into asceticism and relentless work. He gives up meat, alcohol and sex, becomes rail-thin, wears drab clothes and writes prodigiously, all renunciation and no joy. Prozac eventually helps but depression fails to capture the entirety of this state. Although Haigh writes that his “chief coping mechanism was giving up,” he pushes on much more than he stops trying. Mourning also seems an inadequate word: the period of self-denying bleakness only starts to abate after five years and is thick with self-loathing. In mourning, Freud wrote, “it is the world which has become poor and empty,” whereas “in melancholia it is the ego itself.” Haigh’s ego is wretched: “Go on, read me: it’s all I have to offer. The rest you wouldn’t like.”
My Brother Jaz offers an unvarnished portrait of Haigh’s mid-life personality. He admits to being prickly and a holder of grudges, hard to love and autonomous to a fault. A promising midlife relationship fails when a partner runs out of patience. Haigh embodies a strange combination of feeling unique and insisting that he is not special, a kind of egalitarian narcissism. He is annoyed when his reactions to loss are the same as other people’s but also resents those who elevate themselves above others in search of sympathy or praise.
How much these tendencies have been shaped by loss and how much by background and temperament is impossible to know. Haigh, wary of psychologising and therapy’s “mimes of candour,” rarely hazards a guess. Background and temperament clearly play a part, though. His prodigious work habits were evident at an early age, his first book published when he was twenty-one in the week Jaz died. He also details a tough-minded family culture of pushing through hardship and looking down on the “poor-bloody-mes” who indulge in self-pity. Haigh’s adult drivenness and lack of kindness to self can’t be attributed to loss alone.
Mostly written in a frenetic three-day burst, My Brother Jaz has an immediacy that makes it hard to put down even as the hurt is palpable and unsparing. Haigh is not convinced that writing the book has done him much good — not writing also not having been helpful — just that he eventually had to “discharge” it. For his readers, though, he has done something very good indeed. This is a wrenching work of self-revelation and a powerful tribute to someone who died much too young.
Daniel J Mahoney on Girard
Though he’s way more conservative than me, Daniel J. Mahoney is someone I always find well informed and insightful, so he gets a big tick when I agree with him. Which I do about the partiality of Girard as a thinker about humanity.
Not that I know much about Girard other than his basic ideas, but he seems like a nut to me. Mimetic desire is undoubtedly a big deal in human affairs, and it was his idée fixe. And in our culture, a good way to fame is to obsess about a single thing that is of genuine importance and to ignore everything else.
He attached the theory to the idea of the scapegoat, which is all well and good, but it gets more specific and more tenuous as an all-purpose guide to human affairs and human history. And Girard thought this was as all-purpose a guide to human affairs as natural selection was for biology. Well no, René — just walk away!
[As Girard wrote]:
The question about our world is not really why so much violence, but why so little? Why are we not always at each other’s throats?
But he did not adequately investigate it, because he insisted on making mimetic desire the alpha and omega of social order and the human world. He forgot the other letters in the alphabet that might have allowed him to integrate his undeniable insights about mimetic desire into a balanced account of human nature, human motives, and the wellsprings of decent social and civic order. This representative image of Isaiah Berlin’s famous hedgehog forgot the big picture, or rather addressed it narrowly through the lenses of a profound and partial insight. More than one critic has argued that Girard was the hedgehog par excellence.
A brilliantly flawed 1990 essay on “Collective Violence and Sacrifice in Shakespeare’s Julius Caesar” reveals Girard’s tremendous exegetical gifts, this time applied to the reading of Shakespeare, as well as his inability to offer any theoretical place for human nature and politics in his science of mimetic or imitative desire. In one of Shakespeare’s most political plays, he finds nothing noble in Cassius’s or Brutus’s hatred of Caesar’s emerging despotism. He sees in the major characters such as Brutus and Mark Antony “mimetic doubles” who are morally and functionally indistinguishable. Girard is undoubtedly right that Caesar’s murder became “the foundational murder,” “the foundational violence of the Roman empire.” But he goes much too far in rejecting “all political interpretations of Julius Caesar” (my emphasis), which he says are
all of the same differential type: which party does Shakespeare favor in the civil war, the Republicans or the monarchists? Which leader does he like best, Caesar or Brutus? Which social class does he esteem, which does he despise, the aristocrats or the commoners?
In historical events as dramatized by Shakespeare, however, Girard can see only “conflictual undifferentiation.” It is true that Shakespeare discerns laudable human qualities on both sides of the Roman political divide, thus displaying an impressive impartiality. But Girard denies that this impartiality has anything to do with high-minded detachment and everything to do with fidelity to the core insights of mimetic reflection. By resisting tyranny, one becomes no different from the tyrant. The chief characters are thus morally, mimetically, metaphysically the same.
As Pierre Manent has argued, Girard’s refusal to take political philosophy seriously … inevitably has him making everyone the same. …
Girard is at his strongest when he takes aim at victimology and wokeness rooted, as it so often is, in envy and a “demonic” spirit of rivalry with God himself. But even some of Girard’s disciples, Left-Girardians so to speak, have been slow to follow Girard in his unequivocal rejection of the ideological appropriation of the Christian concern for victims. They are less sensitive to the reality of original sin, of human imperfection built into the nature of things.
The new ideological binary, innocent victim versus rapacious oppressor, forgets the insight so powerfully articulated by Solzhenitsyn in the opening volume of The Gulag Archipelago:
If only it were all so simple! If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?
Heaviosity Half-Hour
Hannah Arendt: The human condition
From The Human Condition
21: Instrumentality and Homo Faber
The implements and tools of homo faber, from which the most fundamental experience of instrumentality arises, determine all work and fabrication. Here it is indeed true that the end justifies the means; it does more, it produces and organizes them. The end justifies the violence done to nature to win the material, as the wood justifies killing the tree and the table justifies destroying the wood. Because of the end product, tools are designed and implements invented, and the same end product organizes the work process itself, decides about the needed specialists, the measure of co-operation, the number of assistants, etc. During the work process, everything is judged in terms of suitability and usefulness for the desired end, and for nothing else.
The same standards of means and end apply to the product itself. Though it is an end with respect to the means by which it was produced and is the end of the fabrication process, it never becomes, so to speak, an end in itself, at least not as long as it remains an object for use. The chair which is the end of carpentering can show its usefulness only by again becoming a means, either as a thing whose durability permits its use as a means for comfortable living or as a means of exchange. The trouble with the utility standard inherent in the very activity of fabrication is that the relationship between means and end on which it relies is very much like a chain whose every end can serve again as a means in some other context. In other words, in a strictly utilitarian world, all ends are bound to be of short duration and to be transformed into means for some further ends.19
This perplexity, inherent in all consistent utilitarianism, the philosophy of homo faber par excellence, can be diagnosed theoretically as an innate incapacity to understand the distinction between utility and meaningfulness, which we express linguistically by distinguishing between “in order to” and “for the sake of.” Thus the ideal of usefulness permeating a society of craftsmen—like the ideal of comfort in a society of laborers or the ideal of acquisition ruling commercial societies—is actually no longer a matter of utility but of meaning. It is “for the sake of” usefulness in general that homo faber judges and does everything in terms of “in order to.” The ideal of usefulness itself, like the ideals of other societies, can no longer be conceived as something needed in order to have something else; it simply defies questioning about its own use. Obviously there is no answer to the question which Lessing once put to the utilitarian philosophers of his time: “And what is the use of use?” The perplexity of utilitarianism is that it gets caught in the unending chain of means and ends without ever arriving at some principle which could justify the category of means and end, that is, of utility itself. The “in order to” has become the content of the “for the sake of”; in other words, utility established as meaning generates meaninglessness.
Within the category of means and end, and among the experiences of instrumentality which rules over the whole world of use objects and utility, there is no way to end the chain of means and ends and prevent all ends from eventually being used again as means, except to declare that one thing or another is “an end in itself.” In the world of homo faber, where everything must be of some use, that is, must lend itself as an instrument to achieve something else, meaning itself can appear only as an end, as an “end in itself” which actually is either a tautology applying to all ends or a contradiction in terms. For an end, once it is attained, ceases to be an end and loses its capacity to guide and justify the choice of means, to organize and produce them. It has now become an object among objects, that is, it has been added to the huge arsenal of the given from which homo faber selects freely his means to pursue his ends. Meaning, on the contrary, must be permanent and lose nothing of its character, whether it is achieved or, rather, found by man or fails man and is missed by him. Homo faber, in so far as he is nothing but a fabricator and thinks in no terms but those of means and ends which arise directly out of his work activity, is just as incapable of understanding meaning as the animal laborans is incapable of understanding instrumentality. And just as the implements and tools homo faber uses to erect the world become for the animal laborans the world itself, thus the meaningfulness of this world, which actually is beyond the reach of homo faber, becomes for him the paradoxical “end in itself.”
The only way out of the dilemma of meaninglessness in all strictly utilitarian philosophy is to turn away from the objective world of use things and fall back upon the subjectivity of use itself. Only in a strictly anthropocentric world, where the user, that is, man himself, becomes the ultimate end which puts a stop to the unending chain of ends and means, can utility as such acquire the dignity of meaningfulness. Yet the tragedy is that in the moment homo faber seems to have found fulfilment in terms of his own activity, he begins to degrade the world of things, the end and end product of his own mind and hands; if man the user is the highest end, “the measure of all things,” then not only nature, treated by homo faber as the almost “worthless material” upon which to work, but the “valuable” things themselves have become mere means, losing thereby their own intrinsic “value.”
The anthropocentric utilitarianism of homo faber has found its greatest expression in the Kantian formula that no man must ever become a means to an end, that every human being is an end in himself. Although we find earlier (for instance, in Locke’s insistence that no man can be permitted to possess another man’s body or use his bodily strength) an awareness of the fateful consequences which an unhampered and unguided thinking in terms of means and ends must invariably entail in the political realm, it is only in Kant that the philosophy of the earlier stages of the modern age frees itself entirely of the common sense platitudes which we always find where homo faber rules the standards of society. The reason is, of course, that Kant did not mean to formulate or conceptualize the tenets of the utilitarianism of his time, but on the contrary wanted first of all to relegate the means-end category to its proper place and prevent its use in the field of political action. His formula, however, can no more deny its origin in utilitarian thinking than his other famous and also inherently paradoxical interpretation of man’s attitude toward the only objects that are not “for use,” namely works of art, in which he said we take “pleasure without any interest.”20 For the same operation which establishes man as the “supreme end” permits him “if he can [to] subject the whole of nature to it,”21 that is, to degrade nature and the world into mere means, robbing both of their independent dignity. Not even Kant could solve the perplexity or enlighten the blindness of homo faber with respect to the problem of meaning without turning to the paradoxical “end in itself,” and this perplexity lies in the fact that while only fabrication with its instrumentality is capable of building a world, this same world becomes as worthless as the employed material, a mere means for further ends, if the standards which governed its coming into being are permitted to rule it after its establishment.
Man, in so far as he is homo faber, instrumentalizes, and his instrumentalization implies a degradation of all things into means, their loss of intrinsic and independent value, so that eventually not only the objects of fabrication but also “the earth in general and all forces of nature,” which clearly came into being without the help of man and have an existence independent of the human world, lose their “value because [they] do not present the reification which comes from work.”22 It was for no other reason than this attitude of homo faber to the world that the Greeks in their classical period declared the whole field of the arts and crafts, where men work with instruments and do something not for its own sake but in order to produce something else, to be banausic, a term perhaps best translated by “philistine,” implying vulgarity of thinking and acting in terms of expediency. The vehemence of this contempt will never cease to startle us if we realize that the great masters of Greek sculpture and architecture were by no means excepted from the verdict.
The issue at stake is, of course, not instrumentality, the use of means to achieve an end, as such, but rather the generalization of the fabrication experience in which usefulness and utility are established as the ultimate standards for life and the world of men. This generalization is inherent in the activity of homo faber because the experience of means and end, as it is present in fabrication, does not disappear with the finished product but is extended to its ultimate destination, which is to serve as a use object. The instrumentalization of the whole world and the earth, this limitless devaluation of everything given, this process of growing meaninglessness where every end is transformed into a means and which can be stopped only by making man himself the lord and master of all things, does not directly arise out of the fabrication process; for from the viewpoint of fabrication the finished product is as much an end in itself, an independent durable entity with an existence of its own, as man is an end in himself in Kant’s political philosophy. Only in so far as fabrication chiefly fabricates use objects does the finished product again become a means, and only in so far as the life process takes hold of things and uses them for its purposes does the productive and limited instrumentality of fabrication change into the limitless instrumentalization of everything that exists.
It is quite obvious that the Greeks dreaded this devaluation of world and nature with its inherent anthropocentrism—the “absurd” opinion that man is the highest being and that everything else is subject to the exigencies of human life (Aristotle)—no less than they despised the sheer vulgarity of all consistent utilitarianism. To what extent they were aware of the consequences of seeing in homo faber the highest human possibility is perhaps best illustrated by Plato’s famous argument against Protagoras and his apparently self-evident statement that “man is the measure of all use things (chrēmata), of the existence of those that are, and of the non-existence of those that are not.”23 (Protagoras evidently did not say: “Man is the measure of all things,” as tradition and the standard translations have made him say.) The point of the matter is that Plato saw immediately that if one makes man the measure of all things for use, it is man the user and instrumentalizer, and not man the speaker and doer or man the thinker, to whom the world is being related. And since it is in the nature of man the user and instrumentalizer to look upon everything as means to an end—upon every tree as potential wood—this must eventually mean that man becomes the measure not only of things whose existence depends upon him but of literally everything there is.
In this Platonic interpretation, Protagoras in fact sounds like the earliest forerunner of Kant, for if man is the measure of all things, then man is the only thing outside the means-end relationship, the only end in himself who can use everything else as a means. Plato knew quite well that the possibilities of producing use objects and of treating all things of nature as potential use objects are as limitless as the wants and talents of human beings. If one permits the standards of homo faber to rule the finished world as they must necessarily rule the coming into being of this world, then homo faber will eventually help himself to everything and consider everything that is as a mere means for himself. He will judge every thing as though it belonged to the class of chrēmata, of use objects, so that, to follow Plato’s own example, the wind will no longer be understood in its own right as a natural force but will be considered exclusively in accordance with human needs for warmth or refreshment—which, of course, means that the wind as something objectively given has been eliminated from human experience. It is because of these consequences that Plato, who at the end of his life recalls once more in the Laws the saying of Protagoras, replies with an almost paradoxical formula: not man—who because of his wants and talents wishes to use everything and therefore ends by depriving all things of their intrinsic worth—but “the god is the measure [even] of mere use objects.”