The Succession post: on Murdoch, Cavell, and The Avoidance of Love

 

Murdoch

It’s probably worth remembering that Rupert Murdoch nearly went down a very different path. He didn’t start out carrying water for the hard right. He descended towards it in a spiral of cynicism and self-interest. In 1972 Australia was coming out of two decades of conservative rule and Murdoch was beginning to amass his media empire. It’s hard to overstate how stale, how stultified, Australia’s political life had become. A political class of conservative mediocrities worked hard to ensure nothing ever happened, and a split in the Labor Party between the communists and the social conservatives more or less ensured it. Like most ambitious Australians his age, Rupert Murdoch wanted change.

So he threw his national newspaper, The Australian, behind Labor leader Gough Whitlam, who went on to transform the country in his first year and become Australia’s most left wing Prime Minister. It was Murdoch’s first taste of playing kingmaker, and the king he made made university free, established universal healthcare, and pulled Australia out of Vietnam.

Jenny Hocking, Whitlam’s biographer, puts it like so:

Murdoch’s thrill was in being close to political power, a proximity he was yet to establish in America and denied him in England – an outsider, a colonial with tabloid tastes – and he was hooked. ‘I got emotionally involved. I allowed, with my eyes wide open, some of the journalists to go beyond being sort of partisans into almost being principals. They became foot soldiers in Whitlam’s campaign to some extent.’

These days there’s nothing shocking about the idea that Murdoch uses his journalists, such as they are, as foot soldiers in political campaigns, even if this particular admission is especially explicit. What’s surprising is that he was once willing to do it for the left. We’re so used to the end-of-life Rupert Murdoch (that is, à la Logan Roy) that we assume he was always like he is now.

Well, the story goes on. Despite all Murdoch had done for him, Whitlam had no desire to become indebted to the young media magnate. In opposition, Whitlam had upbraided other politicians for their closeness to the media. And to make matters worse, Whitlam just didn’t like him.

It wasn’t for want of trying. Advisors had urged Whitlam to be friendly to Murdoch, but Whitlam was too stubborn. Murdoch craved Whitlam’s approval: Whitlam thought Murdoch a mediocrity. It started with a dinner in 1971 that Whitlam called ‘one of the most excruciatingly boring nights of my life.’ A year later, once Whitlam came to power, Murdoch was meeting with Whitlam’s men ‘to make plans for the future.’ Murdoch asked that Whitlam make him Australia’s high commissioner to the UK. Whitlam’s answer: ‘No way.’

Two years later, Whitlam found himself in New York, where Murdoch was growing his empire, and finally agreed to meet with him. A year earlier, on a visit to London, Whitlam had peremptorily cancelled several meetings with the man. In Hocking’s words, ‘Whitlam had found the clumsy ingratiations of Rupert Murdoch irksome… It was not just that Whitlam found Murdoch’s expectations distasteful, he found his company dull and made no attempt to hide his displeasure if made to endure it – he was too busy to spend precious time with him.’

But that September 1974 trip did not go as Murdoch had hoped. The pair had plans for dinner, but as Whitlam was walking through New York’s Plaza Hotel, he happened to cross paths with the British television interviewer David Frost (of Frost/Nixon fame). The two got on well – Frost had first interviewed Whitlam in 1972 – and Frost immediately asked what Whitlam was doing that night. Whitlam, fully aware he had plans with Rupert Murdoch, responded ‘Why, nothing at all. Let’s have dinner.’

Murdoch was furious. He had been snubbed. (To make matters worse, David Frost had torn him apart in an interview a few years before.) When Whitlam’s press secretary tried to patch things up with a hastily arranged breakfast a few days after, Whitlam, according to Hocking, ‘maintained his posture of disdain.’ He simply couldn’t bring himself to suck up to the guy. The relationship was destroyed, and in the years that followed Murdoch went from backing Australia’s most left wing Prime Minister to turning The Australian into the flagship of the conservative right. In the US, Fox News was founded and soon made into a propaganda outlet for modern fascism. Had Whitlam not bumped into David Frost that day in September 1974, I don’t think it’s too much of a stretch to say, the world might look very different.

Characters as people

Logan Roy may be inspired by Rupert Murdoch but he is not Rupert Murdoch. He’s a different person. It might sound weird that I’m calling them both people, but I’m going to insist on it. I’m working really hard to avoid calling one of them a “character” and the other a “real person.” And the reason is that, for our purposes as consumers of narrative art, Logan Roy is, or perhaps should be, as much a person as Rupert Murdoch.

I’d even hazard that for most of us, Logan Roy is a more real person. Have you ever seen Rupert Murdoch in real life? Just how much footage of Rupert Murdoch have you sat through? How familiar are you with his anxieties and neuroses? His tics and habits? The way his smile pulls across his face when he laughs? I can tell you how Brian Cox’s Logan Roy raises his eyebrows high on his face when trying to communicate subtext to his subordinates. I can speak at length about his theory of mind. I don’t know anything close to that about Rupert Murdoch. He simply is not particularly real to us, except as a lightly-sketched character – as a person depicted by other people in the real world.

The point of flattening out the distinction between characters and “real people” is to try to take seriously what narrative art is and hopes to do. In short, it succeeds when it creates characters who in fact are, in important senses, taken to be real by the reader. And the best, most instructive reading of literature happens, I think, when the reader does the author the credit of acting as if her characters are real.

This approach to criticism comes to me via Stanley Cavell, who I’m always appropriating for this blog, and whose essay The Avoidance of Love: A Reading of King Lear, is one of my favourite pieces of writing. Characters, Cavell insists, are people, or at least best understood and best read as people. He writes:

I think that one reason a critic may shun direct contact with characters is that he has been made to believe or assume, by some philosophy or other, that characters are not people, that what can be known about people cannot be known about characters, and in particular that psychology is either not appropriate to the study of these fictional beings or that psychology is the province of psychologists and not to be ventured from the armchairs of literary studies. But is any of this more than the merest assumption; unexamined principles which are part of current academic fashion?

Another reason we became reluctant to treat characters as people: for us to do so, the text has to be good. Succession is such a text.

Reading Succession

When I was at uni there were a lot of finance-broey types around who were really into The Wolf of Wall Street. To this day it remains unclear to me on what level they were engaging with the text. Did they have enough self-awareness to read it as a critique of mindless accumulation and the fraudulence encouraged by capitalism? It never seemed like they did. Everything about them indicated that they took the film to be a celebration of Jordan Belfort and his life. And that reading didn’t seem to trouble them at all.

I bring this up in the context of a bête noire of mine, a probably-obnoxious claim I’ve been making lately around here that many people don’t know how to read. Watching Succession’s penultimate episode last week, an extraordinary dramatic turn which featured something like 20 captivating minutes of back to back funeral eulogies, I was reminded of the tension at the heart of the show, and at the heart of what has been called prestige TV – i.e. the feelings of conflict produced in the viewer, the profound and productive sense of moral complexity, the fact that the show forces us to at once sympathise with and detest despicable people.


The instructive moment is Kendall’s funeral speech for his father. Logan’s brother, Ewan, the moral voice of the show, has forced his way on stage despite the protests of the Roy children. He gives what is by any standard an extraordinary speech. It is unflinching and true. It manages to include both the bare fact that Ewan loved his brother and the fact that his brother ‘wrought the most terrible things...’

‘…He was a man who has here and there drawn in the edges of the world. Now and then darkened the skies a little. Closed men's hearts. Fed that dark flame in men, the hard, mean, hard-relenting flame that keeps their hearths warm, while another grows cold, their grain stashed, while another goes hungry. And even has the temerity to tell that hard, funny, yes, funny, but hard, joke about the man in the cold. You can get a little high, a little mighty, when you’re warm.’

Kendall is forced to follow this act. Roman fails at the pulpit, consumed by grief, and it becomes Kendall’s mess to pick up. This is a classic trope in narrative fiction – the hero is thrown, unprepared, into the ring, and using his wit and ingenuity he saves the day. So when Kendall finds his voice and gives a rousing defence of his father, the viewer is invited by convention, by the narrative form itself, to side with him against his uncle. Convention dictates that this is a moment of triumph.

Yet to anyone who actually listened to the words of the speech, Kendall has no answer to Ewan. His voice is raised and he speaks with momentum, but the content itself is captured within the first few sentences and amounts to little: ‘what my uncle said is true. My father was a brute. He was. He was tough. But also – he built. And he acted.’ This self-serving moral logic, that action and vigour and vitality are ends in themselves that supersede all other values, has a long history. In the 20th century it was a theme closely associated with fascism. It is a cynical, nihilist, animalian logic that returns human conceptions of value to a bully ethos, a sort of Darwinian, masculinist mentality that places physicality over thought, action over principle.

So while the speech performs the form of victory, it provides the content of failure. Still, in the audience at the funeral, we see heads nodding along. They need a justificatory story, no matter how flimsy, to keep the gravy flowing. But in their head-nodding we see, too, just how alluring these might-is-right stories can be. Deep down in each of us is a part that loves the bully. Fascism is the logic that rewards and grows that piece of the soul.

Which is why the show is able to make you, the viewer, complicit, too. You hardly needed to be an illiterate finance bro of the type I caricatured earlier to find yourself nodding along. Despite ourselves, despite the show presenting and validating Ewan as its moral centre, some part of us is compelled by the masculinist strength of Kendall’s rebuttal.

And what about Kendall himself? It’s no accident that he reaches for fascistic tropes in his moment of desperation. Ultimately, fascism is cynicism’s justificatory logic, and there are simply no other grounds on which to defend his father. Most of all, Kendall is obliged to defend his father. He is obliged by self-interest, yes. But he is also obliged by love.

Succession is a show about love. If that claim is surprising it is only because we have been taught ‘by some philosophy or other’, as Cavell says, that characters are not people. But as soon as we treat them as such, and I think the show more or less demands we do, we see that every one of its primary characters is motivated, or at least conflicted, by love’s demands. Love is each character’s final bulwark against cynicism, and also the cause of his or her madness.

On this reading Kendall becomes a two-bit Hamlet, a fumbling failson driven by his craving for love and his inability to give it, a man always in his own way, a man blind to himself, a man willing to allow his remaining principles to slip away in the defence of that love. He loves his father and cannot bear it, wants to kill him. He loves his children and cannot bear it, hides from them.

My sense is that this moral complexity can be foreign to some readers – especially those raised on a diet of Marvel films. They have been trained to expect narrative art to resemble a morality play – not an arena for human persons. The more complex the moral lives of the characters, the more complex the moral experience of the audience, the more human the characters on screen become. Which is what the show, ultimately, pulls off. Succession is good enough to ask us to see its characters as people. One feels that Kendall is a real person, and by virtue of his being a person one feels sympathy, even love, for him.


The Avoidance of Love

Here I am yet again using Cavell. The Avoidance of Love is a reading of King Lear that uses Wittgenstein and psychoanalysis to explain the play in wonderful new ways. I love it so much. In the play, Lear asks his three daughters to declare their love for him. Whoever loves him most, he says, will be given his kingdom. The first two daughters are effusive and glowing. The third, his youngest, Cordelia, refuses to say anything. ‘What shall Cordelia speak?’ she says. ‘Love, and be silent.’ Cordelia loves her father, that much is clear, but she refuses to give him the speech he craves. She speaks her love by being silent.

So the tragedy is set in motion. Lear, made so upset by this response, goes insane. There’s a storm and a fool, etc. etc. The point here is that Cavell reads the play as depicting real, normal people doing what real, normal people do, by which I mean this: they love each other and they hide from each other. ‘There are no lengths to which we may not go’, writes Cavell, ‘in order to avoid being revealed, even to those we love and are loved by. Or rather, especially to those we love and are loved by: to other people it is easy not to be known.’ Why does Lear even bring out his three daughters and demand they perform their love for him? ‘My hypothesis will be that Lear's behavior in this scene is explained by….the same motivation which manipulates the tragedy throughout its course….by the attempt to avoid recognition, the shame of exposure, the threat of self-revelation.’

That is – Shakespeare’s King Lear is just another guy. He wants love and he fears being seen. He is filled with the urge we are all filled by, to hide, deflect, defend, and dodge. Cordelia refuses to play his game, gives him the love he is owed, pure and honest, and this act of authenticity fills Lear with such shame – shame for asking for false love – that he hides in insanity. The Avoidance of Love is an extraordinary piece of criticism – the type that changes entirely how you see a piece of art, and art generally. To write it, Cavell first had to see the characters of King Lear as people.

As you know by now, Succession is a sort of modern Lear. Logan asks his three children to perform their love for him in exchange for his kingdom. But there is no Cordelia – no pure soul who refuses to play the game – which means Succession cannot be about the same things, have the same structure or goals or direction, as Lear. In Succession they all play Logan’s game, all go to war, all go mad.

Succession is a story about three siblings, crushed by a father whose love they crave, who find their own moral sensibilities denatured by their proximity to power. Cynicism, they each realise in time, is the only means of achieving and holding on to that power. Commitment to any moral code beyond brutish self interest is a commitment to weakness, weakness that can be exploited, and one achievement of the show is its slow insistence that capitalism beat almost all remaining principles out of them. That is the poison pill of power – and the dominant modality of power is capital.

That is why, in this season’s election episode, we watch Shiv watch Kendall in horror as he slowly succumbs to the pressures of cynicism. Roman is pushing hard for calling the election result before the votes are counted; Shiv is holding the line for democracy, but has no power in the situation; Kendall, who has already slipped far from the principles of season one, falters. The narrative wants to say: capitalism creates its evil overlords. They start out people and are made monsters. Murdoch once supported Whitlam.

But Succession would be a much less interesting show if the dominant message were merely capitalism denatures the human soul. The show does say that, very clearly. But it also says: in an unprincipled capitalistic society human people often are in fact restrained from cynicism, from fascism, and that restraint comes from human bonds – from love. In the election episode, the only remaining restraint on Kendall’s action is love for his daughter.

That long-standing internal logic is what makes the show’s finale fully plausible. The dismount – a collapse of the Roy children coalition that facilitates the loss of their company and the rise of Shiv’s husband Tom to CEO – is only sensible to the viewer in light of the countervailing forces of love. Shiv betrays her brother in spite of her love for him and because of her love for Tom.

I see that I am 3000 words in, so let me quickly bring this back to Cavell. People ‘do not just naturally not love’, he writes. ‘They learn not to.’ They learn not to because love presents to each of us as a demand – to love, and be loved, is to be obliged. It is to be burdened. And it is no accident that the antithesis of love I have presented here, cynicism, is exactly a flight from obligation, from burden. People avoid love because they cannot bear, are afraid of, what love asks of them.

‘And our lives begin’, writes Cavell:

by having to accept under the name of love whatever closeness is offered, and by then having to forgo its object. And the avoidance of a particular love, or the acceptance of it, will spread to every other; every love, in acceptance or rejection, is mirrored in every other. It is part of the miracle of the vision in King Lear to bring this before us… We wonder whether we may always go mad between the equal efforts and terrors at once of rejecting and of accepting love.

‘We wonder whether we may always go mad between the equal efforts and terrors at once of rejecting and of accepting love.’ Indeed.

That is how I read Succession, a show depicting people.

 

Against tradition

 

There’s a popular TikTok account that occasionally makes its way to me. An American woman torments her Italian boyfriend by breaking the cardinal rules of Italian cuisine: snapping pasta before boiling it; drinking cappuccinos in the afternoon. It’s a variation on a long running gag. The uncultured anglophone offends the continental traditionalist, who is positioned as the defender of the good and right. A canonical text is this often-viral video, from 2010. British TV presenter Holly Willoughby suggests that adding ham to mac-and-cheese might make it a “British carbonara”; the Italian chef Gino D’Acampo nearly falls over: “if my grandmother had wheels”, he exclaims, “she would have been a bike.”

D’Acampo, like many Italians, believes that carbonara consists of exactly these ingredients: guanciale, Roman pecorino cheese, eggs, and pepper. Swap any ingredient out and you no longer have carbonara, you have something else, and defending the purity of the recipe is supposed to be a matter of national significance. At stake is Italian identity, heritage, tradition.

I’ll give away the ending: D’Acampo is defending a history that never existed. Carbonara is about as old as your parents, and the version we know and love – the one with guanciale – only became standard in the 1990s. The dish, like most of Italian cooking, basically didn’t exist before the Second World War. Like people everywhere, Italians before the war were less interested in recipe purity than they were in, well, not starving to death. That’s why, as the food historian Luca Cesari puts it, carbonara first emerged in 1944, when an Italian chef made it for US Army officers stationed in Riccione. They had better ingredients.

Pizza, too, at least as you know it, is a 20th century invention. Round discs of dough have long been baked throughout the Mediterranean (ever wonder why pizza and pita sound the same?), but the dish we are familiar with – with the tomato base and so forth – was unfamiliar to most Italians until after the war. That could be because pizza as we know it is more American than Italian. When American soldiers got to Italy in the early 40s, they wrote letters back to their families confused by the absence of pizzerias. The academic Alberto Grandi, profiled in this terrific FT piece, places the first-ever pizzeria not in Italy at all, but in New York, founded in 1911.

While I’m at it, the cappuccino isn’t too much older, either. The Italians started using the word at the end of the 19th century – they’d appropriated an innovation coming out of Vienna. The Austrians had adopted coffee from the Ottomans to their south and added the new element of dairy – mostly in the form of cream. The new concoction, which added fat and sweetness to the bitter drink, also changed the colour. Coffee went from dark brown to light brown – light enough to resemble the brown robes of the Capuchin monks. The Kapuziner was born.

Still, that only meant a black coffee with a dollop of cream – not an espresso mixed with foamed milk. The cappuccino as we know it didn’t arrive until the 50s. Bialetti had invented his Moka Pot in 1933, but the espresso machine, and steamed milk with it, didn’t become standard until after the war. (The Moka Pot was named for the Yemeni city of Mocha – the port through which coffee was transported from Ethiopia. Bialetti used the Italian Futurist love for aluminium to popularise his product. As it happens, Fascist Italy invaded Ethiopia two years later). That makes the modern cappuccino younger than your grandparents. In fact, this chronology is why Americans to this day prefer drip coffee to espresso. Italian-American immigration mostly occurred prior to the invention of the modern espresso machine (Italian-Australian immigration, on the other hand, occurred after its popularisation, which is how you get Australian cafe culture).

Of course, it wasn’t just Italian food that had to be invented, but Italy itself. The Kingdom of Italy finally came into being in 1861 – though “Italian” states continued to join up until the First World War. Along the way a process of Italianization was required – a process, alternately implicit and explicit, that pressed regional states into assimilation with an Italian whole. Similar nation-creating processes occurred elsewhere throughout the 19th and 20th centuries. The unification of Italy was no different, for our purposes, from the unification of Germany, or France, or China, or Indonesia, to take famous examples. In each of these cases hundreds of local cultures and dialects were forced to assimilate with a centralised bureaucracy and language, and all of a sudden something novel emerged – nascent political communities that needed common stories to hold themselves together.

So if I’m picking on Italy it’s only because it makes for a great case study. It’s not special. In the modern era none of us are free: the traditions we depend on are almost invariably invented, novel, and contrived. Every modern nation thinks its traditions ancient and organic, and every modern nation lies to itself.

To drive the point home, let’s give the Italians a break and go on a quick trip around the world. What we call “Traditional Chinese Medicine” is really a conglomeration of cultural-medicinal practices collated during the Maoist era as a way to provide low-cost healthcare to the masses. Mao didn’t particularly believe in or use TCM, but he needed to cheaply and quickly install a legible medical system that would reinforce an emerging national identity. The Scottish highland myth, similarly, had roots in real historical cultural practices, but the Tartanry we imagine today is largely the product of 19th century nation-building – and concerted tourism campaigns aimed at the English.

Let’s keep going. Much of the pomp and ceremony associated with the British parliament was created in the 19th century to convey an image of order and tradition. Zwarte Piet, a racist character in Dutch Christmas folklore, is often defended on the grounds of tradition, but appears to have been popularised by a book published in 1850. Pad Thai, commonly thought of as a traditional Thai dish, was actually invented in the 1930s as part of a nationalistic campaign to promote Thai cuisine. The qipao had roots in Qing-era fashion but became mainstream in the early 20th century, thanks to liberalising attitudes of dress caused by increasing contact with the West. If you eat sushi, you probably think that salmon has a long history in Japan. It does not. Salmon was introduced to Japan in the early 1900s but didn’t take off until the late 1980s, when a Norwegian salmon lobby delegation arrived in Tokyo to hawk their goods. A decade-long Norwegian advertising campaign did the rest. At the same time, Japanese advertising in the US was deliberately and artificially establishing the now decades-old tradition of lesbians driving Subarus. And don’t get me started on the ramen / lamian crossover. Meanwhile, if you are American and you think eating lobster is fancy, you don’t have much to go on, historically. European settlers found them so plentiful that they were mostly given to prisoners, and used as fertilizer. It took a popularisation campaign in the mid-19th century to increase their standing. So too with oysters, which were once so available they were considered a poor-person’s food. It took their over-fishing, and resulting scarcity, to drive demand up.

I could go on, but I won’t. I’m lazy and the point is made.

The function of tradition

If tradition were mere ceremony, it mightn’t matter that it’s so often fraudulent. But tradition performs a powerful role in modern societies. The historian Eric Hobsbawm, whose 1983 collection The Invention of Tradition (edited with Terence Ranger) did more than any other book to tackle this question, put it like so:

‘Invented tradition’ is taken to mean a set of practices….which seek to inculcate certain values and norms of behaviour by repetition, which automatically implies continuity with the past. In fact, where possible, they normally attempt to establish continuity with a suitable historic past.

So we find ourselves in a world where fictitious traditions are invoked to enforce particular belief systems. These belief systems invariably happen to reflect the interests, and meet the needs, of those who invented the given tradition. And the ability to invent and embed a given tradition is limited to groups with sufficient social power. Those who seek to conserve the status quo, or hark back to an imagined past, find tradition a particularly useful tool. Tradition is not neutral – it tips the playing field in favour of the right wing.

To make matters worse, traditions are susceptible to misuse and misappropriation well after they’ve served their original purpose. This is a familiar feature of reactionary politics – the reactionary appeals to a given tradition, to a suitable historic past, to legitimate their modern-day political project. Very often the reactionary will appeal to a past that transparently never existed – they’ll unashamedly invent new ones on the spot. But it’s the familiar ones that work best. The more a given tradition is accepted as true by the populace, the more reactionaries can exploit it for their particular political ends.

When, in 2019, the archbishop of Bologna suggested serving pork-free tortellini at the feast of San Petronio for the sake of Muslim Italians, the far-right panicked. “They’re trying to erase our history, our culture,” said Matteo Salvini, leader of the far-right League party. Salvini was playing on the misconception that tortellini are traditionally made with pork. But that’s just another myth – tortellini were pork-free until the late 19th century.

It’s not just the reactionary right we should worry about. When I was in China in 2017-2018, the CCP machine was in full flight insisting that they were the inheritors of a 5000 year old Chinese political tradition. The actual history of the region – multiple distinct dynasties, centuries of regional warlords, mass immigration and emigration, scores of diverse linguistic and ethnic groups, the civil war and the near triumph of the Guomindang, occupations and concessions held by Japan; Germany; Italy; Russia; Portugal; the United Kingdom; and the United States; the Cultural Revolution – did not impede the government from propagating this fiction. The myth simply is too useful for the CCP – it legitimates them.

How does it legitimate? By establishing continuity with the past, the CCP becomes the natural and just inheritor of power in the region. It goes from a historically contingent political body that happened to win power from the lottery of history, to a necessary institution whose monopoly on violence is justified by 5000 years of cultural heritage.

(An aside - I’ve been emphasising the national, but we all know that the same process occurs at the local, and even hyper-local, level. As an undergrad college student at Sydney Uni, in what American readers might best understand as a fraternity environment, I saw firsthand how traditions a mere handful of years old could be created and then deployed to buttress violent, brutal hierarchies of social power. Bad!)

Viva contingency

That’s probably enough for today. If I were writing a book about this, and I could, the next chapter would be about the inevitability of tradition. Tradition-forming practices are culture-forming practices. They appear to be inseparable from being human.

But in a world where traditions are routinely being invented, used, and misused, I think it’s incumbent on us to exercise a little skepticism. The risk, beyond equipping political enemies with tools I’d rather they didn’t have, is that these invented traditions come to imprison our minds. We find ourselves committed to falsities – pasts that never existed – and we do terrible things to defend them or bring them back. What we need instead is a skeptical ear for historical stories and a sense of historical contingency. History is the product of uncountable events that could well have gone differently. The future is not fated and the past is not a prison. We create the world we live in, and we can just as well create it differently.

 

Becoming a Person

 

Me, leaving undergrad

The other day, while sweeping the floor in the living room, my mother asked me if I regretted not going into tech or finance. “I was talking to some other mothers and they were all saying how much money their children were making,” she said, or words to that effect. I could hardly blame her for the question. I’m 31 and on my third master’s degree. I do not have anything resembling a traditional career. My current program, an MFA in nonfiction, is so embarrassingly expensive that I do anything I can to avoid the question when it comes up – even if it is the first I’ve paid for. There’s not even a job waiting at the end of it: the program promises a negative return on its investment. Each year MFAs around the US pump out hundreds of students into a market neither willing nor able to pay for their skills (such as they are).

High-paying jobs in tech or finance or consulting or whatever are accessible enough to people who can spell and go to the right schools, so I can’t pretend I didn’t have the choice. The truth is I deliberately missed each turnoff as I drove along life’s highway. There were stories I told myself at the time that kept me plunging onwards. I’m a bit too embarrassed to repeat them all here. Enough to admit, I hope, that I simply thought those jobs were beneath me: too unethical, too boring, too lame. I don’t feel so strongly anymore. Some of that work turns out to be pretty compelling – some even makes the world a better place (some, I said!). Not that it matters. At this age I suspect the roads to a classical corporate career are all behind me.

I have regretted it, of course, many times along the way. The last five or more years have been spent in the wilderness, my road plunging deeper into dark forest and rocky terrain. This has not felt heroic or challenging. I have not, to my own mind, been having adventures, or growing, or learning. Mostly I’ve just felt lost and confused and stupid. And: unproductive and misdirected and under-utilised. Like a hammer trying to screw in a nail.

The MFA was a kind of Hail Mary thrown to get me out of this situation – or at least to get me a reprieve from it. If that sounds silly (perhaps it reminds you of the old line about repeating the same thing and expecting a different result), I can only plead guilty. Either way I have found myself returning, yet again, to education. And, specifically, to the humanities.

For better or worse I’m back here again, and I’m not surprised to find that I’m enjoying it. My life once again primarily consists of reading, writing, and talking about ideas. What’s not to love? The reprieve has come. I’m spending all my time doing the things that give life – my life, at least – a sense of significance.

Back in undergrad I used to make this point in the most obnoxious way possible: I’d go around saying things like “Imagine you could cure cancer. Now imagine you could instead write a poem. Surely you should write the poem, right? One is a means to an end, the other is the end in itself.” Later, in a philosophy degree, I thought that we should all be directing ourselves to “getting to the bottom of things.” After a few years digging I didn’t quite get there (or, as Wittgenstein would put it, I hit bottom too quickly – my spade was turned), and so my next big turn was towards the practical, to public policy and politics. For that I used Marx: “philosophers have hitherto only interpreted the world – the point is to change it.” After a few delays on that project, I’ve pushed my timeline back a bit.

These days I manage to more or less hold all these mantras in my mind at the same time. They don’t so much supplant each other as add to a deepening sense of what life is for. And lately, right on cue, just as I find myself on another educational adventure, just as I ask myself why I should write when the world will never value my writing, I find myself gripped by a new dogma. Now I find myself saying: the point of all of this is to become a person.

Astute readers will recognise the phrase from Carl Rogers, the humanistic psychologist, and his 1961 book On Becoming a Person. Though it’s likely that half of what I’m about to say has been indirectly and osmotically stolen from what’s in that book, I wouldn’t know. I’ve never read it. I just saw the title one day, a few weeks ago, just as my brain was whirring already with many of these thoughts, and felt a whole lot of loose matter begin to congeal between my ears.

So what do I mean? Well, to start, I think a person is a very different thing to a human. We are all already humans, after all. In my mind this is a rather inflexible category. Nothing, except perhaps death, can get in the way of your being a human. You simply are born human and you go out human, like a light going on and off.

In my thinking this category, like all categories, is arbitrary and political rather than natural. This is to say that it is we who carve up the world, not God, or Nature, and frankly I think much of the strife of philosophy has arisen from the old Platonic expectation that the world can be neatly divided. Why should we expect that?

But categories have their uses. One of the uses of the category, “human”, is that it builds a capacious and unyielding wall around a group of beings whose dignity should be protected. And it’s my claim, though I won’t defend it here, that we ought to protect the dignity of as many beings as possible. Humans are a pretty good place to start.

The trick here is that, by virtue of being humans, we are all equally deserving of fundamental and inviolable human rights: rights that must come not from God or from Nature or from Reason but from you and me, exactly because we see them as good tools for protecting dignity. (And, if you marry this humanism with egalitarianism, as I reckon you should, you get my preferred variety of socialism. Every human deserves the same share of the economy. Why? Because they are human. Why? Because they are human. Why? Because fuck you, that’s why.) Because being a human is not the sort of thing that can be altered or improved or lessened, the basic dignity we afford to humans should be immutable, too. Being human isn’t so special – it just gives you, or should give you, a VIP ticket to the realm of egalitarian political respect.

So that’s being a human. Being a person, at least in the way I’m using it today, is not like this. Where “human” is discrete and binary, “person” is continuous and fluid. You don’t start out a person, but, if you are lucky, you become one. We should be careful, though – I don’t really see “person” as an end state to be achieved, like evolving a Pokémon or getting your black belt in karate. I see it as asymptotic. A regulative ideal, as Kant would say. We, if we do life right, just become more and more, um, persony.

A person is the embodiment of a suite of virtues. I have in mind virtues like self-knowledge; forbearance; humility; what the Buddhists have called lovingkindness; what I want to call unreactivity (though I suppose equanimity would do); detachment; irony (my own personal favourite); and crucially and necessarily, a tendency against solipsism, an ability to see the other as wholly and inviolably human. We have often called the sum of these virtues wisdom.

Implicit in the phrase “becoming a person” is a truth I want to make explicit: becoming a person is how we lift ourselves from the apes. A person is a being with a richer, more profound, less animalistic subjectivity. A person is someone no longer enslaved to the reactivity of the animal spirits (to misappropriate Keynes). In this way a person is a human who is free.

What you’ll see immediately, I think, is that I want to say some people have done a better job at this than others. Some of us don’t get quite so far from our primate pals. No - I don’t want to say this. I’d rather not. But it does seem to be the case. Some people are less thoughtful, less self-aware; more zombie-like, more narcissistic. It’s a fact that really insists upon itself: when I consider people in my life; when I turn on the TV; when I look in the mirror.

It doesn’t help, of course, that the world we have created seems to work very hard to keep us from becoming persons. A quick look at the billionaires list assures us that the incentives for becoming a person are not financial. The economy – where our social values manifest materially – is rather agnostic about, even hostile to, personhood. That just makes a tall order even taller. Becoming a person is hard – perhaps the hardest thing we are tasked with doing.

Despite all that, some people do seem to pull it off. All of us make it some of the way, anyway. And what follows from this is that there will come a day for each of us when we are more a person than we have ever been and ever will be again. This day, or moment, will be our high-water mark of personhood. When this will be for you I do not know. But I’m old enough now to have seen many humans climb the mountains of wisdom, summiting their personal peaks in their 60s or 70s or even 80s, before the slow decline of the second childhood begins. Virtues decay with the mind. My own grandfather, who this week turns 100, exhibits his own peculiar mix. Increasingly a child, he occasionally reveals himself to bear the benefits of a century’s experience.

Yes, some people do seem to pull it off. How? Well, I think it’s no coincidence that the moment I returned to reading and writing was the moment I started thinking about all this. For one, the gulf between literature’s apparent uselessness and our compulsion towards literature demands an explanation. Is literature really an end in itself, or might it have some instrumentality after all?

To prioritise the arts in your life is to confront that question. This is true for both producer and consumer. When you spend your time fixated on books you begin to fixate on the question: what is the point of reading? When you spend your time trying to become a writer, you can no longer ignore the question: what is the point of writing? And it’s only after you’ve faced down the fact that few people will ever care for your work – this isn’t self pity, it’s maths – that you can begin to acknowledge that you’re doing this for no one other than yourself. Yet you find yourself doing a third master’s degree. Why?

When I said at the start of this essay “the point of all of this is to become a person”, I was deliberately vague. I think we’re now in a position to be more specific. To the extent that life can have a point, I think the point of life might be to become a person. And I think that the point of reading and writing is that they help us get there.

This is really what I want to say: that the point of the humanities, of the arts, of reading and writing, is that these apparently non-instrumental cultural practices exist, were invented, to help us a little farther along the way to becoming persons. Banging away on my own pieces this year, I finally began to see that my difficulties had little to do with a lack of intellect and a lot to do with a lack of wisdom. A teacher helped me see that, but so, too, did the writing itself.

One way of saying this is to trot out the old cliché, battle-worn and undefeated, that the relationship between the writer and the pen is, can, or perhaps should be similar to the relationship between the patient and the therapist. While putting sentences on paper the writer is held to account by the pen. It gets harder to evade, dodge, deflect, and defend – as we are all so practised in doing – when one’s own self-justifications are made fixed and transparent in typography. And reading, as we have all been told, really does tend to make us wiser. In reading – especially fiction – we confront the interiority of the other; we gain perspective over our own small and insignificant role in history; we become less committed to the self-justifications that shield us from the world.

Given literature’s power, it’s tempting to write off those who don’t read. That’s a mistake. When we do, we reveal our own incomplete progress towards personhood. An engagement with the arts is not the only way to become a person. The media we writers like to champion – the novel, the essay, the poem – are a long way from necessary. They’re historically contingent and rather recent social technologies: cultural practices that could just as well never have existed. For millennia, and still today, humans have done and do the work of becoming persons without the aid of those tools. They find other paths to virtue: alternative cultural practices; deep communal ties; self-reflection; psychoactive chemicals; and of course the most important and universal of all educators, experience.

Literature is no panacea, either. We’ve all known a fascist (fascism being, for me, a kind of flight from personhood) who in fact reads plenty (Nietzsche, usually; or Dostoyevsky). Very often these types seem to read exactly to avoid becoming persons. They want to justify their anti-egalitarianism; buttress their narcissism.

Still, still. Still, it is my experience, and I think it’s yours too, that those we find stalled on the road to personhood aren’t known for their love of literature. When it was revealed Sam Bankman-Fried, the fraudulent founder of FTX, liked to insist “I would never read a book”, I couldn’t help but feel I knew that about him already. The fear of literature, I think, is really a fear of becoming a person, of being asked to confront oneself. That is the real sin of the philistine. Pick your own incurious villain – Donald Trump, say, or Scott Morrison – to see what I mean. To amuse myself, I sometimes like to imagine Elon Musk reading Middlemarch. That image would suggest an entirely different man, a man open to the unprofitable virtues, a man curious about the lives of others, a man curious about himself. It would suggest a man interested in becoming a person.

 

New Hampshire Berning

I came to New Hampshire because, after a year and a half living in the United States, I wanted to see for myself the bizarre, arcane, dissonant cacophony of political advertising and vested interests and sincere public democratic engagement that will dictate the future for the rest of us—that is, for the rest of the world.


Australia has reached the tipping point. It's time to lead.

Australia is the first rich country to be visited by climate catastrophe. We now know that as the climate worsens, it will be Australia, among the rich nations, that suffers first. We no longer need to look overseas for the canaries in the coal mine – Australia’s native birds will do fine, and they have stopped singing.


Cringing Toward a New Australianism

The author Richard Flanagan is in Beijing talking to a room full of Australians. Roughly one hundred of them, mostly white, have crowded in to The Bookworm, a bookshop familiar to anyone in Beijing’s small English-language literary scene. Near the front is an enthusiastic Chinese fan, but the rest are members of Beijing’s Australian expat community. They know where Tasmania is, and they know, keenly, that Australia is at the edge of everything else.


Caring for our own

Seventy-five percent of mental ill health emerges before the age of 25. Alistair Kitchen asks if the University is doing enough.
