A few years ago, in a supermarket, I swiped my bank card to pay for groceries. I watched the little screen, waiting for its prompts. During the intervals between swiping my card, confirming the amount and entering my PIN, I was shown advertisements. Clearly some genius had realized that a person in this situation is a captive audience.
Attention is a resource; a person has only so much of it. And yet we’ve auctioned off more and more of our public space to private commercial interests, with their constant demands on us to look at the products on display or simply absorb some bit of corporate messaging. Lately, our self-appointed disrupters have opened up a new frontier of capitalism, complete with its own frontier ethic: to boldly dig up and monetize every bit of private head space by appropriating our collective attention. In the process, we’ve sacrificed silence — the condition of not being addressed. And just as clean air makes it possible to breathe, silence makes it possible to think.
What if we saw attention in the same way that we see air or water, as a valuable resource that we hold in common? Perhaps, if we could envision an “attentional commons,” then we could figure out how to protect it.
I had just arrived home from my summer vacation — a week in a Minnesota cabin whose brochure warned “no crabbiness allowed” — when I came upon a study that declared New York the “unhappiest city in America.” I doubt many people were surprised by the results — New Yorkers, both in lore and reality, can be hard to please, and famously outspoken about their grievances — but as a born-and-raised New Yorker, and as a philosopher, I was suspicious of how the study defined happiness.
The survey in question, conducted by the Centers for Disease Control and Prevention, asked how “satisfied” Americans were with their lives — very satisfied, satisfied, dissatisfied or very dissatisfied. But the National Bureau of Economic Research used the data to draw conclusions about their “happiness.” Some might not have minded that satisfaction and happiness were treated as interchangeable, but I did. The study was titled “Unhappy Cities,” and the headlines that followed it came out swinging against New Yorkers.
I was certain that a person (even a New Yorker) could be both dissatisfied and happy at once, and that the act of complaining was not in fact evidence of unhappiness, but something that could in its own way lead to greater happiness.
At times like this I appreciate philosophers’ respect for words, and a number of them have argued that happiness should be kept separate from satisfaction. In his 1861 essay “Utilitarianism,” John Stuart Mill carefully distinguished between the two, saying that a person can be satisfied by giving the body what it craves, but that human happiness also involves engaging the intellect. This means that happiness and satisfaction will sometimes conflict, and that those of us who seek happiness, and even attain it, may still be dissatisfied. Mill considered this a good thing: “It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied.”
The 19th-century German philosopher Arthur Schopenhauer, one of history’s best-known pessimists, also believed there was more to life than satisfaction. Better to honestly describe a negative world, he believed, than to conceal it with beautiful lies. That sounds very New York.
There’s plenty to complain about when living in a big city: overcrowding, potholes, high prices, train delays, cyclists, bees. When I was growing up in Rockaway and schlepping to school in Brooklyn, it was perfectly normal to complain, and almost everyone I knew did. Our complaining was not an indicator of our level of happiness. In my experience outside the city, however, people routinely misinterpret my casual expressions of dissatisfaction as unhappiness. They consider complaining a sign of negativity, one that must give way to positivity if a person is to be happy. “If you don’t have something nice to say, don’t say anything at all” is an example of this ubiquitous, if banal, attitude.
When I relocated to Texas, I quickly learned that kvetching about rain was no longer socially acceptable. “We need it!” became my new small-talk response to rain to avoid being dubbed a Debbie Downer. In a world where cheerfulness is applauded and grumpiness frowned upon, those who express dissatisfaction are often politely bullied to “look on the bright side” of rotten things.
We were invited to participate in their self-delusions and to see through them, to marvel at the mask of masculine competence even as we watched it slip or turn ugly. Their deaths were and will be a culmination and a conclusion: Tony, Walter and Don are the last of the patriarchs.
This slow unwinding has been the work of generations. For the most part, it has been understood — rightly in my view, and this is not really an argument I want to have right now — as a narrative of progress. A society that was exclusive and repressive is now freer and more open. But there may be other less unequivocally happy consequences. It seems that, in doing away with patriarchal authority, we have also, perhaps unwittingly, killed off all the grown-ups.
LISBON — The Western news media are in crisis and are turning their back on the world. We hardly ever notice. Where correspondents were once assigned to a place for years or months, reporters now handle 20 countries each. Bureaus are in hub cities, far from many of the countries they cover. And journalists are often lodged in expensive bungalows or five-star hotels. As the news has receded, so have our minds.
To the consumer, the news can seem authoritative. But the 24-hour news cycles we watch rarely give us the stories essential to understanding the major events of our time. The news machine, confused about its mandate, has faltered. Big stories are often missed. Huge swaths of the world are forgotten or shrouded in myth. The news both creates these myths and dispels them, in a pretense of providing us with truth.
Does handwriting matter?
Not very much, according to many educators. The Common Core standards, which have been adopted in most states, call for teaching legible writing, but only in kindergarten and first grade. After that, the emphasis quickly shifts to proficiency on the keyboard.
But psychologists and neuroscientists say it is far too soon to declare handwriting a relic of the past.
As an uncle I’m inconsistent about too many things.
Birthdays, for example. My nephew Mark had one on Sunday, and I didn’t remember — and send a text — until 10 p.m., by which point he was asleep.
School productions, too. I saw my niece Bella in “Seussical: The Musical” but missed “The Wiz.” She played Toto, a feat of trans-species transmogrification that not even Meryl, with all of her accents, has pulled off.
But about books, I’m steady. Relentless. I’m incessantly asking my nephews and nieces what they’re reading and why they’re not reading more. I’m reliably hurling novels at them, and also at friends’ kids. I may well be responsible for 10 percent of all sales of “The Fault in Our Stars,” a teenage love story to be released as a movie next month. Never have I spent money with fewer regrets, because I believe in reading — not just in its power to transport but in its power to transform.
In the 2004 film “Mean Girls,” Regina George, iron-fisted ruler of the high school in-crowd, snaps at her sidekick, Gretchen Wieners, over some newfangled slang Gretchen is testing out. “Stop trying to make ‘fetch’ happen,” Regina snarls. “It’s not going to happen.”
Regina was wrong. “Fetch” did happen, and is still happening. Quote that line to any young woman from middle school on, and she’ll probably finish it for you automatically. Ten years after it bowed in theaters, “Mean Girls” remains a relevant pop-culture reference point and a go-to source of shorthand for female — and human — dynamics.