I was so much older then…

“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.

“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.

I Think I Feel Better.

“It may be, then, that the simplest and least ethically hazardous way to capitalize on the placebo effect is to acknowledge that medicine isn’t just a set of approved treatments — it’s also a ritual, with symbolism and meaning that are key to its efficacy. At its best, that ritual spurs positive expectations, sparks associations with past healing experiences, and eases distress in ways that can alleviate suffering. These meanings, researchers say, are what the placebo effect is really about.”

In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”

Don’t Stand So Close to Me.

“‘No man is an island,’ said Nicholas A. Christakis, a professor of medicine and medical sociology at Harvard Medical School who helped conduct the research. ‘Something so personal as a person’s emotions can have a collective existence and affect the vast fabric of humanity.’”

Forget H1N1: Psychologists uncover statistical indications that loneliness spreads like a social disease. “Loneliness is not just the property of an individual. It can be transmitted across people — even people you don’t have direct contact with.” Hmmm. Well, that explains grad school, then.

The Messaging War.

“The narrative is simple: Insurance company plans have failed to care for our people. They profit from denying care. Americans care about one another. An American plan is both the moral and practical alternative to provide care for our people.”

Cognitive scientist George Lakoff discusses how the administration should best promote health reform (and the American Plan, née “public option”), and offers a choice critique of “policy speak” — the old progressive standby of “enlightening public opinion” — that would make Walter Lippmann very happy: “To many liberals, Policy Speak sounds like the high road: a rational, public discussion in the best tradition of liberal democracy. Convince the populace rationally on the objective policy merits. Give the facts and figures. Assume self-interest as the motivator of rational choice. Convince people by the logic of the policymakers that the policy is in their interest. But to a cognitive scientist or neuroscientist, this sounds nuts. The view of human reason and language behind Policy Speak is just false.”

Lakoff aside, the good folks at Media Matters have compiled a useful list of “Myths and Falsehoods about Health Care Reform,” and how best to refute them. And, next time somebody starts ranting at you about how Big Guv’mint never does anything right, send ’em here with a smile.

Absinthe Muse or Demon Rum?

“Much ink has been spilled on the question of why so many writers are alcoholics. Of America’s seven Nobel laureates, five were lushes — to whom we can add an equally drunk-and-disorderly line of Brits: Dylan Thomas, Malcolm Lowry, Brendan Behan, Patrick Hamilton, Philip Larkin, Kingsley Amis, all doing the conga to (in most cases) an early grave…In fact none of these authors would write much that was any good beyond the age of 40, Faulkner’s prose seizing up with sclerosis, Hemingway sinking into unbudgeable mawkishness.”

By way of Dangerous Meta, The Economist’s Tom Shone considers the artistic merits of novelists sobering up. “The radiance of late Carver is so marked as to make you wonder how much the imperturbable gloom of late Faulkner, or the unyielding nihilism of late Beckett — like the cramped black canvases with which Rothko ended his career — were dictated by their creators’ vision, and how much they were simply symptoms of late-stage alcoholism. This suspicion is open to the counter-charge: this contentment and bliss is all very well, but readers may simply prefer the earlier, messed-up work.”

The Politics of Yecccch.

“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality…For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”

We Control The Verti…ooh, new Tweet!

“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”

Or, as Matt Johnson put it 25 years ago, “I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun.” And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention.” (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”

Which just goes to show, the real key to harnessing distraction is…wait, hold on a tick, gotta get back to you. There’s a new funny hamster vid on YouTube.

So Tweet and So Cold.

@JohnnyCash: Hello from Reno. Shot man…just to watch him die, actually. Weird, I know.
@ACamus: Beach lovely this time of year. Also, killed Arab. Oops.

Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”

Hmm. I can’t say I’ve found Twitter to be particularly useful yet — to be honest, it all seems rather gimmicky to me, and I worry about its Idiocracy-like implications. (Why 140 characters? Why not 10?) And, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)

A Hole in the Heart.

“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”

You’re biased! No, really, you are.

“If you are unprepared to encounter interpretations that you might find objectionable, please do not proceed further…I am aware of the possibility of encountering interpretations of my IAT performance with which I may not agree. Knowing this, I wish to proceed with either the Democratic Candidates task or the Republican Candidates task.” As the 2008 Democratic primary season degenerates into a Clintonian morass of identity politics and invective, now seems as good a time as any to test your own internal bias with an Implicit Association Test. (For more info, Slate’s Jay Dixit covered the test and its social implications a few years ago.)

As for me, I took it three times. At first, my reptile-brain displayed a bias for Hillary Clinton, with Barack Obama and John Edwards exactly tied below her, and Bill Richardson lagging considerably behind. (My apologies, Governor Richardson. I think it might be because you look older than the rest of the candidates. At least, I hope that’s the reason.) The second time I took it involved just the candidates’ names, and it was completely inconclusive — all four were tied exactly in the center of the chart. The third time — perhaps because I was growing more used to the interface — Barack Obama was up high, followed by Edwards, then Clinton, then Richardson.