Guinness for Brains.


Drinking alcohol was ‘unintentional, accidental, and haphazard until about 10,000 years ago,’ says Satoshi Kanazawa at Psychology Today. Smart people are generally early adopters and, in the context of human history, ‘the substance [alcohol] and the method of consumption are both evolutionarily novel.’

Two new studies find a correlation between intelligence and a thirst for alcohol. Hey, I buy it. Thank you, science, for lending support to my vices! And, as Bogey said, “The problem with the world is that everyone is a few drinks behind.”

I was so much older then…

“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.

“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.

Change You Can Be Afraid Of.


“‘You’re coming of age in a 24/7 media environment that bombards us with all kinds of content and exposes us to all kinds of arguments, some of which don’t always rank all that high on the truth meter,’ Obama said at Hampton University, Virginia. ‘With iPods and iPads and Xboxes and PlayStations — none of which I know how to work — information becomes a distraction, a diversion, a form of entertainment, rather than a tool of empowerment, rather than the means of emancipation.’”

Sigh. We’ve come a long way from “Dirt Off Your Shoulder.” In a commencement speech at Hampton University over the weekend, President Obama channels his inner grumpy-old-man (Roger Ebert?) to warn new grads about the perils of gaming and gadgetry. First off, it’s a ludicrous statement on its face: iPods are not particularly hard to work — and, if they’re really insidious Weapons of Mental Distraction, why give one to the Queen (who, by the way and to her credit, has fully embraced the Wii)?

Speaking more broadly, misinformation has been around as long as the republic — go read up on the Jefferson v. Adams race of 1800. If anything, today’s information society allows people to more easily hear the news from multiple sources, which is a very good thing. In fact, the reason our political culture these days is constantly bombarded with irrelevant, distracting, and inane mistruths has nothing — none, zip, zero — to do with iPods, iPads, Xboxes, or PlayStations. It has to do with ABC, CBS, WaPo, Politico, and the rest of the Village, i.e. the very same people the President was noshing with a few weeks ago at the ne plus ultra of “information becoming distracting entertainment,” the White House Correspondents’ Dinner.

Finally, while the “multi-tasking is distracting” proposition does seem to hold water, scientifically speaking, the jury’s still out on the pernicious effects of Xboxes and the like. In fact, there are plenty of studies suggesting that video games improve vision, improve reflexes, improve attention, improve cognition, improve memory, and improve “fluid intelligence,” a.k.a. problem-solving. So, let’s not get out the torches and pitchforks just yet. It could just be that the 21st-century interactive culture is making better, smarter, more informed citizens. (And, hey, let’s not forget Admongo.)

To get to the point, while it’s not as irritating as the concerned-centrist pearl-clutching over GTA in past years, it’s still a troubling dynamic to see not only a young, ostensibly Kennedyesque president but the most powerful man in the world tsk-tsking about all this new-fangled technology ruining the lives of the young people. Let’s try to stay ahead of the curve, please. And let’s keep focus on the many problems — lack of jobs, crushing student loan and/or credit card debts, etc. — that might be distracting new graduates right now more than their iPods and PS3s. (Also, try to pick up a copy of Stephen Duncombe’s Dream — video game-style interactivity isn’t the enemy. It’s the future.)

I Think I Feel Better.


“It may be, then, that the simplest and least ethically hazardous way to capitalize on the placebo effect is to acknowledge that medicine isn’t just a set of approved treatments — it’s also a ritual, with symbolism and meaning that are key to its efficacy. At its best, that ritual spurs positive expectations, sparks associations with past healing experiences, and eases distress in ways that can alleviate suffering. These meanings, researchers say, are what the placebo effect is really about.”

In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”

Don’t Stand So Close to Me.

“‘No man is an island,’ said Nicholas A. Christakis, a professor of medicine and medical sociology at Harvard Medical School who helped conduct the research. ‘Something so personal as a person’s emotions can have a collective existence and affect the vast fabric of humanity.’”

Forget H1N1: Psychologists uncover statistical indications that loneliness spreads like a social disease. “Loneliness is not just the property of an individual. It can be transmitted across people — even people you don’t have direct contact with.” Hmmm. Well, that explains grad school, then.

A Theory of Justice (and the Dog Park).

“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”

In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what the apparently agreed-upon rules of canid play can teach us about animal morality. (via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”

The Politics of Yecccch.

“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality…For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”

We Control The Verti…ooh, new Tweet!

“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”

Or, as Matt Johnson put it 25 years ago, “I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun.” And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention.” (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”

Which just goes to show, the real key to harnessing distraction is…wait, hold on a tic, gotta get back to you. There’s a new funny hamster vid on YouTube.

So Tweet and So Cold.

@JohnnyCash: Hello from Reno. Shot man…just to watch him die, actually. Weird, I know.
@ACamus: Beach lovely this time of year. Also, killed Arab. Oops.

Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”

Hmm. I can’t say I’ve found Twitter to be particularly useful yet — to be honest, it all seems rather gimmicky to me; I worry about its Idiocracy-like implications (why 140 characters? Why not 10?); and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)

A Hole in the Heart.

“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”