To kick off his new Slate column “Anything Once,” friend Seth Stevenson finds himself reveling in the sensation of sensory deprivation. “I emerged in a profound daze. I spoke slowly and quietly, like a smooth-jazz DJ, to the person at the spa desk who inquired how my session had gone. I felt more rested than if I’d slept for 16 hours on a pile of tranquilized chinchillas. Outside, colors were saturated; sounds were vivid. I had to try this again, as soon as possible.”
When you’re lying awake at night, it’s alright: BBC’s Stephanie Hegarty delves into pre-industrial sleep habits and discovers that eight hours of uninterrupted sleep may be a recent invention. “Much like the experience of Wehr’s subjects, these references describe a first sleep which began about two hours after dusk, followed by a waking period of one or two hours and then a second sleep. ‘It’s not just the number of references – it is the way they refer to it, as if it was common knowledge,’ Ekirch says.”
President Obama makes the case for federal investment in the Brain Activity Map Project. (You heard it here first, tinfoil hat people. The tyranny of the Kenyan socialist will not stop at your precious bodily fluids — He’s going to read your brainwaves too!) Seriously, though, investing in basic scientific research like this is, er, a no-brainer. It creates jobs while advancing the frontiers of human knowledge in all kinds of unanticipated ways. We’d be stupid not to support this — which means, of course, the jury’s still out on whether we will.
Update: “BAM is an acronym you’ll probably be hearing a lot in the weeks and months to come — so let’s talk about what the BAM project is, what it isn’t, and why it’s raising both interest and eyebrows throughout the scientific community.” Io9 has more.
A novel path of brain research suggests our immune system may play a key role in determining intelligence. “The same T cells that protect the brain from inflammation also work to keep us sharp; and in what appears to be a feedback loop, the mere act of learning reinforces the effect.”
Also in decoding-the-brain news, Japanese scientists visually capture the creation of a zebrafish’s thought. “[W]e shouldn’t play this down: this is a fundamental leap forward in our understanding of how brains work.”
“‘We don’t know whether the media multitasking is causing symptoms of depression and social anxiety, or if it’s that people who are depressed and anxious are turning to media multitasking as a form of distraction from their problems,’ Becker said in a statement.”
And here I thought Netflix and Warcraft went so well together: A new Michigan State study finds a correlation between depression and media multitasking. I wonder if the converse is true also. One of the many reasons I love seeing a movie in the theater is that (ideally) nothing else but the film is impinging on my attention.
“‘The work shows that processes like placebo and nocebo happen without us being aware of the cues that trigger them,’ said Jensen. ‘We get these responses due to associative learning. We don’t need somebody standing there saying ‘ok, now you will feel less pain’. It’s being elicited naturally, and without us being aware, all the time.’”
A new study finds that subliminal cues help create the placebo effect (and its opposite, the “nocebo effect”)…although, reading the overview of the experiment here, the conclusion sounds more like: People will subliminally recoil from bad things that happen to them.
Developmental psychologist Daphne Maurer discusses her findings that playing first-person shooters helps people born with cataracts improve their vision. “I’m a reader. My husband and I don’t have children. So computer games wouldn’t be a part of our lives. I’ve never played one. I can’t imagine enjoying playing one.”
By way of Dangerous Meta, researchers figure out a way to manufacture embryonic stem cells without an embryo, thus clearing the path for future research in that direction unhampered by abortion politics. “The discovery could be the key to cure the incurable – from heart attacks to severed spinal cord to cancer—and open the door, some day, to eternal youth.”
To me, children of the atom: A scientific study suggests that progeny of older men are more prone to mutations like autism, schizophrenia, telekinesis, and whatnot. “A man aged 29.7 at the time he fathered a child contributed 63 new mutations on average to his offspring, the authors found, and a man aged 46.2 contributed 126 mutations — a doubling, the authors calculated.” My biological clock is ticking like this…
The NYT tells the tale of Chaser, a border collie who now boasts a vocabulary of over 1,000 words. “Dr. Pilley said that most border collies, with special training, ‘could be pretty close to where Chaser is.’…Dr. Horowitz agreed: ‘It is not necessarily Chaser or Rico who is exceptional; it is the attention that is lavished on them,’ she said.” (Sorry, Berk…At least I taught you bacon and tacos — you know, the important stuff.)
Two new studies find a correlation between intelligence and a thirst for alcohol. Hey, I buy it — thank you, science, for lending support to my vices! And, as Bogey said, “The problem with the world is that everyone is a few drinks behind.”
“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.
“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.
Sigh. We’ve come a long way from “Dirt Off Your Shoulder.” In a commencement speech at Hampton University over the weekend, President Obama channels his inner grumpy-old-man (Roger Ebert?) to warn new grads about the perils of gaming and gadgetry. First off, it’s a ludicrous statement on its face: iPods are not particularly hard to work — and, if they’re really insidious Weapons of Mental Distraction, why give one to the Queen (who, by the way and to her credit, has fully embraced the Wii)?
Speaking more broadly, misinformation has been around as long as the republic — go read up on the Jefferson v. Adams race of 1800. If anything, today’s information society allows people to more easily hear the news from multiple sources, which is a very good thing. In fact, the reason our political culture these days is constantly bombarded with irrelevant, distracting, and inane mistruths has nothing — none, zip, zero — to do with iPods, iPads, Xboxes, or Playstations. It has to do with ABC, CBS, WaPo, Politico, and the rest of the Village, i.e. the very same people the President was noshing with a few weeks ago at the ne plus ultra of “information becoming distracting entertainment,” the White House Correspondents’ Dinner.
Finally, while the “multi-tasking is distracting” proposition does seem to hold water, scientifically speaking, the jury’s still out on the pernicious effects of Xboxes and the like. In fact, there are plenty of studies suggesting that video games improve vision, improve reflexes, improve attention, improve cognition, improve memory, and improve “fluid intelligence,” a.k.a. problem-solving. So, let’s not get out the torches and pitchforks just yet. It could just be that the 21st-century interactive culture is making better, smarter, more informed citizens. (And, hey, let’s not forget Admongo.)
To get to the point, while it’s not as irritating as the concerned-centrist pearl-clutching over GTA in past years, it’s still a troubling dynamic to see a young, ostensibly Kennedyesque president — and the most powerful man in the world — tsk-tsking about all this new-fangled technology ruining the lives of the young people. Let’s try to stay ahead of the curve, please. And let’s keep focus on the many problems — lack of jobs, crushing student loan and/or credit card debts, etc. — that might be distracting new graduates right now more than their iPods and PS3s. (Also, try to pick up a copy of Stephen Duncombe’s Dream — video game-style interactivity isn’t the enemy. It’s the future.)
In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”
“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”
In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what the apparently agreed-upon rules of canid play can teach us about animal morality. (Via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”
“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality …For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”
“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”
Or, as Matt Johnson put it 25 years ago, I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun. And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention”. (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”
Which just goes to show, the real key to harnessing distraction is…wait, hold on a tick, gotta get back to you. There’s a new funny hamster vid on YouTube.
Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”
Hmm. I can’t say I’ve found Twitter to be particularly useful yet — to be honest, it all seems rather gimmicky to me; I worry about its Idiocracy-like implications (why 140 characters? why not 10?); and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)
“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”
“We appear to be bringing the worst affected parts of the brain functionally back to life.” Is Alzheimer’s disease about to go the way of polio? A new drug known as rember, according to scientists in England, seems to halt and even roll back the symptoms of Alzheimer’s. “We have demonstrated for the first time that it may be possible to arrest progression of the disease by targeting the tangles that are highly correlated with the disease. This is the most significant development in the treatment of the tangles since Alois Alzheimer discovered them in 1907.”
Ok, this one’s a bit creepy. By way of Webgoddess, watch the rotating dancer to ascertain whether you’re left-brained or right-brained. I’m pretty right-brained, it seems (which makes sense, since I’m both left-handed and left-footed). But, if I changed tasks while the dancer was on — say, went to click another window or focused on the list at left — she’d sometimes switch direction. Weird…well, I just hope my right-brain knows what my left-brain is doing.
Is political conflict bred in the bone (or, put less charitably, do some among us just have an easier time with higher-order thinking)? A new joint NYU-UCLA neurobiological study finds once again that left- and right-leaning brains function differently, with liberal minds more receptive to change than their conservative counterparts. “Dozens of previous studies have established a strong link between political persuasion and certain personality traits. Conservatives tend to crave order and structure in their lives, and are more consistent in the way they make decisions. Liberals, by contrast, show a higher tolerance for ambiguity and complexity, and adapt more easily to unexpected circumstances…[In this case] respondents who had described themselves as liberals showed ‘significantly greater conflict-related neural activity’ when the hypothetical situation called for an unscheduled break in routine. Conservatives, however, were less flexible, refusing to deviate from old habits ‘despite signals that this…should be changed.’”
Eternal sunshine of the spotless mind! Each pray’r accepted, and each wish resign’d. Life imitates art as researchers home in on drugs that will potentially erase traumatic memories. “‘This is all very preliminary,’ said Dr. Roger Pitman, a Harvard Medical School psychiatrist. ‘We’re just getting started. There is some promising preliminary data but no conclusions.’”
Love is a stranger in an open car…or is it just a much-needed dopamine fix? Somebody writes this story every Valentine’s Day. Still, I guess it’s something to keep in mind. (And sorry, Berk, you may be my Valentine again this year, but the same type of deconstruction applies to you. No hard feelings, bud.)
“‘People are born to dance,’ Ebstein told Discovery News. ‘They have (other) genes that partially contribute to musical talent, such as coordination, sense of rhythm. However, the genes we studied are more related to the emotional side of dancing — the need and ability to communicate with other people and a spiritual side to their natures that not only enable them to feel the music, but to communicate that feeling to others via dance.’” Looks like the Red Shoes are just a placebo — According to recent research at Hebrew University’s Scheinfeld Center for Genetic Studies, some people are just hardwired to dance. Now if only they could figure out why some people start conga lines or insist on breaking into the Electric Slide. (Via Dangerous Meta.)
It’s an ugly day for voter rationality in today’s New York Times. According to a new study by several political scientists, our political predispositions may be genetic (and last summer’s Zellout may have been the result of a lingering discordance between genetic and environmental factors in Miller’s make-up.) What’s more, we seem to choose our elected leaders immediately by their physical attributes, namely a general look of competence: “Both babies and baby-faced adults share certain characteristics: round faces, large eyes, small noses, high foreheads, and small chins. No one trusts the competence of a baby, and few, apparently, trust that of an adult who looks like one.” (Don’t lose heart, fellow advocates of an informed and capable electorate — There’s obviously a huge gaping hole in this latter theory.)
Some love is just a lie of the heart. But most of the time, it just means your oxytocin levels are out of whack…
“Sun is shinin’ in the sky, there ain’t a cloud in sight…” Life imitates art as scientists attempt to achieve “therapeutic forgetting”, a.k.a. the focused erasure of memories. Right now, though, they haven’t gotten much further than dulling the edge off old remembrances. “Our experiences and our memories in a lot of ways define us and define who we are,” notes Stanford ethicist David Magnus about the field, “[a]nd so that’s a scary step to go down. We should be very careful about going down a path that could lead to a serious alteration of the core essence of our identities.” Can you hear me? I don’t want this anymore, I want to call it off!
A new German study finds sleep is essential for creativity. Hmm, well that explains a lot over in these parts.
In celebration of a quarter-century of Science Times, the paper ruminates on the 25 questions currently driving science, while Alan Lightman ponders the motivations that fuel scientists. I’m not sure if the likes of Stephen Hawking are really contemplating Atlantis, but there’s some intriguing stuff here.
More good news on the gaming tip — apparently, gamers aren’t just more spatially aware; they’re social multitaskers, too. All this validation for gamers is going to end up getting me in trouble…the last thing I need right now is another Civ binge.
Meant to blog this last week but forgot: FPS games increase brainpower. “Experienced players of these games are 30 percent to 50 percent better than nonplayers at taking in everything that happens around them…They identify objects in their peripheral vision, perceiving numerous objects without having to count them, switch attention rapidly and track many items at once.” Glad to hear my endless logged hours of Day of Defeat have not gone to waste. And considering I rented Enter the Matrix over the weekend and spent an unhealthy amount of time beating it, I must be operating on a Zen plane right now.
“I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over and through me. And when it is gone I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.”