In a long piece at The Atlantic, Robert Wright ponders recent arguments about the biological basis of morality. “If Greene thinks that getting people to couch their moral arguments in a highly reasonable language will make them highly reasonable, I think he’s underestimating the cleverness and ruthlessness with which our inner animals pursue natural selection’s agenda. We seem designed to twist moral discourse — whatever language it’s framed in — to selfish or tribal ends, and to remain conveniently unaware of the twisting.”
Some good news for a change, by way of Dangerous Meta: Scientists have developed (for mice, at least) what appears to be a breakthrough drug that could prevent Alzheimer’s and other neurodegenerative diseases like Parkinson’s(!):
“When a virus hijacks a brain cell it leads to a build-up of viral proteins. Cells respond by shutting down nearly all protein production in order to halt the virus’s spread. However, many neurodegenerative diseases involve the production of faulty or ‘misfolded’ proteins. These activate the same defences, but with more severe consequences…The researchers used a compound which prevented those defence mechanisms kicking in and in turn halted neurodegeneration.”
And now, IT HAS A FACE. Scientists program an old-timey robot to dramatize the electrical signals emanating from slime mold. In 100,000 years, this is going to seem like one of those Skynet-level bad ideas. And the hat is a particularly creepy touch — Very Something Wicked This Way Comes.
More Science of Sleep: In Scientific American, two Italian academics put forward their “synaptic homeostasis hypothesis” (SHY) of slumber, whereby the brain weakens (not strengthens, as is usually assumed) synaptic links overnight. “In principle, SHY explains the essential, universal purpose of sleep…sleep restores the brain to a state where it can learn and adapt when we are awake…Most generally, sleep is the price we pay for the brain’s plasticity — its ability to modify its wiring in response to experience.”
Also part of SHY is the idea of “local sleep”: “Recently we have even found that prolonged or intense use of certain circuits can make local groups of neurons ‘fall asleep’ even though the rest of the brain (and the organism itself) remains awake…It seems that when we have been awake for too long or have overexerted certain circuits, small chunks of the brain may take quick naps without giving notice.” I believe in Internet parlance this is known as “haz-ing the dumb.”
Better living through chemistry: The NYT’s Gretchen Reynolds touts the potential medical benefits of caffeine addiction. “Participants with little or no caffeine circulating in their bloodstreams were far more likely to have progressed to full-blown Alzheimer’s than those whose blood indicated they’d had about three cups’ worth of caffeine.” Factor in all the taurine I consume to boot, and I’m disco.
To kick off his new Slate column “Anything Once,” friend Seth Stevenson finds himself reveling in the sensation of sensory deprivation. “I emerged in a profound daze. I spoke slowly and quietly, like a smooth-jazz DJ, to the person at the spa desk who inquired how my session had gone. I felt more rested than if I’d slept for 16 hours on a pile of tranquilized chinchillas. Outside, colors were saturated; sounds were vivid. I had to try this again, as soon as possible.”
When you’re lying awake at night, it’s alright: BBC’s Stephanie Hegarty delves into pre-industrial sleep habits and discovers that eight hours of uninterrupted sleep may be a recent invention. “Much like the experience of Wehr’s subjects, these references describe a first sleep which began about two hours after dusk, followed by a waking period of one or two hours and then a second sleep. ‘It’s not just the number of references – it is the way they refer to it, as if it was common knowledge,’ Ekirch says.”
President Obama makes the case for federal investment in the Brain Activity Map Project. (You heard it here first, tinfoil hat people. The tyranny of the Kenyan socialist will not stop at your precious bodily fluids — he’s going to read your brainwaves too!) Seriously, though, investing in basic scientific research like this is, er, a no-brainer. It creates jobs while advancing the frontiers of human knowledge in all kinds of unanticipated ways. We’d be stupid not to support this — which means, of course, the jury’s still out on whether we will.
Update: “BAM is an acronym you’ll probably be hearing a lot in the weeks and months to come — so let’s talk about what the BAM project is, what it isn’t, and why it’s raising both interest and eyebrows throughout the scientific community.” Io9 has more.
A novel path of brain research suggests our immune system may play a key role in determining intelligence. “The same T cells that protect the brain from inflammation also work to keep us sharp; and in what appears to be a feedback loop, the mere act of learning reinforces the effect.”
Also in decoding-the-brain news, Japanese scientists visually capture the creation of a zebrafish’s thought. “[W]e shouldn’t play this down: this is a fundamental leap forward in our understanding of how brains work.”
“‘We don’t know whether the media multitasking is causing symptoms of depression and social anxiety, or if it’s that people who are depressed and anxious are turning to media multitasking as a form of distraction from their problems,’ Becker said in a statement.”
And here I thought Netflix and Warcraft went so well together: A new Michigan State study finds a correlation between depression and media multitasking. I wonder if the converse is true also. One of the many reasons I love seeing a movie in the theater is that (ideally) nothing else but the film is impinging on my attention.
“‘The work shows that processes like placebo and nocebo happen without us being aware of the cues that trigger them,’ said Jensen. ‘We get these responses due to associative learning. We don’t need somebody standing there saying ‘ok, now you will feel less pain’. It’s being elicited naturally, and without us being aware, all the time.’”
A new study finds that subliminal cues help create the placebo effect (and its opposite, the “nocebo effect”)…although, reading the overview of the experiment here, the conclusion sounds more like: People will subliminally recoil from bad things that happen to them.
Developmental psychiatrist Nancy Maurer discusses her findings that playing first-person shooters helps people born with cataracts improve their vision. “I’m a reader. My husband and I don’t have children. So computer games wouldn’t be a part of our lives. I’ve never played one. I can’t imagine enjoying playing one.”
By way of Dangerous Meta, researchers figure out a way to manufacture embryonic stem cells without an embryo, thus clearing the path for future research in that direction unhampered by abortion politics. “The discovery could be the key to cure the incurable – from heart attacks to severed spinal cord to cancer – and open the door, some day, to eternal youth.”
To me, children of the atom: A scientific study suggests that progeny of older men are more prone to mutations like autism, schizophrenia, telekinesis, and whatnot. “A man aged 29.7 at the time he fathered a child contributed 63 new mutations on average to his offspring, the authors found, and a man aged 46.2 contributed 126 mutations — a doubling, the authors calculated.” My biological clock is ticking like this…
The NYT tells the tale of Chaser, a border collie who now has a vocabulary of over 1,000 words. “Dr. Pilley said that most border collies, with special training, ‘could be pretty close to where Chaser is.’…Dr. Horowitz agreed: ‘It is not necessarily Chaser or Rico who is exceptional; it is the attention that is lavished on them,’ she said.” (Sorry, Berk…At least I taught you bacon and tacos — you know, the important stuff.)
Two new studies find a correlation between intelligence and a thirst for alcohol. Hey, I buy it – Thank you, science, for lending support to my vices! And, as Bogey said, “The problem with the world is that everyone is a few drinks behind.”
“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.
“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.
Sigh. We’ve come a long way from “Dirt Off Your Shoulder.” In a commencement speech at Hampton University over the weekend, President Obama channels his inner grumpy-old-man (Roger Ebert?) to warn new grads about the perils of gaming and gadgetry. First off, it’s a ludicrous statement on its face: iPods are not particularly hard to work — and, if they’re really insidious Weapons of Mental Distraction, why give one to the Queen (who, by the way and to her credit, has fully embraced the Wii)?
Speaking more broadly, misinformation has been around as long as the republic — go read up on the Jefferson v. Adams race of 1800. If anything, today’s information society allows people to more easily hear the news from multiple sources, which is a very good thing. In fact, the reason our political culture these days is constantly bombarded with irrelevant, distracting, and inane mistruths has nothing — none, zip, zero — to do with iPods, iPads, Xboxes, or Playstations. It has to do with ABC, CBS, WaPo, Politico, and the rest of the Village, i.e. the very same people the President was noshing with a few weeks ago at the ne plus ultra of “information becoming distracting entertainment,” the White House Correspondents’ Dinner.
Finally, while the “multi-tasking is distracting” proposition does seem to hold water, scientifically speaking, the jury’s still out on the pernicious effects of Xboxes and the like. In fact, there are plenty of studies suggesting that video games improve vision, improve reflexes, improve attention, improve cognition, improve memory, and improve “fluid intelligence,” a.k.a. problem-solving. So, let’s not get out the torches and pitchforks just yet. It could just be that the 21st-century interactive culture is making better, smarter, more informed citizens. (And, hey, let’s not forget Admongo.)
To get to the point: while it’s not as irritating as the concerned-centrist pearl-clutching over GTA in years past, it’s still a troubling dynamic to see not only a young, ostensibly Kennedyesque president but the most powerful man in the world tsk-tsking about all this newfangled technology ruining the lives of the young people. Let’s try to stay ahead of the curve, please. And let’s keep the focus on the many problems — lack of jobs, crushing student loan and/or credit card debts, etc. — that might be distracting new graduates right now more than their iPods and PS3s. (Also, try to pick up a copy of Stephen Duncombe’s Dream — video game-style interactivity isn’t the enemy. It’s the future.)
In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”
“‘No man is an island,’ said Nicholas A. Christakis, a professor of medicine and medical sociology at Harvard Medical School who helped conduct the research. ‘Something so personal as a person’s emotions can have a collective existence and affect the vast fabric of humanity.’”
Forget H1N1: Psychologists uncover statistical indications that loneliness transmits like a social disease. “Loneliness is not just the property of an individual. It can be transmitted across people — even people you don’t have direct contact with.” Hmmm. Well, that explains grad school, then.
“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”
In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what the apparently agreed-upon rules of canid play teach us about animal morality. (via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”
“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality …For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”
Or, as Matt Johnson put it 25 years ago, I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun. And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention”. (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”
Which just goes to show, the real key to harnessing distraction is…wait, hold on a tic, gotta get back to you. There’s a new funny hamster vid on YouTube.
Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”
Hmm. I can’t say I’ve found Twitter to be particularly useful yet — to be honest, it all seems rather gimmicky to me; I worry about its Idiocracy-like implications (why 140 characters? Why not 10?); and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)
“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”
“We appear to be bringing the worst affected parts of the brain functionally back to life.” Is Alzheimer’s disease about to go the way of polio? A new drug known as rember, according to scientists in England, seems to halt and even roll back the symptoms of Alzheimer’s. “We have demonstrated for the first time that it may be possible to arrest progression of the disease by targeting the tangles that are highly correlated with the disease. This is the most significant development in the treatment of the tangles since Alois Alzheimer discovered them in 1907.”
“In the race for the White House, lefties seem to have the upper hand. No matter who wins in November, six of the 12 chief executives since the end of World War II will have been left-handed: Harry Truman, Gerald Ford, Ronald Reagan, the elder Bush, Clinton and either Obama or McCain. That’s a disproportionate number, considering that only one in 10 people in the general population is left-handed.” In the WP, authors Sam Wang and Sandra Aamodt explain why all your Oval Offices are belong to us, the lefties. We also swelled the ranks of both my undergraduate and graduate cohorts, whatever that’s worth.
“Thoughtcrime is death. Thoughtcrime does not entail death. Thoughtcrime IS death. I have committed even before setting pen to paper the essential crime that contains all others unto itself.” The shape of things to come? Scientists at Berkeley conceive a way to use MRI imaging to “map” images in the brain. “Our results suggest that it may soon be possible to reconstruct a picture of a person’s visual experience from measurements of brain activity alone. Imagine a general brain-reading device that could reconstruct a picture of a person’s visual experience at any moment in time…It is possible that decoding brain activity could have serious ethical and privacy implications downstream in, say, the 30 to 50-year time frame.”