To complement Calvinism: The NYT lays out a seven-minute workout that might actually work. “The exercises should be performed in rapid succession, allowing 30 seconds for each, while, throughout, the intensity hovers at about an 8 on a discomfort scale of 1 to 10…Those seven minutes should be, in a word, unpleasant.”
To kick off his new Slate column “Anything Once,” friend Seth Stevenson finds himself reveling in the sensation of sensory deprivation. “I emerged in a profound daze. I spoke slowly and quietly, like a smooth-jazz DJ, to the person at the spa desk who inquired how my session had gone. I felt more rested than if I’d slept for 16 hours on a pile of tranquilized chinchillas. Outside, colors were saturated; sounds were vivid. I had to try this again, as soon as possible.”
An online Harvard experiment tries to guess your age by evaluating your mouse-clicking ability. Hard to say how good it is, really. It deemed me thirty, eight years too young — but then again, with blogging and gaming both ranking high among the extracurriculars, I probably use a mouse more than most people.
A troubling GIF captures twenty-five years of expanding waistlines in America. “Meanwhile, through 2012, no state has met the CDC’s nationwide goal to reduce obesity to 15 percent.” Update: The states behind the curve? Everyone could be lying.
When you’re lying awake at night, it’s alright: BBC’s Stephanie Hegarty delves into pre-industrial sleep habits and discovers that eight hours of uninterrupted sleep may be a recent invention. “Much like the experience of Wehr’s subjects, these references describe a first sleep which began about two hours after dusk, followed by a waking period of one or two hours and then a second sleep. ‘It’s not just the number of references – it is the way they refer to it, as if it was common knowledge,’ Ekirch says.”
Happiness, where are you? I’ve searched so long for you. A statistical analysis of states’ relative happiness, as determined by tweets. (Red states above are happy, blue states are not.) David Simon is 2-for-2: Next to the mouth of the Mississippi, the Maryland-Delaware area is apparently the saddest in the nation. Perhaps due to proximity to Washington DC? Definitely maybe.
In probably related news, a different map of the United States shows the most popular places cited in Craigslist’s Missed Connections. “The most popular place to spot potential love in Texas, New Mexico, Missouri, Louisiana, Arkansas, Mississippi, Alabama, Idaho, Montana, South Dakota, Ohio, West Virginia, Tennessee, North Carolina and Florida? Wal-Mart.”
Of course, this raises the question: Do people actually ever meet up on Missed Connections? Every time I’ve perused them, that section is overwhelmingly the Boulevard of Broken Dreams, just damaged, lovelorn people sending out messages in a bottle to lost exes who are actively ignoring them.
President Obama makes the case for federal investment in the Brain Activity Map Project. (You heard it here first, tinfoil hat people. The tyranny of the Kenyan socialist will not stop at your precious bodily fluids — He’s going to read your brainwaves too!) Seriously, though, investing in basic scientific research like this is, er, a no-brainer. It creates jobs while advancing the frontiers of human knowledge in all kinds of unanticipated ways. We’d be stupid not to support this — which means, of course, the jury’s still out on whether we will.
Update: “BAM is an acronym you’ll probably be hearing a lot in the weeks and months to come — so let’s talk about what the BAM project is, what it isn’t, and why it’s raising both interest and eyebrows throughout the scientific community.” Io9 has more.
A novel path of brain research suggests our immune system may play a key role in determining intelligence. “The same T cells that protect the brain from inflammation also work to keep us sharp; and in what appears to be a feedback loop, the mere act of learning reinforces the effect.”
Also in decoding-the-brain news, Japanese scientists visually capture the creation of a zebrafish’s thought. “[W]e shouldn’t play this down: this is a fundamental leap forward in our understanding of how brains work.”
“‘We don’t know whether the media multitasking is causing symptoms of depression and social anxiety, or if it’s that people who are depressed and anxious are turning to media multitasking as a form of distraction from their problems,’ Becker said in a statement.”
And here I thought Netflix and Warcraft went so well together: A new Michigan State study finds a correlation between depression and media multitasking. I wonder if the converse is true also. One of the many reasons I love seeing a movie in the theater is that (ideally) nothing else but the film is impinging on my attention.
“‘The work shows that processes like placebo and nocebo happen without us being aware of the cues that trigger them,’ said Jensen. ‘We get these responses due to associative learning. We don’t need somebody standing there saying ‘ok, now you will feel less pain’. It’s being elicited naturally, and without us being aware, all the time.’”
A new study finds that subliminal cues help create the placebo effect (and its opposite, the “nocebo effect”)…although, reading the overview of the experiment here, the conclusion sounds more like: People will subliminally recoil from bad things that happen to them.
Developmental psychiatrist Nancy Maurer discusses her findings that playing first-person-shooters helps people born with cataracts to improve their vision. “I’m a reader. My husband and I don’t have children. So computer games wouldn’t be a part of our lives. I’ve never played one. I can’t imagine enjoying playing one.”
By way of Dangerous Meta, researchers figure out a way to manufacture embryonic stem cells without an embryo, thus clearing the path for future research in that direction unhampered by abortion politics. “The discovery could be the key to cure the incurable — from heart attacks to severed spinal cord to cancer — and open the door, some day, to eternal youth.”
To me, children of the atom: A scientific study suggests that progeny of older men are more prone to mutations linked to autism, schizophrenia, telekinesis, and whatnot. “A man aged 29.7 at the time he fathered a child contributed 63 new mutations on average to his offspring, the authors found, and a man aged 46.2 contributed 126 mutations — a doubling, the authors calculated.” My biological clock is ticking like this…
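For the arithmetically inclined: taken at face value, those two data points imply that new mutations roughly double with every 16.5 additional years of paternal age. A quick sketch of the implied curve (the exponential model and function names here are my own illustration, not the study authors’):

```python
# Toy exponential model fit to the two data points quoted above:
# 63 mutations at paternal age 29.7, 126 at age 46.2. The gap between
# those ages (46.2 - 29.7 = 16.5 years) is the implied doubling time.

def expected_mutations(age, base_age=29.7, base_count=63, doubling_years=16.5):
    """Expected new mutations passed on by a father of the given age."""
    return base_count * 2 ** ((age - base_age) / doubling_years)

print(round(expected_mutations(29.7)))  # 63, by construction
print(round(expected_mutations(46.2)))  # 126, matching the study's second figure
print(round(expected_mutations(36.0)))  # somewhere in between
```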
In their very first month back in power, Paul Ryan, Akin and the gang — 225 Members, in fact — were trying to define rape down — “House Speaker John Boehner (R-Ohio) has dubbed [it] a top priority in the new Congress.” — and that bill passed the House(!)
Yes, today’s Democrats have their own serious problems — our leaders prostrate themselves before the phantom deficit gods, look the other way on Wall Street malfeasance, and have been actively terrible on the civil liberties front, and our policy playbook (individual mandate, cap-and-trade) has too often been cribbed from the Republicans of the ’90s. But it’s a difference in kind, not in degree. Akin is not an aberration in the GOP — He’s the new normal. Not that anyone who comes around here still does this sort of thing, but if you vote Republican, have no illusions about what you are doing: Ayn Rand and Akinism is basically what you’re voting for. Seriously, these guys are cray-cray.
(By the way, the great facehugger pic above is from Talking to Doctors — worth checking out.)
Less adventurous and more satisfying in its storytelling than Soderbergh’s last major film, 2009’s The Informant!, Contagion basically applies the Traffic technique of several separate, loosely interweaving tales told around the globe (albeit this time with a more subdued color palette) to spin a harrowing chronicle of a possible pandemic. The main reason the film works so well is because Contagion is actually not the end-of-times pestilence thriller the (spoilerish) trailers make it out to be. Rather, and much like David Fincher’s Zodiac, it’s mainly a smart, well-told procedural, and it’s the grounded, matter-of-factness of Contagion that ultimately makes it so frightening.
Contagion telegraphs its unsentimental, take-no-prisoners approach to the story in the first five minutes, when, after returning home to Minneapolis from a business trip to Macau (and a brief layover in Chicago), Gwyneth Paltrow starts having trouble breathing and [minor spoiler] promptly drops dead. Soon, her family (including a low-key, earnest Matt Damon) is in quarantine, and the CDC Director in Atlanta (Laurence Fishburne) has dispatched an epidemiologist (Kate Winslet) to coordinate with local officials on plans for a possible outbreak. (FWIW, the Minnesota Department of Health is not amused with the film.) But, unfortunately for the world, the barn door is already open: This new MEV-1 virus — part-bat, part-pig — has already been unleashed, and not just in Minneapolis, but in cities all over. (Turns out, Ocean’s 14 in Macau was a terrible idea.)
As the situation worsens around the world, we start following more individuals on the frontlines in various locales: A CDC researcher (Jennifer Ehle) working to find a cure for this new plague. An academic biologist (Elliott Gould) trying to isolate the virus in San Francisco. A WHO official (Marion Cotillard) and Chinese doctor (Chin Han) looking to discover who was Patient Zero in Macau. Two homeland security suits (Bryan Cranston and Enrico Colantoni) sent to determine if this is the work of the terr’ists. A blogger (Jude Law) firmly convinced of government conspiracies and homeopathic wonders. And all the while, even as secrets pass from person to person and fear mutates into panic, the virus continues to spread. Ain’t no Patrick Dempsey monkey gonna solve this one, I’m afraid.
There’re plenty of stars and recognizable faces flitting about this story — some have more to do than others. (The Cotillard subplot seemed a bit unnecessary to me, to be honest, and the Jude Law one is basically just an extended screw-you to the vaccines-cause-autism crowd.) But, as I said, Contagion’s killer app is its verisimilitude. The movie never talks down to its audience, or has its scientists repeating expository information over and over again. (For example, it explains once what an “R-naught” is and assumes you can keep up from there.) It doesn’t have scientists (or Matt Damon, for that matter) running around trying to catch infected monkeys with helicopters — The excitement mostly comes from watching scientists and bureaucrats do their job well. And I liked the fact that, even though no one is safe here, Contagion doesn’t feature some kind of sci-fi-ish, humanity-obliterating virus. It’s a nasty bug with, iirc, a 25% fatality rate — In other words, a more virulent version of the 1918 influenza epidemic — making the story that much more plausible, and scary.
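(Aside, since the film leans on it: an R-naught is the average number of people each infected person goes on to infect, and its power is geometric. A back-of-the-envelope sketch, with values I picked for illustration rather than anything from the movie:)

```python
# R0 ("R-naught") is the average number of new infections each case causes.
# In a fully susceptible population with no interventions, case counts
# multiply by R0 each generation of spread -- which is why seemingly small
# differences in R0 balloon so quickly. (Illustrative numbers only.)

def cases_in_generation(n, r0, initial_cases=1):
    """New cases in the nth generation of spread, naive geometric model."""
    return initial_cases * r0 ** n

for r0 in (1.5, 2, 3):
    print(f"R0={r0}: ~{cases_in_generation(10, r0):,.0f} cases by generation 10")
```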
Speaking of scary, I should say that, back in the real world, I’m a pretty sanguine sort about germs, and so I found Contagion more unsettling than anything else. But if you’re at all of the germophobe persuasion, hoo boy — You’re going to have a tough time at this one. From infecteds hacking up a lung on a public bus, to waiters wiping down bar glasses with a dirty rag, to people endlessly and unconsciously touching rails, banisters, buttons, and each other, Soderbergh does a great job here of intimating that human beings inadvertently leave a slime trail of germy death wherever we go — not least in movie theaters, exactly like the one you’re sitting in. Point being, [cough, cough], OCD-ish folks will probably want to Netflix this one, instead.
On the subject of HIV, it looks like the enemy of our enemy is our friend: In a stunning feat of gene therapy, scientists have used a disabled version of HIV to successfully defeat leukemia. “Mr. Ludwig’s doctors have not claimed that he is cured — it is too soon to tell — nor have they declared victory over leukemia on the basis of this experiment, which involved only three patients…But scientists say [this] may signify a turning point in the long struggle to develop effective gene therapies against cancer…In essence, the team is using gene therapy to accomplish something that researchers have hoped to do for decades: train a person’s own immune system to kill cancer cells.”
“With this legislation, which was introduced last week by Rep. Chris Smith (R-N.J.), Republicans propose that the rape exemption be limited to ‘forcible rape.’ This would rule out federal assistance for abortions in many rape cases, including instances of statutory rape, many of which are non-forcible. For example: If a 13-year-old girl is impregnated by a 24-year-old adult, she would no longer qualify to have Medicaid pay for an abortion.”
On the principle that, as per MLK, “in the end, we will remember not the words of our enemies, but the silence of our friends,” I post more often here these days about issues I have with our own, ostensibly-lefty party. But, as Dangerous Meta reminds me: Just in case anyone forgot how crazy the Republicans are these days, the GOP Congress has, for pro-life purposes, actually fashioned a bill that defines rape down. “House Speaker John Boehner (R-Ohio) has dubbed [it] a top priority in the new Congress.” There are no words.
On the eve of the State of the Union — Win the Future! — the wags at Pleated Jeans compile a handy map of what each state desperately needs to work on. (By way of Blackpepper and Webgoddess.) “Whether it’s a fat population, high rate of STDs or excessive tax rate, it turns out that every state ranks dead last in at least one unsavory category. Check out the map (click image to enlarge) to see what your state is the worst at, then review additional stats and references after the jump.”
The NYT tells the tale of Chaser, a border collie with a vocabulary of over 1000 words now. “Dr. Pilley said that most border collies, with special training, ‘could be pretty close to where Chaser is.’…Dr. Horowitz agreed: ‘It is not necessarily Chaser or Rico who is exceptional; it is the attention that is lavished on them,’ she said.” (Sorry, Berk…At least I taught you bacon and tacos — you know, the important stuff.)
Two new studies find a correlation between intelligence and a thirst for alcohol. Hey, I buy it — Thank you, science, for lending support to my vices! And, as Bogey said, “The problem with the world is that everyone is a few drinks behind.”
“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.
“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.
“‘We believe this vaccine will someday be used to prevent breast cancer in adult women in the same way that vaccines prevent polio and measles in children,’ Vincent Tuohy, Ph.D., the study’s principal investigator and an immunologist at the Lerner Institute, told WOIO. ‘If it works in humans the way it works in mice, this will be monumental. We could eliminate breast cancer,’ he added.”
Some good news for the day: Scientists at the Cleveland Clinic believe they may have zeroed in on a vaccine for breast cancer. “The key, Tuohy said, is to find a target within the tumor that isn’t typically found in a healthy person. In the case of breast cancer, he and his team targeted a-lactalbumin, a protein found in the majority of breast cancers, but not in healthy women, except during lactation. Therefore, the vaccine can rev up a woman’s immune system to target a-lactalbumin, stopping tumor formation without damaging healthy breast tissue.”
Sigh. We’ve come a long way from “Dirt Off Your Shoulder.” In a commencement speech at Hampton University over the weekend, President Obama channels his inner grumpy-old-man (Roger Ebert?) to warn new grads about the perils of gaming and gadgetry. First off, it’s a ludicrous statement on its face: iPods are not particularly hard to work — and, if they’re really insidious Weapons of Mental Distraction, why give one to the Queen (who, by the way and to her credit, has fully embraced the Wii)?
Speaking more broadly, misinformation has been around as long as the republic — go read up on the Jefferson v. Adams race of 1800. If anything, today’s information society allows people to more easily hear the news from multiple sources, which is a very good thing. In fact, the reason our political culture these days is constantly bombarded with irrelevant, distracting, and inane mistruths has nothing — none, zip, zero — to do with iPods, iPads, Xboxes, or Playstations. It has to do with ABC, CBS, WaPo, Politico, and the rest of the Village, i.e. the very same people the President was noshing with a few weeks ago at the ne plus ultra of “information becoming distracting entertainment“, the White House Correspondents’ Dinner.
Finally, while the “multi-tasking is distracting” proposition does seem to hold water, scientifically speaking, the jury’s still out on the pernicious effects of Xboxes and the like. In fact, there are plenty of studies suggesting that video games improve vision, improve reflexes, improve attention, improve cognition, improve memory, and improve “fluid intelligence,” a.k.a. problem-solving. So, let’s not get out the torches and pitchforks just yet. It could just be that the 21st-century interactive culture is making better, smarter, more informed citizens. (And, hey, let’s not forget Admongo.)
To get to the point: while it’s not as irritating as the concerned-centrist pearl-clutching over GTA in years past, it’s still a troubling dynamic to see a young, ostensibly Kennedyesque president — the most powerful man in the world, no less — tsk-tsking about all this new-fangled technology ruining the lives of the young people. Let’s try to stay ahead of the curve, please. And let’s keep focus on the many problems — lack of jobs, crushing student loan and/or credit card debts, etc. — that might be distracting new graduates right now more than their iPods and PS3s. (Also, try to pick up a copy of Stephen Duncombe’s Dream — Video game-style interactivity isn’t the enemy. It’s the future.)
In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”
“To create melanin particles tiny enough to squeeze through the liver, lungs, and spleen, Dr. Dadachova and her team layered several coats of synthesized melanin on silica particles. The particles, once injected into mice, clung onto bone marrow, as the researchers intended.”
It’s in the air, for you and me…By way of the always illuminating Dangerous Meta, scientists find a possible way to make people radiation resistant via melanin nanoparticles. “Clinical trials testing the melanized particles on cancer patients may begin in two or three years. Dr. Dadachova also surmises that the technique has potential for protecting astronauts against radiation exposure.”
Also in the Brave New World dept. and by way of a friend, The Economist takes a gander at new “bioprinter” technology. “As for bigger body parts, Dr Forgacs thinks they may take many different forms, at least initially. A man-made biological substitute for a kidney, for instance, need not look like a real one or contain all its features in order to clean waste products from the bloodstream.”
“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”
In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what apparently agreed-upon rules of canid play teach us about animal morality. (via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”
“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality …For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”
“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”
Or, as Matt Johnson put it 25 years ago, I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun. And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention”. (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”
Which just goes to show, the real key to harnessing distraction is…wait, hold on a tic, gotta get back to you. There’s a new funny hamster vid on Youtube.
Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”
Hmm. I can’t say I’ve found Twitter to be particularly useful yet. To be honest, it all seems rather gimmicky to me (why 140 characters? Why not 10?), I worry about its Idiocracy-like implications, and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)
“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”
“We appear to be bringing the worst affected parts of the brain functionally back to life.” Is Alzheimer’s disease about to go the way of polio? A new drug known as rember, according to scientists in England, seems to halt and even roll back the symptoms of Alzheimer’s. “We have demonstrated for the first time that it may be possible to arrest progression of the disease by targeting the tangles that are highly correlated with the disease. This is the most significant development in the treatment of the tangles since Alois Alzheimer discovered them in 1907.”
Ok, this one’s a bit creepy. By way of Webgoddess, watch the rotating dancer to ascertain whether you’re left-brained or right-brained. I’m pretty right-brained, it seems (which makes sense, since I’m both left-handed and left-footed). But if I changed tasks while the dancer was on — say, clicked into another window or focused on the list at left — she’d sometimes switch direction. Weird…well, I just hope my right-brain knows what my left-brain is doing.
Is political conflict bred in the bone (or, put less charitably, do some among us just have an easier time with higher-order thinking)? A new joint NYU-UCLA neurobiological study finds once again that left- and right-leaning brains function differently, with liberal minds more receptive to change than their conservative counterparts. “Dozens of previous studies have established a strong link between political persuasion and certain personality traits. Conservatives tend to crave order and structure in their lives, and are more consistent in the way they make decisions. Liberals, by contrast, show a higher tolerance for ambiguity and complexity, and adapt more easily to unexpected circumstances…[In this case] respondents who had described themselves as liberals showed ‘significantly greater conflict-related neural activity’ when the hypothetical situation called for an unscheduled break in routine. Conservatives, however, were less flexible, refusing to deviate from old habits ‘despite signals that this…should be changed.’”
Eternal sunshine of the spotless mind! Each pray’r accepted, and each wish resign’d. Life imitates art as researchers home in on drugs that will potentially erase traumatic memories. “‘This is all very preliminary,’ said Dr. Roger Pitman, a Harvard Medical School psychiatrist. ‘We’re just getting started. There is some promising preliminary data but no conclusions.’”
Love is a stranger in an open car…or is it just a much-needed dopamine fix? Somebody writes this story every Valentine’s Day. Still, I guess it’s something to keep in mind. (And sorry, Berk, you may be my Valentine again this year, but the same type of deconstruction applies to you. No hard feelings, bud.)
How does it feel when your heart grows cold? Statisticians have deemed today “Blue Monday,” the most depressing day of the year. Um, if you say so. Clearly, these geniuses have never heard of Valentine’s Day.