
The Brain

This category contains 52 posts

Rdng is Fndmtl.

“[W]hat Spritz does differently (and brilliantly) is manipulate the format of the words to more appropriately line them up with the eye’s natural motion of reading. The ‘Optimal Recognition Point’ (ORP) is slightly left of the center of each word, and is the precise point at which our brain deciphers each jumble of letters. The unique aspect of Spritz is that it identifies the ORP of each word, makes that letter red and presents all of the ORPs at the same space on the screen. In this way, our eyes don’t move at all as we see the words, and we can therefore process information instantaneously rather than spend time decoding each word.”

Whoa…I’ve read about kung-fu. An intriguing new app aims to turn everyone into speed readers. “Spritz is about to go public with Samsung’s new line of wearable technology.”
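The alignment trick the quote describes can be sketched in a few lines of Python. This is a toy illustration, not Spritz’s actual algorithm: the `orp_index` heuristic (a letter slightly left of center, capped for long words) and the fixed 15-character display width are my assumptions.

```python
def orp_index(word: str) -> int:
    # Hypothetical heuristic: pick a letter slightly left of the
    # word's center, capped for very long words. Spritz's real
    # formula is proprietary; this just shows the idea.
    n = len(word)
    if n <= 1:
        return 0
    if n <= 5:
        return 1
    if n <= 9:
        return 2
    if n <= 13:
        return 3
    return 4

def align(word: str, width: int = 15) -> str:
    # Pad each word so its ORP letter always lands in the same
    # column -- the reader's eye never has to move.
    pivot = width // 2
    return " " * (pivot - orp_index(word)) + word

for w in ["Reading", "is", "fundamental"]:
    print(align(w))
```

Flash words through `align` one at a time at a fixed screen position and the highlighted letter stays put, which is the whole point of the technique.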

The Morality of the Tribe.

“[I]f you’re like the average American, here’s a fact you don’t know: in 1953, the United States sponsored a coup in Iran, overthrowing a democratically elected government and installing a brutally repressive regime that ruled for decades. Iranians, on the other hand, are very aware of this, which helps explain why, to this day, many of them are gravely suspicious of American intentions…This is the way the brain works: you forget your sins (or never recognize them in the first place) and remember your grievances.”

In a long piece at The Atlantic, Robert Wright ponders recent arguments about the biological basis of morality. “If Greene thinks that getting people to couch their moral arguments in a highly reasonable language will make them highly reasonable, I think he’s underestimating the cleverness and ruthlessness with which our inner animals pursue natural selection’s agenda. We seem designed to twist moral discourse — whatever language it’s framed in — to selfish or tribal ends, and to remain conveniently unaware of the twisting.”

Of Mice and Memory.

“Prof Roger Morris, from King’s College London, said: ‘This finding, I suspect, will be judged by history as a turning point in the search for medicines to control and prevent Alzheimer’s disease.’ He told the BBC a cure for Alzheimer’s was not imminent but: ‘I’m very excited, it’s the first proof in any living animal that you can delay neurodegeneration. The world won’t change tomorrow, but this is a landmark study.’”

Some good news for a change, by way of Dangerous Meta: Scientists have developed (for mice, at least) what appears to be a breakthrough drug that could prevent Alzheimer’s and other neurodegenerative diseases like Parkinson’s(!):

“When a virus hijacks a brain cell it leads to a build-up of viral proteins. Cells respond by shutting down nearly all protein production in order to halt the virus’s spread. However, many neurodegenerative diseases involve the production of faulty or ‘misfolded’ proteins. These activate the same defences, but with more severe consequences…The researchers used a compound which prevented those defence mechanisms kicking in and in turn halted neurodegeneration.”

We…learn. We…feed.

“Slime mould’s remarkable problem-solving capabilities are well-documented and include finding the shortest path between different food sources. It also displays memory, in a similar way to a novel electrical component called a memristor, which has in turn been likened to the functionality of biological brains. ‘It’s one of the simplest organisms that can learn,’ says Gale.”

And now, IT HAS A FACE. Scientists program an old-timey robot to dramatize the electrical signals emanating from slime mold. In 100,000 years, this is going to seem like one of those Skynet-level bad ideas. And the hat is a particularly creepy touch — Very Something Wicked This Way Comes.

Sleep: Nature’s Reset Button.

“More than 20 years ago…we began to suspect that the brain’s activity during slumber may somehow restore to a baseline state the billions of neural connections that get modified every day by the events of waking life. Sleep, in this telling, would preserve the ability of the brain’s circuitry to form new memories continually over the course of an individual’s lifetime without becoming oversaturated or obliterating older memories.”

More Science of Sleep: In Scientific American, two Italian academics put forward their “synaptic homeostasis hypothesis” (SHY) of slumber, whereby the brain weakens (not strengthens, as is usually assumed) synaptic links overnight. “In principle, SHY explains the essential, universal purpose of sleep…sleep restores the brain to a state where it can learn and adapt when we are awake…Most generally, sleep is the price we pay for the brain’s plasticity — its ability to modify its wiring in response to experience.”

Also part of SHY: the idea of “local sleep”: “Recently we have even found that prolonged or intense use of certain circuits can make local groups of neurons ‘fall asleep’ even though the rest of the brain (and the organism itself) remains awake…It seems that when we have been awake for too long or have overexerted certain circuits, small chunks of the brain may take quick naps without giving notice.” I believe in Internet parlance this is known as “haz-ing the dumb.”

Good News, Coffee Achievers.

“In one large-scale epidemiological study from last year, researchers primarily at the National Cancer Institute parsed health information from more than 400,000 volunteers, ages 50 to 71, who were free of major diseases at the study’s start in 1995…men who reported drinking two or three cups of coffee a day were 10 percent less likely to have died than those who didn’t drink coffee, while women drinking the same amount had 13 percent less risk of dying during the study.”

Better living through chemistry: The NYT’s Gretchen Reynolds touts the potential medical benefits of caffeine addiction. “Participants with little or no caffeine circulating in their bloodstreams were far more likely to have progressed to full-blown Alzheimer’s than those whose blood indicated they’d had about three cups’ worth of caffeine.” Factor in all the taurine I consume to boot, and I’m disco.

The Pleasures of the Void.

“I slid the blackout door closed behind me, eased down into the water, and touched a button that switched off the lights. I was floating in total darkness and silence…For what must have been the first 15 minutes, I wondered what I was doing there…Then a transformation began…My brain went a little haywire. When the storm passed, I found myself in a new and unfamiliar state of mind.”

To kick off his new Slate column “Anything Once,” friend Seth Stevenson finds himself reveling in the sensation of sensory deprivation. “I emerged in a profound daze. I spoke slowly and quietly, like a smooth-jazz DJ, to the person at the spa desk who inquired how my session had gone. I felt more rested than if I’d slept for 16 hours on a pile of tranquilized chinchillas. Outside, colors were saturated; sounds were vivid. I had to try this again, as soon as possible.”

Don’t Sleep on Second Sleep.

“In 2001, historian Roger Ekirch of Virginia Tech published a seminal paper, drawn from 16 years of research, revealing a wealth of historical evidence that humans used to sleep in two distinct chunks…Today, most people seem to have adapted quite well to the eight-hour sleep, but Ekirch believes many sleeping problems may have roots in the human body’s natural preference for segmented sleep as well as the ubiquity of artificial light.”

When you’re lying awake at night, it’s alright: BBC’s Stephanie Hegarty delves into pre-industrial sleep habits and discovers that eight hours of uninterrupted sleep may be a recent invention. “Much like the experience of Wehr’s subjects, these references describe a first sleep which began about two hours after dusk, followed by a waking period of one or two hours and then a second sleep. ‘It’s not just the number of references – it is the way they refer to it, as if it was common knowledge,’ Ekirch says.”


“‘Every dollar we invested to map the human genome returned $140 to our economy — every dollar,’ he said. ‘Today our scientists are mapping the human brain to unlock the answers to Alzheimer’s. They’re developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation.’”

President Obama makes the case for federal investment in the Brain Activity Map Project. (You heard it here first, tinfoil hat people. The tyranny of the Kenyan socialist will not stop at your precious bodily fluids — He’s going to read your brainwaves too!) Seriously, though, investing in basic scientific research like this is, er, a no-brainer. It creates jobs while advancing the frontiers of human knowledge in all kinds of unanticipated ways. We’d be stupid not to support this — which means, of course, the jury’s still out on whether we will.

Update: “BAM is an acronym you’ll probably be hearing a lot in the weeks and months to come — so let’s talk about what the BAM project is, what it isn’t, and why it’s raising both interest and eyebrows throughout the scientific community.” Io9 has more.

Those Infectious Ideas.

“When we learn something new, our neurons tear down old connections and build new ones. In the process they cast off lots of molecules. To the immune system, this waste may look like an infection or some other kind of trouble, resulting in inflammation and the release of harsh compounds that normally fight viruses but can also interfere with the brain and its function.”

A novel path of brain research suggests our immune system may play a key role in determining intelligence. “The same T cells that protect the brain from inflammation also work to keep us sharp; and in what appears to be a feedback loop, the mere act of learning reinforces the effect.”

The Lightbulb Goes Off.

“The researchers used a new technique to record the footage: a super-sensitive fluorescent probe that detects neuron activity. That lets us see neurons glowing when they’re active — and the cascade of light you see…is the neuronal response of a zebrafish responding to the presence of its prey. In other words, you’re seeing what the fish thinks when it sees its lunch.”

Also in decoding-the-brain news, Japanese scientists visually capture the creation of a zebrafish’s thought. “[W]e shouldn’t play this down: this is a fundamental leap forward in our understanding of how brains work.”

570 Channels (And Nothin’ On).

“‘We don’t know whether the media multitasking is causing symptoms of depression and social anxiety, or if it’s that people who are depressed and anxious are turning to media multitasking as a form of distraction from their problems,’ Becker said in a statement.”

And here I thought Netflix and Warcraft went so well together: A new Michigan State study finds a correlation between depression and media multitasking. I wonder if the converse is also true. One of the many reasons I love seeing a movie in the theater is that (ideally) nothing else but the film is impinging on my attention.

I feel worse already.

“The work shows that processes like placebo and nocebo happen without us being aware of the cues that trigger them,” said Jensen. “We get these responses due to associative learning. We don’t need somebody standing there saying ‘OK, now you will feel less pain.’ It’s being elicited naturally, and without us being aware, all the time.”

A new study finds that subliminal cues help create the placebo effect (and its opposite, the “nocebo effect”)…although, reading the overview of the experiment here, the conclusion sounds more like: People will subliminally recoil from bad things that happen to them.

Target Acquired.

“Then, Daphne Bavelier of Rochester University began publishing studies showing that computer games improved the vision of people with normal eyesight. I couldn’t help but wonder: If they helped the normally sighted, why not people with impairments? Also, I saw studies where enriched environments for rats improved aspects of vision damaged after early deprivation. Well, what’s an enriched visual environment for a human? It might be a computer game. I thought, ‘Click, why not give it a try?’”

Developmental psychiatrist Nancy Maurer discusses her findings that playing first-person-shooters helps people born with cataracts to improve their vision. “I’m a reader. My husband and I don’t have children. So computer games wouldn’t be a part of our lives. I’ve never played one. I can’t imagine enjoying playing one.”

Rolling the Clocks Back.

“[T]his new method changes everything. To start with, it uses normal adult blood cells from the patient, so there’s no need to keep umbilical cords in storage. It also doesn’t use any virus reprogramming, so it’s completely safe. It’s also very efficient: researchers successfully transformed about 50 to 60 percent of adult blood cells into embryonic stem cells that can then be turned into any type of cell — a heart muscle cell, a bone cell, a nerve cell, anything.”

By way of Dangerous Meta, researchers figure out a way to manufacture embryonic stem cells without an embryo, thus clearing the path for future research in that direction unhampered by abortion politics. “The discovery could be the key to cure the incurable – from heart attacks to severed spinal cord to cancer — and open the door, some day, to eternal youth.”

With Age comes Wisdom…er, Mutations.

“Although older mothers are at higher risk for complications such as diabetes during pregnancy and are more likely to have children with chromosomal disorders such as Down syndrome, the study found that practically all of the new mutations detected in children came from the father.”

To me, children of the atom: A scientific study suggests that progeny of older men are more prone to mutations like autism, schizophrenia, telekinesis, and whatnot. “A man aged 29.7 at the time he fathered a child contributed 63 new mutations on average to his offspring, the authors found, and a man aged 46.2 contributed 126 mutations — a doubling, the authors calculated.” My biological clock is ticking like this…
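The arithmetic behind that “doubling” is worth a quick check. Under a simple exponential model (my assumption for illustration; the study fit its own statistical model), the quoted figures imply new mutations double roughly every 16.5 years of paternal age:

```python
import math

# Figures quoted above: 63 new mutations at paternal age 29.7,
# 126 at age 46.2.
age1, m1 = 29.7, 63
age2, m2 = 46.2, 126

# Per-year growth rate and implied doubling time under a simple
# exponential model (an illustration, not the study's actual fit).
rate = math.log(m2 / m1) / (age2 - age1)
doubling_time = math.log(2) / rate

print(round(doubling_time, 1))  # 16.5
```

Since 126 is exactly twice 63, the implied doubling time is just the 16.5-year gap between the two ages the authors cite.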

We’ll see how smart you are when the K-9 comes…

“Chaser proved to be a diligent student. Unlike human children, she seems to love her drills and tests and is always asking for more. ‘She still demands four to five hours a day,’ Dr. Pilley said. ‘I’m 82, and I have to go to bed to get away from her.’”

The NYT tells the tale of Chaser, a border collie with a vocabulary of over 1,000 words. “Dr. Pilley said that most border collies, with special training, ‘could be pretty close to where Chaser is.’…Dr. Horowitz agreed: ‘It is not necessarily Chaser or Rico who is exceptional; it is the attention that is lavished on them,’ she said.” (Sorry, Berk…At least I taught you bacon and tacos — you know, the important stuff.)

Guinness for Brains.

“Drinking alcohol was ‘unintentional, accidental, and haphazard until about 10,000 years ago,’ says Satoshi Kanazawa at Psychology Today. Smart people are generally early adopters and, in the context of human history, ‘the substance [alcohol] and the method of consumption are both evolutionarily novel.’”

Two new studies find a correlation between intelligence and a thirst for alcohol. Hey, I buy it – Thank you, science, for lending support to my vices! And, as Bogey said, “The problem with the world is that everyone is a few drinks behind.”

I was so much older then…

“On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.” Via the NYT, a new study finds older people tend to be the happiest among us.

“‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological — for example brain chemistry or endocrine changes.’” My guess, from where I sit at 35 — perspective, a.k.a. wisdom. You don’t live to 85 by sweating the small stuff, and by then you probably have a pretty good sense of how things tend to shake out anyway.

Change You Can Be Afraid Of.

“‘You’re coming of age in a 24/7 media environment that bombards us with all kinds of content and exposes us to all kinds of arguments, some of which don’t always rank all that high on the truth meter,’ Obama said at Hampton University, Virginia. ‘With iPods and iPads and Xboxes and PlayStations — none of which I know how to work — information becomes a distraction, a diversion, a form of entertainment, rather than a tool of empowerment, rather than the means of emancipation.’”

Sigh. We’ve come a long way from “Dirt Off Your Shoulder.” In a commencement speech at Hampton University over the weekend, President Obama channels his inner grumpy-old-man (Roger Ebert?) to warn new grads about the perils of gaming and gadgetry. First off, it’s a ludicrous statement on its face: iPods are not particularly hard to work — and, if they’re really insidious Weapons of Mental Distraction, why give one to the Queen (who, by the way and to her credit, has fully embraced the Wii)?

Speaking more broadly, misinformation has been around as long as the republic — go read up on the Jefferson v. Adams race of 1800. If anything, today’s information society allows people to more easily hear the news from multiple sources, which is a very good thing. In fact, the reason our political culture these days is constantly bombarded with irrelevant, distracting, and inane mistruths has nothing — none, zip, zero — to do with iPods, iPads, Xboxes, or PlayStations. It has to do with ABC, CBS, WaPo, Politico, and the rest of the Village, i.e. the very same people the President was noshing with a few weeks ago at the ne plus ultra of “information becoming distracting entertainment,” the White House Correspondents’ Dinner.

Finally, while the “multi-tasking is distracting” proposition does seem to hold water, scientifically speaking, the jury’s still out on the pernicious effects of Xboxes and the like. In fact, there are plenty of studies suggesting that video games improve vision, improve reflexes, improve attention, improve cognition, improve memory, and improve “fluid intelligence,” a.k.a. problem-solving. So, let’s not get out the torches and pitchforks just yet. It could just be that the 21st-century interactive culture is making better, smarter, more informed citizens. (And, hey, let’s not forget Admongo.)

To get to the point, while it’s not as irritating as the concerned-centrist pearl-clutching over GTA in past years, it’s just a troubling dynamic to see not only a young, ostensibly Kennedyesque president but the most powerful man in the world tsk-tsking about all this new-fangled technology ruining the lives of the young people. Let’s try to stay ahead of the curve, please. And let’s keep focus on the many problems — lack of jobs, crushing student loan and/or credit card debts, etc. — that might be distracting new graduates right now more than their iPods and PS3s. (Also, try to pick up a copy of Stephen Duncombe’s Dream — Video game-style interactivity isn’t the enemy. It’s the future.)

I Think I Feel Better.

“It may be, then, that the simplest and least ethically hazardous way to capitalize on the placebo effect is to acknowledge that medicine isn’t just a set of approved treatments — it’s also a ritual, with symbolism and meaning that are key to its efficacy. At its best, that ritual spurs positive expectations, sparks associations with past healing experiences, and eases distress in ways that can alleviate suffering. These meanings, researchers say, are what the placebo effect is really about.”

In the Boston Globe, Rebecca Tuhus-Dubrow examines the past, present, and future of the placebo effect. “You’re talking about many, many, many millions of dollars a year in drug treatment costs…If [doctors] can produce approximately the same therapeutic effect with less drug, then it’s obviously safer for the patient, and I can’t believe they wouldn’t want to look into doing this.”

Don’t Stand So Close to Me.

“‘No man is an island,’ said Nicholas A. Christakis, a professor of medicine and medical sociology at Harvard Medical School who helped conduct the research. ‘Something so personal as a person’s emotions can have a collective existence and affect the vast fabric of humanity.’”

Forget H1N1: Psychologists uncover statistical indications that loneliness transmits like a social disease. “Loneliness is not just the property of an individual. It can be transmitted across people — even people you don’t have direct contact with.” Hmmm. Well, that explains grad school, then.

A Theory of Justice (and the Dog Park.)

“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”

In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what the apparently agreed-upon rules of canid play can teach us about animal morality. (Via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”

The Politics of Yecccch.

“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality…For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”

We Control The Verti…ooh, new Tweet!

“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”

Or, as Matt Johnson put it 25 years ago, I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun. And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention”. (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”

Which just goes to show, the real key to harnessing distraction is…wait, hold on a tic, gotta get back to you. There’s a new funny hamster vid on YouTube.

So Tweet and So Cold.

@JohnnyCash: Hello from Reno. Shot man…just to watch him die, actually. Weird, I know.
@ACamus: Beach lovely this time of year. Also, killed Arab. Oops.

Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”

Hmm. I can’t say I’ve found Twitter to be particularly useful yet. To be honest, it all seems rather gimmicky to me; I worry about its Idiocracy-like implications (Why 140 characters? Why not 10?); and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)

A Hole in the Heart.

“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”

Thanks for the Memories.

“We appear to be bringing the worst affected parts of the brain functionally back to life.” Is Alzheimer’s disease about to go the way of polio? A new drug known as rember, according to scientists in England, seems to halt and even roll back the symptoms of Alzheimer’s. “We have demonstrated for the first time that it may be possible to arrest progression of the disease by targeting the tangles that are highly correlated with the disease. This is the most significant development in the treatment of the tangles since Alois Alzheimer discovered them in 1907.”

Obama and McCain’s Sinister Inclinations.

“In the race for the White House, lefties seem to have the upper hand. No matter who wins in November, six of the 12 chief executives since the end of World War II will have been left-handed: Harry Truman, Gerald Ford, Ronald Reagan, the elder Bush, Clinton and either Obama or McCain. That’s a disproportionate number, considering that only one in 10 people in the general population is left-handed.” In the WP, authors Sam Wang and Sandra Aamodt explain why all your Oval Offices are belong to us, the lefties. We also swelled the ranks of both my undergraduate and graduate cohorts, whatever that’s worth.

World in My Eyes.

“Thoughtcrime does not entail death. Thoughtcrime IS death. I have committed even before setting pen to paper the essential crime that contains all others unto itself.” The shape of things to come? Scientists at Berkeley conceive a way to use MRI imaging to “map” images in the brain. “Our results suggest that it may soon be possible to reconstruct a picture of a person’s visual experience from measurements of brain activity alone. Imagine a general brain-reading device that could reconstruct a picture of a person’s visual experience at any moment in time…It is possible that decoding brain activity could have serious ethical and privacy implications downstream in, say, the 30 to 50-year time frame.”





Unless otherwise specified, the opinions expressed here are those of the author (me), and me alone.

All header images intended as homage. Please contact me if you want one taken down.

GitM is and has always been ad-free. Tips are appreciated if the feeling strikes.