Friday, December 31, 2010

Uncovering the neurobiological basis of general anesthesia


December 30th, 2010 in Medicine & Health / Research


The use of general anesthesia is a routine part of surgical operations at hospitals and medical facilities around the world, but the precise biological mechanisms that underlie anesthetic drugs' effects on the brain and the body are only beginning to be understood. A review article in the December 30 New England Journal of Medicine brings together for the first time information from a range of disciplines, including neuroscience and sleep medicine, to lay the groundwork for more comprehensive investigations of processes underlying general anesthesia.

"A key point of this article is to lay out a conceptual framework for understanding general anesthesia by discussing its relation to sleep and coma, something that has not been done in this way before," says Emery Brown, MD, PhD, of the Massachusetts General Hospital (MGH) Department of Anesthesia, Critical Care and Pain Medicine, lead author of the NEJM paper. "We started by stating the specific physiological states that comprise general anesthesia – unconsciousness, amnesia, lack of pain perception and lack of movement while stable cardiovascular, respiratory and thermoregulatory systems are maintained – another thing that has never been agreed upon in the literature; and then we looked at how it is similar to and different from the states that are most similar – sleep and coma."

After laying out their definition, Brown and his co-authors – Ralph Lydic, PhD, a sleep expert from the University of Michigan, and Nicholas Schiff, MD, an expert in coma from Weill Cornell Medical College – compare the physical signs and electroencephalogram (EEG) patterns of general anesthesia to those of sleep. While it is common to describe general anesthesia as going to sleep, there actually are significant differences between the states, with only the deepest stages of sleep being similar to the lightest phases of anesthesia induced by some types of agents.

While natural sleep normally cycles through a predictable series of phases, general anesthesia involves the patient being taken to and maintained at the phase most appropriate for the procedure, and the phases of general anesthesia at which surgery is performed are most similar to states of coma. "People have hesitated to compare general anesthesia to coma because the term sounds so harsh, but it really has to be that profound or how could you operate on someone?" Brown explains. "The key difference is this is a coma that is controlled by the anesthesiologist and from which patients will quickly and safely recover."

In detailing how different anesthetic agents act on different brain circuits, the authors point out some apparently contradictory information – some drugs like ketamine actually activate rather than suppress neural activity, an action that can cause hallucinations at lower doses. Ketamine blocks receptors for the excitatory transmitter glutamate, but since it has a preference for receptors on certain inhibitory neurons, it actually stimulates activity when it blocks those inhibitors. This excess brain activity generates unconsciousness through a process similar to what happens when disorganized data travels through an electronic communication line and blocks any coherent signal. A similar mechanism underlies seizure-induced unconsciousness.
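This disinhibition story can be caricatured in a few lines of code. The two-population model below is purely illustrative (the populations, rates, and the `inh_block` parameter are inventions for this sketch, not values from the article): blocking the glutamatergic drive onto inhibitory neurons lets excitatory activity rise.

```python
def network_activity(inh_block=0.0):
    """Toy two-population circuit: inhibitory neurons normally damp the
    excitatory population. A ketamine-like agent preferentially blocks
    the glutamate receptors driving the inhibitory cells (inh_block near
    1), so net excitatory activity paradoxically increases."""
    drive = 1.0
    inhibitory_rate = drive * (1.0 - inh_block)        # inhibitory cells silenced
    excitatory_rate = max(0.0, 2.0 * drive - inhibitory_rate)
    return excitatory_rate

print(network_activity(inh_block=0.0))   # baseline activity
print(network_activity(inh_block=0.9))   # drug present: activity rises
```

The point of the sketch is only the sign of the effect: suppressing the suppressors increases, rather than decreases, overall activity.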

Brown also notes that recent reports suggest an unexpected use for ketamine – to treat depression. Very low doses of the drug have rapidly reduced symptoms in chronically depressed patients who had not responded to traditional antidepressants. Ketamine is currently being studied to help bridge the first days after a patient begins a new antidepressant – a time when many may be at risk of suicide – and the drug's activating effects may be akin to those of electroconvulsive therapy.

Another unusual situation the authors describe is the case of a brain-injured patient in a minimally conscious state who actually recovered some functions through administration of the sleep-inducing drug zolpidem (Ambien). That patient's case, analyzed previously by Schiff, mirrors a common occurrence called paradoxical excitation, in which patients in the first stage of general anesthesia may move around or vocalize. The authors describe how zolpidem's suppression of the activity of a brain structure called the globus pallidus – which usually inhibits the thalamus – stimulates activity in the thalamus, which is a key neural control center. They hypothesize that a similar mechanism may underlie paradoxical excitation.

"Anesthesiologists know how to safely maintain their patients in the states of general anesthesia, but most are not familiar with the neural circuit mechanisms that allow them to carry out their life-sustaining work," Brown says. "The information we are presenting in this article – which includes new diagrams and tables that don't appear in any anesthesiology textbook – is essential to our ability to further understanding of general anesthesia, and this is the first of several major reports that we anticipate publishing in the coming year."

Schiff adds, "We think this is, conceptually, a very fresh look at phenomena we and others have noticed and studied in sleep, coma and use of general anesthesia. By reframing these phenomena in the context of common circuit mechanisms, we can make each of these states understandable and predictable."

Provided by Massachusetts General Hospital

Wednesday, December 29, 2010

Auditory cortex spatial sensitivity sharpens during task performance


Chen-Chung Lee & John C Middlebrooks

Nature Neuroscience 14, 108-114 (2011). Received 22 September 2010; accepted 13 November 2010; published online 12 December 2010.
DOI: 10.1038/nn.2713


Abstract:
Activity in the primary auditory cortex (A1) is essential for normal sound localization behavior, but previous studies of the spatial sensitivity of neurons in A1 have found broad spatial tuning. We tested the hypothesis that spatial tuning sharpens when an animal engages in an auditory task. Cats performed a task that required evaluation of the locations of sounds and one that required active listening, but in which sound location was irrelevant. Some 26-44% of the units recorded in A1 showed substantially sharpened spatial tuning during the behavioral tasks as compared with idle conditions, with the greatest sharpening occurring during the location-relevant task. Spatial sharpening occurred on a scale of tens of seconds and could be replicated multiple times in ~1.5-h test sessions. Sharpening resulted primarily from increased suppression of responses to sounds at least-preferred locations. That and an observed increase in latencies suggest an important role of inhibitory mechanisms.


Figure: (a) Poststimulus time histogram (PSTH) showing activity as a function of time (horizontal axis) and head-centered stimulus location (vertical axis) for one example unit in A1 in the right hemisphere during the idle condition.


Figure: Each symbol represents one unit, with its horizontal and vertical coordinates giving its ERRF width in two different conditions. Symbols lying below the diagonal line represent units whose spatial tuning sharpened.
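The ERRF ("equivalent rectangular receptive field") width summarizes how sharply a unit is tuned: the width of a rectangle with the same peak and the same area as the measured rate-vs-azimuth tuning curve. A minimal sketch (the Gaussian tuning curves and all numbers here are illustrative inventions, not data from the paper):

```python
import numpy as np

def errf_width(azimuths_deg, rates):
    """ERRF width: the width (in degrees) of a rectangle having the
    same peak and the same area as the measured tuning curve."""
    step = azimuths_deg[1] - azimuths_deg[0]
    area = np.sum(rates) * step          # rate units * degrees
    return area / np.max(rates)          # degrees

# Toy tuning curves: a broad "idle" unit and a sharpened "task" unit
az = np.arange(-90.0, 95.0, 5.0)
idle = np.exp(-(az - 30.0) ** 2 / (2 * 50.0 ** 2))
task = np.exp(-(az - 30.0) ** 2 / (2 * 25.0 ** 2))

# Sharpening during the task shows up as a smaller ERRF width,
# i.e. a point below the diagonal in the scatter plot.
print(errf_width(az, idle), errf_width(az, task))
```

A unit whose task-condition ERRF width comes out smaller than its idle-condition width would plot below the diagonal in the figure described above.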

Source: Nature Neuroscience
http://www.nature.com/neuro/journal/v14/n1/abs/nn.2713.html?lang=en

Friday, December 24, 2010

Scans could predict onset of schizophrenia

Brain scans could be used to predict the onset of schizophrenia in young people with a family history of the disease, a new study suggests.

Thursday, December 23, 2010

Abused, neglected children have lower IQ in teens


December 22nd, 2010 in Medicine & Health / Health
University of Queensland research has found children who have been abused or neglected are likely to struggle academically during adolescence.
The research drew upon data from the Mater-University Study of Pregnancy (MUSP) – a longitudinal study of more than 7000 mothers and their children born at Brisbane's Mater Hospital from 1981-83.

Lead author and pediatrician Ryan Mills said the research involved confidentially linking allegations of maltreatment reported to the Department of Families, Youth and Community Care with the MUSP database.

“Both child abuse and child neglect are independently associated with impaired cognition and academic functioning in adolescence,” Dr. Mills said.

“These findings suggest that both abuse and neglect have independent and important adverse effects on a child's cognitive development.”

The MUSP database provided results of numeracy, literacy and abstract reasoning tests completed by 3796 adolescents at age 14.

The 298 adolescents (7.9 percent) who had been reported as victims of maltreatment scored the equivalent of approximately three IQ points lower than those who had not been maltreated, after accounting for a large range of socioeconomic and other factors.

Co-author Lane Strathearn, a UQ medical and PhD graduate now based at the Baylor College of Medicine and Texas Children's Hospital, said this study was one of the first to analyse outcomes of abuse and neglect independently.

“Studies have repeatedly demonstrated that at least half of maltreated children experience more than one type of abuse or neglect,” Dr. Strathearn said.

“Our sample was no different; 74 percent of the children reported to the state as suspected cases of neglect also had been reported as suspected victims of abuse.

“Our method involved grouping the physical, emotional and sexual abuse cases together and assessing both abuse and neglect - reported or substantiated - as independent nonexclusive predictor variables.”

The results highlighted the seriousness of child neglect, Dr. Strathearn said.

“The effects of abuse and neglect were found to be independent and quantitatively similar; children who experienced both abuse and neglect were doubly affected,” he said.

“The results support the notion that child neglect has developmental effects that are independently at least as deleterious as abuse, which has important implications for the allocation of resources into additional research into, and prevention of, child neglect.”

More information: Dr. Mills and Dr. Strathearn worked with a team of UQ colleagues, including Rosa Alati, Michael O'Callaghan, Jake Najman, Gail Williams and William Bor. The study was published online this month in medical journal Pediatrics.


Provided by University of Queensland

Tuesday, December 21, 2010

Brain imaging predicts future reading progress in children with dyslexia


December 20th, 2010 in Medicine & Health / Research


Brain scans of adolescents with dyslexia can be used to predict the future improvement of their reading skills with an accuracy rate of up to 90 percent, new research indicates. Advanced analyses of the brain-activity images are significantly more accurate predictors than standardized reading tests or any other measures of children's behavior.

The finding raises the possibility that a test one day could be developed to predict which individuals with dyslexia would most likely benefit from specific treatments.

The research was published Dec. 20, 2010, in the Proceedings of the National Academy of Sciences.

"This approach opens up a new vantage point on the question of how children with dyslexia differ from one another in ways that translate into meaningful differences two to three years down the line," Bruce McCandliss, Patricia and Rodes Hart Chair of Psychology and Human Development at Vanderbilt University's Peabody College and a co-author of the report, said. "Such insights may be crucial for new educational research on how to best meet the individual needs of struggling readers.

"This study takes an important step toward realizing the potential benefits of combining neuroscience and education research by showing how brain scanning measures are sensitive to individual differences that predict educationally relevant outcomes," he continued.

The research was primarily conducted at Stanford University and led by Fumiko Hoeft, associate director of neuroimaging applications at the Stanford University School of Medicine. In addition to McCandliss, Hoeft's collaborators included researchers at MIT, the University of Jyväskylä in Finland and the University of York in the United Kingdom.

"This finding provides insight into how certain individuals with dyslexia may compensate for reading difficulties," Alan E. Guttmacher, director of the National Institutes of Health's Eunice Kennedy Shriver National Institute of Child Health and Human Development, which provided funding for the study, said.

"Understanding the brain activity associated with compensation may lead to ways to help individuals with this capacity draw upon their strengths," he continued. "Similarly, learning why other individuals have difficulty compensating may lead to new treatments to help them overcome reading disability."

The researchers used two types of brain imaging technology to conduct their study. The first, functional magnetic resonance imaging (fMRI), depicts oxygen use by brain areas involved in a particular task or activity. The second, diffusion tensor magnetic resonance imaging (DTI), maps white matter tracts that are the brain's wiring, revealing connections between brain areas.

The 45 children who took part in the study ranged in age from 11 to 14 years old. Each child first took a battery of tests to determine their reading abilities. Based on these tests, the researchers classified 25 children as having dyslexia, which means that they exhibited significant difficulty learning to read despite having typical intelligence, vision and hearing and access to typical reading instruction.

During the fMRI scan, the youths were shown pairs of printed words and asked to identify pairs that rhymed, even though they might be spelled differently. The researchers investigated activity patterns in a brain area on the right side of the head, near the temple, known as the right inferior frontal gyrus, noting that some of the children with dyslexia activated this area much more than others. DTI scans of these same children revealed stronger connections in the right superior longitudinal fasciculus, a network of brain fibers linking the front and rear of the brain.

When the researchers once again administered the reading test battery to the youths two and a half years later, they found that the 13 youths showing the stronger activation pattern in the right inferior frontal gyrus were much more likely to have compensated for their reading difficulty than were the remaining 12 youths with dyslexia. When they combined the most common forms of data analysis across the fMRI and DTI scans, they were able to predict the youths' outcomes years later with 72 percent accuracy.

The researchers then adapted algorithms used in artificial intelligence research to refine the brain activity data to create models that would predict the children's later progress. Using this relatively new technique, the researchers could use the brain scanning data collected at the beginning of the study to predict with over 90 percent accuracy which children would go on to improve their reading skills two and a half years later.
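As a rough illustration of this kind of pattern-classification analysis (this is not the study's algorithm, and the feature vectors below are synthetic stand-ins for the fMRI and DTI measures), a leave-one-out nearest-centroid classifier can be run on hypothetical "activation" data for 25 children:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors for 25 children with dyslexia: e.g.,
# right inferior frontal gyrus fMRI signal plus right superior
# longitudinal fasciculus DTI measures. Values are invented.
n_improved, n_not = 13, 12
X = np.vstack([
    rng.normal(1.0, 0.5, size=(n_improved, 4)),   # stronger activation pattern
    rng.normal(0.0, 0.5, size=(n_not, 4)),        # weaker activation pattern
])
y = np.array([1] * n_improved + [0] * n_not)

# Leave-one-out nearest-centroid classification: predict each child's
# outcome from the centroids of the remaining children.
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c1 = X[mask & (y == 1)].mean(axis=0)
    c0 = X[mask & (y == 0)].mean(axis=0)
    pred = 1 if np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0) else 0
    correct += int(pred == y[i])

accuracy = correct / len(y)
print(f"leave-one-out accuracy: {accuracy:.0%}")
```

The study's reported accuracies came from far richer models and real imaging data; the sketch only shows the shape of the procedure, in which held-out individuals are predicted from patterns learned on the rest.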

In contrast, the battery of standardized, paper-and-pencil tests typically used by reading specialists did not aid in predicting which of the children with dyslexia would go on to improve their reading ability years later.

"Our findings add to a body of studies looking at a wide range of conditions that suggest brain imaging can help determine when a treatment is likely to be effective or which patients are most susceptible to risks," Hoeft said.

Hoeft further explained that the largest improvement was seen in reading comprehension, which is the ultimate goal of reading. The youths showed less improvement in other reading-related skills such as phonological awareness. Typically developing readers tend to develop phonemic awareness skills before developing fluency and comprehension skills.

Hoeft suggested that this pattern may be related to the finding that youths with dyslexia recruited right frontal brain regions to compensate for their reading difficulties, rather than the left-hemisphere regions that typical readers use.

The study is part of a rapidly developing field of research known as "educational neuroscience" that brings together neuroimaging studies with educational research to understand how individual learners differ in brain structure and activity and how learning can drive changes at the neural level. Such questions are now being effectively examined in young children even before reading instruction begins, McCandliss explained in a Proceedings of the National Academy of Sciences article published earlier this year.

"This latest study provides a simple answer to a very complex question—'what can neuroscience contribute to complex issues in education?'" McCandliss said. "Here we have a clear example of how new insights and discoveries are beginning to emerge by pairing rigorous education research with novel neuroimaging approaches."

Provided by Vanderbilt University

Saturday, December 18, 2010

Researchers discover new way to reduce anxiety, stress


December 17th, 2010 in Medicine & Health / Neuroscience


Two North American researchers have made a major discovery that will benefit people who have anxiety disorders. Bill Colmers, a professor of pharmacology and researcher in the Faculty of Medicine & Dentistry at the University of Alberta, collaborated with Janice Urban, an associate professor in the department of physiology and biophysics at the Chicago Medical School at Rosalind Franklin University of Medicine and Science. The duo, who have been researching anxiety for five years, discovered that blocking a process in nerve cells reduces anxiety, meaning a new drug could now be developed to better treat anxiety disorders. Their findings were published in the December edition of the peer-reviewed Journal of Neuroscience.

Colmers explained that current anxiety drugs on the market are non-selective, which means they inhibit various neurons, or nerve cells, in the brain—including ones you don’t want to inhibit. Because no one could pinpoint how to reduce anxiety, all kinds of neurons had to be treated with anxiety medication, which can have undesirable side-effects such as drowsiness.

But drugs can now be designed to more specifically treat anxiety disorders, likely meaning fewer undesirable side effects and a better quality of life for those with anxiety. Anxiety disorders are the most common mental-health issue in the country, affecting one in 10 Canadian adults, according to the Anxiety Disorders Association of Canada.

For years, researchers have understood what processes in the brain are responsible for high and low anxiety levels, but no one had been able to identify what triggers this process.

“No one else has discovered this,” said Colmers, a senior scientist with funding from the Alberta Heritage Foundation for Medical Research (a provincial agency now called Alberta Innovates – Health Solutions). “Others have identified the behaviour, but now we know why this process happens and how it works. Now we know why certain chemical messengers behave the way they do.”

There are two chemical messengers in a specific part of the brain known to regulate anxiety. One messenger, known as neuropeptide Y, makes one less anxious while the other, known as corticotropin-releasing factor or CRF, makes one more anxious.

These two chemical messengers regulate how “excitable” the nerve cell gets. Neuropeptide Y causes nerve cells to be less active, meaning the cells will fire less. The other chemical messenger, CRF, causes cells to be more active and fire more often. The more often these neurons fire, the more anxious a person becomes.

By working with laboratory models, Colmers and Urban discovered that blocking the process responsible for regulating cell excitability triggers less anxiety. Blocking this process had the same effect as the chemical messenger neuropeptide Y, which makes people less anxious.
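The push-pull between the two messengers can be sketched as a toy firing-rate model; the gain numbers below are invented for illustration and are not the authors' measurements:

```python
def firing_rate(drive, npy=0.0, crf=0.0):
    """Toy model of neuronal excitability: CRF raises the cell's gain
    (more spikes for the same synaptic drive) while neuropeptide Y
    lowers it. Rates are in arbitrary 'spikes per second' units."""
    gain = 1.0 + 2.0 * crf - 2.0 * npy   # CRF excites, NPY inhibits
    return max(0.0, gain * drive)

print(firing_rate(10.0))            # baseline
print(firing_rate(10.0, crf=0.5))   # CRF present: fires more (more anxious)
print(firing_rate(10.0, npy=0.3))   # NPY present: fires less (less anxious)
```

In these terms, the discovery reported here amounts to finding a way to clamp the excitability process directly, reproducing the calming effect of neuropeptide Y without acting on every kind of neuron at once.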

Colmers said it could be 10 years before patients could start taking a new drug for anxiety based on these research findings, but the find is still significant.

“There is a real need to find better treatments for anxiety—to better target the processes in the brain that trigger anxiety disorders.”

Provided by University of Alberta

Thursday, December 16, 2010

Where unconscious memories form


December 15th, 2010 in Medicine & Health / Neuroscience


A small area deep in the brain called the perirhinal cortex is critical for forming unconscious conceptual memories, researchers at the UC Davis Center for Mind and Brain have found.

The perirhinal cortex was thought to be involved, like the neighboring hippocampus, in "declarative" or conscious memories, but the new results show that the picture is more complex, said lead author Wei-chun Wang, a graduate student at UC Davis.

The results were published Dec. 9 in the journal Neuron.

We're all familiar with memories that rise from the unconscious mind. Imagine looking at a beach scene, said Wang. A little later, someone mentions surfing, and the beach scene pops back into your head.

Declarative memories, in contrast, are those where we recall being on that beach and watching that surf competition: "I remember being there."

Damage to a structure called the hippocampus affects such declarative "I remember" memories, but not conceptual memories, Wang said. Neuroscientists had previously thought the same was true for the perirhinal cortex, which is located immediately next to the hippocampus.

Wang and colleagues carried out memory tests on people diagnosed with amnesia, who had known damage to the perirhinal cortex or other brain areas. They also carried out functional magnetic resonance imaging (fMRI) scans of healthy volunteers while they performed memory tests.

In a typical test, they gave the subjects a long list of words, such as chair, table or spoon, and asked them to think about how pleasant they were.

Later, they asked the subjects to think up words in different categories, such as "furniture."

Amnesiacs with damage to the perirhinal cortex performed poorly on the tests, while the same brain area lit up in fMRI scans of the healthy control subjects.

The study helps us understand how memories are assembled in the brain and how different types of brain damage might impair memory, Wang said. For example, Alzheimer's disease often attacks the hippocampus and perirhinal cortex before other brain areas.

Provided by University of California - Davis

Wednesday, December 15, 2010

Extended Mind Redux: A Response

By ANDY CLARK

Thanks to all who read and commented on my recent Stone post, “Out of Our Brains.” Lots of interesting, challenging, and important issues were raised, but I’d like to react very briefly to just a few recurring themes, and to add one important, and accidentally omitted, note of thanks.
The thanks (and see comment 73) are to Professor David Chalmers. Dave was co-author of my original 1998 paper “The Extended Mind,” and is the sole author of a wonderful foreword to my 2008 book, “Supersizing the Mind.” Dave had a big hand in the original paper. Everyone who asked about the thin line between tools and extensions, or about the vexed question of extending the conscious mind, should read his recent foreword, too.

Themewise, I was struck by the somewhat remarkable fact that about half the commentators thought the general line about extending the mind was plausible and even obvious, while about half thought it was implausible and perhaps even self-evidently false. In optimistic mode (which I mostly am) I take this as a good sign: as suggesting that there is indeed something worth thinking about here. If I were feeling less upbeat, I might take it as a sign that I just hadn’t made the thought clear enough. A couple of comments made me worry on that score, so a few clarifications seem in order.

I didn’t mean to downplay the pivotal role of the brain/body in human thought and reason. I can indeed survive the loss of my iPhone but not the loss of my brain! But as Dave Chalmers has pointed out, I can also survive the loss of my finger, the loss of a few neurons, or even the wholesale ablation of my visual cortex. It does not follow that e.g. my visual cortex, when all is up and running normally, does not constitute part of my cognitive apparatus. This reveals something important. It is only when you turn up the magnification, seeing the biological agent as herself a kind of grab-bag of distinct circuits and capacities, that the possibility of true cognitive extension even becomes visible. That possibility then takes shape as the possibility that some non-biological circuitry (connected by various forms of looping interaction to the biological core) might become sufficiently integral to some of my cognitive performances as to count as part of the machinery of mind and reason.

This talk of the machinery of mind is important. A few commentators rightly suggested that mind itself is probably not a “thing” hence not worth trying to locate. That is not to say — heaven forbid — that it is a non-material thing. Rather, it might be a bit like trying to locate the adorableness of a kitten. There is nothing magically non-physical about the kitten, but trying to fine-tune the location of the adorableness still seems like some kind of error or category mistake. In the case of mind, I think what we have is an intuitive sense of the kind of capacities that we are gesturing at when we speak of minds, and so we can then ask: where is the physical machinery that makes those capacities possible? It is the physical machinery of thought and reason that the extended mind story is meant to concern.

A couple of replies touched on what is really one of the philosophical hot potatoes here, which is the distinction between “mere” inputs to a cognitive system and elements of the system itself. Critics of the extended mind (for example, Fred Adams and Ken Aizawa, in their 2008 book called “The Bounds of Cognition”) think theorists of extended cognition are guilty of confusing inputs to the cognitive engine with stuff that is part of (and “constitutes”) the cognitive engine. I think this distinction between “mere” inputs and processing elements is far less clear than it sounds. An analogy I sometimes use is with the workings of a turbo-driven car engine. Compare: the car makes exhaust fumes (outputs) that are also inputs that drive the turbo that adds power (up to around 30 percent more power) to the engine. The exhaust fumes are both outputs and self-generated inputs that, as they loop around, surely form a proper part of the overall power-generating mechanism. I think much the same is true of our use of bodily gestures while reasoning with others, and of the way that actively writing contributes to the process of thinking. The gestures and words on the page are outputs that immediately loop back in ways that form larger circuits of ongoing thinking and reasoning.

Some respondents raised important and interesting questions concerning conscious experience. I note only that my own account of cognitive extension is not meant to make any claims extending the machinery of consciousness beyond the brain. I myself am skeptical of such extensions. But some excellent philosophers (like Alva Noë in his 2009 book, “Out of Our Heads”) do go that far, and I would refer those interested in this issue to that short and accessible treatment.

Finally, special thanks to all those who suggested sci-fi books, or other stuff that I ought to be reading. My holiday stocking (and perhaps my mind) will be greatly expanded as a result.

Can't relax? It's all in your mind: Research shows stopping a thought puts more strain on the brain


December 14th, 2010 in Medicine & Health / Neuroscience



Turns out, relaxing is exhausting, which could be why so many people struggle to unplug from work during vacation.

According to mathematicians at Case Western Reserve University, stopping a thought burns more energy than thinking, like stopping a truck on a downhill slope.

"Maybe this explains why it is so tiring to relax and think about nothing," says Daniela Calvetti, professor of mathematics and one of the authors of a new brain study published in an advanced online publication of the Journal of Cerebral Blood Flow & Metabolism.

Since opening up the brain for detailed monitoring isn't exactly practical, Calvetti teamed up with fellow mathematics professor Erkki Somersalo and Rossana Occhipinti, a postdoctoral researcher in physiology and biophysics, to create a computer model of brain metabolism.

Calvetti and Somersalo created a software package specifically designed to study complex metabolic systems. The software, called Metabolica, produces a numeric rendering of the pathways linking excitatory neurons, which transmit thought, and inhibitory neurons, which put on the brakes, with star-shaped brain cells called astrocytes. Astrocytes provide essential chemicals and functions to both kinds of neurons.

To stop a thought, the brain uses inhibitory neurons to prevent excitatory neurons from passing information: they block it by releasing gamma-aminobutyric acid, commonly called GABA, which counteracts the effect of the neurotransmitter glutamate released by excitatory neurons.

In other words, glutamate opens the synaptic gates and GABA holds them closed.
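That gate metaphor maps onto a standard conductance-based caricature: glutamatergic input pulls the membrane voltage up toward its reversal potential, and GABAergic input pulls it back down. A minimal leaky integrate-and-fire sketch (all parameter values are illustrative, not taken from the researchers' model):

```python
def lif_spike_count(g_exc, g_inh, t_ms=500.0, dt=0.1):
    """Toy leaky integrate-and-fire neuron: excitatory (glutamate-like)
    and inhibitory (GABA-like) conductances pull the membrane toward
    their reversal potentials; a spike fires at -50 mV, then the
    voltage resets to -70 mV."""
    v = -70.0
    v_rest, v_th, v_reset = -70.0, -50.0, -70.0
    e_exc, e_inh, tau = 0.0, -80.0, 20.0   # mV, mV, ms
    spikes = 0
    for _ in range(int(t_ms / dt)):        # forward-Euler integration
        dv = (-(v - v_rest) + g_exc * (e_exc - v) + g_inh * (e_inh - v)) / tau
        v += dv * dt
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes

print(lif_spike_count(g_exc=0.6, g_inh=0.0))   # excitation alone: fires
print(lif_spike_count(g_exc=0.6, g_inh=0.6))   # GABA holds the gate closed
```

With excitation alone the cell crosses threshold and fires repeatedly; adding a comparable inhibitory conductance holds it below threshold, which is the "holding the gates closed" the researchers describe.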

"The astrocytes, which are the Cinderellas of the brain, consume large amounts of oxygen mopping up and recycling the GABA and the glutamate, which is a neurotoxin," Somersalo says.

More oxygen requires more blood flow, although the connection between cerebral metabolism and hemodynamics is not fully understood yet.

Altogether, "It's a surprising expense to keep inhibition on," he says.

The researchers hope their work can provide some insight on brain diseases, which are often difficult to diagnose until advanced stages. Most brain maladies are linked to energy metabolism, and understanding the norm may enable doctors to detect problems earlier.

The toll inhibition takes may be particularly relevant to neurodegenerative diseases. "And that is truly exciting," Calvetti says.

Provided by Case Western Reserve University

Tuesday, December 14, 2010

Everyone thinks everyone else has less free will


Everyone thinks everyone else has less free will
December 13th, 2010 in Medicine & Health / Psychology & Psychiatry

The subject of individual free will -- whether our fates are beyond our control or whether we command our own destinies -- has been hotly argued for centuries. Now scientists have revealed a new wrinkle in the debate: generally, everyone seems to believe they have more free will than everyone else.

Social psychologist Emily Pronin at Princeton University in New Jersey studies the differences between how we perceive ourselves and how we perceive others. According to her research, we tend to view our own judgment as sound but the judgment of others as irrational; recognize the biases in others but not ourselves; and see ourselves as more individualistic and others as more conformist.

Essentially, people judge others based on what they see. But they judge themselves based on what they think and feel, a difference that often leads to misunderstandings, disagreements and conflicts. Understanding the psychological basis of these differences might help relieve some of their negative consequences, Pronin suggested.

When Pronin began wondering about other consequences of this asymmetry, "beliefs in free will struck me as a key place to look, since those beliefs really matter for things like how much responsibility we assign to our own and others' actions," she said. In four experiments, Pronin and graduate student Matthew Kugler investigated how much people believed that their lives and those of their peers were guided by free will, findings they detailed online Dec. 13 in the Proceedings of the National Academy of Sciences.

In the first experiment, the researchers studied the most classic tenet of free will — the notion that one's actions cannot be determined in advance. Fifty college students were asked to rate, on a scale of one to seven, how predictable they thought certain past and future decisions in their lives and those of their roommates were, such as their choice of major in college or their ultimate career path. On average, the participants viewed their own pasts and futures as less predictable than their roommates' by about one point on that scale.

"By the standards of psychological research, this is a large effect," Pronin said.

In the second and third experiments, 28 restaurant workers and 50 students were asked how many choices they thought were available in their futures and those of peers. The volunteers generally thought they had more pathways open to them, good and bad.

In the last experiment, 58 students created models predicting their own behavior and that of a roommate on a Saturday night or after finishing college, indicating how important personality, history, circumstances, intentions and desires were for outcomes. The volunteers saw their own future actions as most strongly driven by their intentions and desires rather than being predetermined by personality, history or circumstances. In contrast, they viewed personality as the strongest predictor of their roommates' behavior.

"People have been debating about the existence of free will for ages," Pronin said. "Our research suggests one reason why this debate is so persistent -- people seem to have two views of free will. One view is when they look inwards and are convinced of their own free will; the other view is when they look outwards, at others, and are convinced that those others' actions could have been predicted in advance."

"This work is a terrific advance," said research psychologist Roy Baumeister at Florida State University in Tallahassee, who did not take part in this study. "Most debates about free will take an all-or-nothing form -- either everyone has it all the time, or nobody ever does."

As to why such a difference might have evolved, "when thinking about ourselves, it may be adaptive to believe that we can control what happens to us, and that belief requires thinking that we have free will," Pronin suggested. "When thinking about others, it may be adaptive to recognize the predictability in others' actions so that we can be prepared accordingly."

The scientists are intrigued by the consequences of these differing views on free will, as well as how they might vary across lifespan and different cultures.

"How does it impact beliefs about personal responsibility and guilt?" Pronin asked. "Are people likely to spend more time kicking themselves about things that went wrong in their past because they think they could have controlled those things, even though they wouldn't think this in the case of others?"

Provided by Inside Science News Service

jueves, 9 de diciembre de 2010

Our brains are wired so we can better hear ourselves speak, study shows


Activity in the auditory cortex when we speak and listen is amplified in some regions of the brain and muted in others. In this image, the black line represents muting activity when we speak. (Courtesy of Adeen Flinker)

Like the mute button on the TV remote control, our brains filter out unwanted noise so we can focus on what we're listening to. But when following our own speech, a new brain study from UC Berkeley shows that instead of one mute button, we have a network of volume settings that can selectively silence and amplify the sounds we make and hear.

Neuroscientists from UC Berkeley, UCSF and Johns Hopkins University tracked the electrical signals emitted from the brains of hospitalized epilepsy patients. They discovered that neurons in one part of the patients' hearing mechanism were dimmed when they talked, while neurons in other parts lit up.

Their findings, published today (Dec. 8, 2010) in the Journal of Neuroscience, offer new clues about how we hear ourselves above the noise of our surroundings and monitor what we say. Previous studies have shown a selective auditory system in monkeys that can amplify their self-produced mating, food and danger alert calls, but until this latest study, it was not clear how the human auditory system is wired.

"We used to think that the human auditory system is mostly suppressed during speech, but we found closely knit patches of cortex with very different sensitivities to our own speech that paint a more complicated picture," said Adeen Flinker, a doctoral student in neuroscience at UC Berkeley and lead author of the study.

"We found evidence of millions of neurons firing together every time you hear a sound right next to millions of neurons ignoring external sounds but firing together every time you speak," Flinker added. "Such a mosaic of responses could play an important role in how we are able to distinguish our own speech from that of others."

While the study doesn't specifically address why humans need to track their own speech so closely, Flinker theorizes that, among other things, tracking our own speech is important for language development, monitoring what we say and adjusting to various noise environments.

"Whether it's learning a new language or talking to friends in a noisy bar, we need to hear what we say and change our speech dynamically according to our needs and environment," Flinker said.

He noted that people with schizophrenia have trouble distinguishing their own internal voices from the voices of others, suggesting that they may lack this selective auditory mechanism. The findings may be helpful in better understanding some aspects of auditory hallucinations, he said.

Moreover, with the finding of sub-regions of brain cells each tasked with a different volume control job – and located just a few millimeters apart – the results pave the way for a more detailed mapping of the auditory cortex to guide brain surgery.

In addition to Flinker, the study's authors are Robert Knight, director of the Helen Wills Neuroscience Institute at UC Berkeley; neurosurgeons Edward Chang, Nicholas Barbaro and neurologist Heidi Kirsch of the University of California, San Francisco; and Nathan Crone, a neurologist at Johns Hopkins University in Maryland.

The auditory cortex is a region of the brain's temporal lobe that deals with sound. In hearing, the human ear converts vibrations into electrical signals that are sent to relay stations in the brain's auditory cortex where they are refined and processed. Language is mostly processed in the left hemisphere of the brain.

In the study, researchers examined the electrical activity in the healthy brain tissue of patients who were being treated for seizures. The patients had volunteered to help out in the experiment during lulls in their treatment, as electrodes had already been implanted over their auditory cortices to track the focal points of their seizures.

Researchers instructed the patients to perform such tasks as repeating words and vowels they heard, and recorded the activity. In comparing the activity of electrical signals discharged during speaking and hearing, they found that some regions of the auditory cortex showed less activity during speech, while others showed the same or higher levels.

"This shows that our brain has a complex sensitivity to our own speech that helps us distinguish between our vocalizations and those of others, and makes sure that what we say is actually what we meant to say," Flinker said.

Provided by University of California - Berkeley

miércoles, 8 de diciembre de 2010

Support a friend's work: Jia-Jen Lin.


My practice investigates the psychological distance between artificial life and our physical sensations. By collecting, modifying, and representing information and materials from everyday experiences, I develop a series of works integrating sculpture, performance, and digital media. Mass-produced products and mechanical systems manipulate our daily life and our physical sensations. We are not aware of the loss of intimate connections with our physical body, but we know we cannot live without technology.

I use wearable structures and video performance as methods to investigate the potential visual dialogues between the body and the materials combined with it. I would like to draw attention to the initial stage: to investigate the possibilities between the material world and our physical selves.

In this work the crossing of the boundaries of media and categories becomes not only necessary, but natural. The readily available products of our society become my palette for representing the forms of the natural and for investigating relationships between people, and between people and the technological.

Think multitasking is new? Our prehistoric ancestors invented it


Answering e-mail while toggling between telephone conversations. Monitoring social networking sites while working. Supervising the kids' homework while listening to the news and cooking dinner. The abundance of contemporary distractions offers many reasons to curse multitasking.

But a UCLA anthropologist refuses to join the chorus. In a new book that explores the long history of multitasking, Monica L. Smith maintains that human beings should appreciate their ability to sequence many activities and to remember to return to a task once it has been interrupted, possibly even with new ideas on how to improve the activity.

"I don't think it's worth saying multitasking is bad," said Smith, the author of "A Prehistory of Ordinary People" (University of Arizona Press). "We can do it, and that is astonishing."

In fact, Smith, an associate professor of anthropology, contends that multitasking is the ability that separates human beings from animals: "Multitasking is what makes us human."

Vast reserves of memory and the ability to project into the future are the qualities that enable humans to juggle multiple competing demands and to pick up and put down the same project until completion, she said. Animals, by contrast, lack these abilities.

The same cognitive capacity enables such uniquely human abilities as language and the ability to comprehend time and space across increments ranging from the most immediate to the most distant, she contends. Smith also credits multitasking with our ancestors' considerable track record in innovation, particularly at the hands of ordinary people.

"Great deeds have been made possible by the collective experience of people who multitasked through their everyday lives ... and then who devoted some extra portion of their time, energy and the fruits of their labor into coming up with fabulous inventions and building complex societies," she said.

Yet in the popular imagination, contemporary times have some kind of corner on the multitasking market.

"People seem to think that the past was this simpler time with fewer interruptions because so many of the modern gadgets we have today had yet to be invented," Smith said. "But we've been multitasking from the beginning. Every object that we have from the past is the result of a dynamic process where people were being interrupted all the time."

Smith, who specializes in prehistoric economic networks and in the archaeology of consumption and material culture, traces the beginning of multitasking back millions of years to our first bipedal ancestors.

"Once they started walking on two feet, their hands were free to pick up tools, fibers, fruits or kids, and their eyes could look around for opportunities and dangers," she said. "That's the beginning of multitasking right there."

By the time tool-making started 1.5 million years ago, the ability to multitask would have been essential because the linear sequence of tool production would have been subject to frequent interruptions, she said.

For these hunters and gatherers, multitasking would have taken the form of foraging for food or hunting for game while keeping an eye out for stones and other materials with which to make tools, Smith said. Because children often would have been in tow, protecting them — especially from potential predators — would have been part of the mix.

Climate changes that made the globe drier and hotter some 10,000 to 12,000 years ago made the ability to multitask all the more valuable by paving the way for agriculture and animal husbandry, Smith said. Cycles of plant and animal life posed constant scheduling challenges and interruptions for these early settlers. Farm life also demanded the creation and maintenance of a whole new array of objects and structures for food storage, further increasing the need to juggle multiple tasks and priorities.

When humans first moved into cities about 6,000 years ago, the demands met by multitasking increased once again to levels that do not differ that much from today's levels, Smith insists.

"People were trying to cook things in the household while other people were trying to make things," she said. "Night would be coming along and tasks had to be finished before it got dark outside. The seasons would be changing, adding another layer of time pressure. Unexpected visitors would arrive and they'd need to be fed. Or someone was successful at hunting, so all of a sudden, a new animal would show up and everything has to be dropped so that the animal can get gutted, skinned, cleaned, chopped and stuck into the stew pot."

Smith finds support for her theory by combining research from two fields. From archaeology, she takes the calculations extracted from archaeological digs to determine the number of people who occupied prehistoric sites and the kinds of human activities that were undertaken there — such as making tools, pots and beads. From anthropological studies of traditional people today, she takes estimates of how long it takes to make similar objects using similar approaches.

"We can calculate how much prehistoric people needed to eat, how long it takes to do a particular kind of task, and any seasonal restrictions on different tasks," Smith said. "We find that there's no way that you could sit down and do any of these things from start to finish. Multitasking had to be involved."

Multitasking also makes sense from a biological perspective, Smith argues, citing recent research by economists, folklorists, neurologists and archaeologists. Researchers have noted that the type of cognitive shortcuts involved in multitasking extends the number of activities humans can accomplish without having to tap higher-order cognitive abilities such as reasoning.

"Reasoning is expensive in time and energy, and the brain circuitry of multitasking reserves this 'expensive' ability for activities with the highest payoff, including decisions about cooperation or conflict with others and the subtleties of choosing among different types of goods and priorities," Brian Loasby, a professor emeritus of economics at Scotland's Stirling University, says in the book.

In addition to being efficient, the dynamic process of repeatedly putting down and picking up tasks by generation after generation of ordinary people has provided an important opportunity for innovation, Smith argues.

"Our ancestors might have set down the stone tool they were making for an hour, a day or a year because a new kid was born or somebody died or a flood came or dinner had to be made," she explained. "When they came back to the tool, they were not exactly the same people who had put it down. Maybe they had learned a new technique, gotten some new information about creating such an object or had thoughts about improving it by changing its shape."

By appreciating the role of multitasking, the role of ordinary people in laying the foundation for great civilizations comes into clearer focus, she argues.

"Every human-made object is the result of people who were consciously integrating all the things that they knew and were learning into the production process, speeding innovation," she said. "When leaders finally came into the picture and began organizing people to build tombs and temples, it was just another layer of activity on top of what ordinary people had already been doing for thousands of years."

Provided by University of California Los Angeles

martes, 7 de diciembre de 2010

QuantumDream, Inc. has just launched DNA Decipher Journal ("DNADJ") with Inaugural Issue in January 2011.

DNA Decipher Journal is a publication in which biologists, physicists, mathematicians and other scholars publish their research results and express their views on the origin, nature and mechanism of DNA as a biological program and entity, and its possible connection to a deeper reality. The journal is published by QuantumDream, Inc. We are committed to truth and excellence. The journal's current policy is editorial selection of submitted papers for publication, and editorial invitation for publication, under the advisement of an editorial Advisory Board whose members are currently being selected. All papers published by this journal are either subject to open peer review ("OPR") in the same issue or open to OPR in subsequent issues.

jueves, 2 de diciembre de 2010

New study suggests that a propensity for one-night stands, uncommitted sex could be genetic


New study suggests that a propensity for one-night stands, uncommitted sex could be genetic
December 1st, 2010 in Medicine & Health / Psychology & Psychiatry


So, he or she has cheated on you for the umpteenth time and their only excuse is: "I just can't help it." According to researchers at Binghamton University, they may be right. The propensity for infidelity could very well be in their DNA.

In a first of its kind study, a team of investigators led by Justin Garcia, a SUNY Doctoral Diversity Fellow in the laboratory of evolutionary anthropology and health at Binghamton University, State University of New York, has taken a broad look at sexual behavior, matching choices with genes and has come up with a new theory on what makes humans 'tick' when it comes to sexual activity. The biggest culprit seems to be the dopamine receptor D4 polymorphism, or DRD4 gene. Already linked to sensation-seeking behavior such as alcohol use and gambling, DRD4 is known to influence the brain's chemistry and subsequently, an individual's behavior.

"We already know that while many people experience sexual activity, the circumstances, meaning and behavior is different for each person," said Garcia. "Some will experience sex with committed romantic partners, others in uncommitted one-night stands. Many will experience multiple types of sexual relationships, some even occurring at the same time, while others will exchange sex for resources or money. What we didn't know was how we are motivated to engage in one form and not another, particularly when it comes to promiscuity and infidelity."

Gathering a detailed history of the sexual behavior and intimate relationships of 181 young adults along with samples of their DNA, Garcia and his team of investigators were able to determine that individual differences in sexual behavior could indeed be influenced by individual genetic variation.

"What we found was that individuals with a certain variant of the DRD4 gene were more likely to have a history of uncommitted sex, including one-night stands and acts of infidelity," said Garcia. "The motivation seems to stem from a system of pleasure and reward, which is where the release of dopamine comes in. In cases of uncommitted sex, the risks are high, the rewards substantial and the motivation variable – all elements that ensure a dopamine 'rush.'"

According to Garcia, these results provide some of the first biological evidence for what at first glance seems to be somewhat of a contradiction: that individuals could be looking for a serious, committed long-term relationship yet have a history of one-night stands. At the same time, the data also suggest it is reasonable that someone could be wildly in love with their partner, commit infidelity, and still be deeply attached to and care for that partner. It all came back to a DRD4 variation in these individuals: individual differences in the internal drive for a dopamine 'rush' can function independently from the drive for commitment.

"The study doesn't let transgressors off the hook," said Garcia. "These relationships are associative, which means that not everyone with this genotype will have one-night stands or commit infidelity. Indeed, many people without this genotype still have one-night stands and commit infidelity. The study merely suggests that a much higher proportion of those with this genetic type are likely to engage in these behaviors."

Garcia also cautions that the consequences of risky sexual behavior can indeed be extreme.

"One-night stands can be risky, both physically and psychologically," said Garcia. "And betrayal can be one of the most devastating things to happen to a couple. These genes do not give anyone an excuse, but they do provide a window into how our biology shapes our propensities for a wide variety of behaviors."

At this point, very little is known about how genetics and neurobiology influence one's sexual propensities and tendencies, but Garcia is hopeful that this study will add to the growing base of knowledge, in particular how genes might predispose individuals to pursue sensation seeking across all sorts of domains, from substance use to sexuality. This study also provides further support for the notion that the biological foundations of sexual desire may often operate independently from, though remain closely linked to, deep feelings of romantic attachment.

As Garcia points out, he and his team of study co-authors have only just begun to explore the issue and plan on conducting a series of follow-up and related studies.

"We want to run a larger sample of men and women to replicate these findings and check for several other possible genetic markers," said Garcia.

"We will also be conducting a number of behavioral and biological studies to better understand what kinds of associated factors motivate uncommitted sexual behavior. Most importantly, we want to explore the receiving end of infidelity by looking at how people respond to cases of uncommitted sex and infidelity."

More information: A detailed report can be found in the current issue of Public Library of Science's PLoS ONE journal. The article, "Associations between Dopamine D4 Receptor Gene Variation with Both Infidelity and Sexual Promiscuity," can be found at http://dx.plos.org … pone.0014162


Provided by Binghamton University

miércoles, 1 de diciembre de 2010

Study finds children with autism have mitochondrial dysfunction


Study finds children with autism have mitochondrial dysfunction
November 30th, 2010 in Medicine & Health / Diseases


Children with autism are far more likely to have deficits in their ability to produce cellular energy than are typically developing children, a new study by researchers at UC Davis has found. The study, published today in the Journal of the American Medical Association (JAMA), found that cumulative damage and oxidative stress in mitochondria, the cell's energy producer, could influence both the onset and severity of autism, suggesting a strong link between autism and mitochondrial defects.

After the heart, the brain is the most voracious consumer of energy in the body. The authors propose that deficiencies in the ability to fuel brain neurons might lead to some of the cognitive impairments associated with autism. Mitochondria are the primary source of energy production in cells and carry their own set of genetic instructions, mitochondrial DNA (mtDNA), to carry out aerobic respiration. Dysfunction in mitochondria already is associated with a number of other neurological conditions, including Parkinson's disease, Alzheimer's disease, schizophrenia and bipolar disorder.

"Children with mitochondrial diseases may present with exercise intolerance, seizures and cognitive decline, among other conditions. Some will manifest disease symptoms and some will appear as sporadic cases," said Cecilia Giulivi, the study's lead author and professor in the Department of Molecular Biosciences in the School of Veterinary Medicine at UC Davis. "Many of these characteristics are shared by children with autism."

The researchers stress that these new findings, which may help physicians provide early diagnoses, do not identify the cause or the effects of autism, which affects as many as 1 in every 110 children in the United States, according to the U.S. Centers for Disease Control and Prevention.

While previous studies have revealed hints of a connection between autism and mitochondrial dysfunction, these reports have been either anecdotal or involved tissues that might not be representative of neural metabolism.

"It is remarkable that evidence of mitochondrial dysfunction and changes in mitochondrial DNA were detected in the blood of these young children with autism," said Geraldine Dawson, chief science officer of Autism Speaks, which provided funding for the study. "One of the challenges has been that it has been difficult to diagnose mitochondrial dysfunction because it usually requires a muscle biopsy. If we could screen for these metabolic problems with a blood test, it would be a big step forward."

For the study, Giulivi and her colleagues recruited 10 autistic children aged 2 to 5, and 10 age-matched typically developing children from similar backgrounds. The children were randomly selected from Northern California subjects who previously had participated in the 1,600-participant Childhood Autism Risk from Genetics and the Environment (CHARGE) Study and who also consented to return for a subsequent study known as CHARGE-BACK, conducted by the UC Davis Center for Children's Environmental Health and Disease Prevention.

The children with autism met stringent diagnostic criteria for autism as defined by the two most widely used and rigorous assessment tools. Though the total number of children studied was small, it is generally representative of the much larger CHARGE cohort, and that increases the significance of the study results, the authors said.

The researchers obtained blood samples from each child and analyzed the metabolic pathways of mitochondria in immune cells called lymphocytes. Previous studies sampled mitochondria obtained from muscle, but the mitochondrial dysfunction sometimes is not expressed in muscle. Muscle cells can generate much of their energy through anaerobic glycolysis, which does not involve mitochondria. By contrast, lymphocytes, and to a greater extent brain neurons, rely more heavily on the aerobic respiration conducted by mitochondria.

The researchers found that mitochondria from children with autism consumed far less oxygen than mitochondria from the group of control children, a sign of lowered mitochondrial activity. For example, the oxygen consumption of one critical mitochondrial enzyme complex, NADH oxidase, in autistic children was only a third of that found in control children.

"A 66 percent decrease is significant," Giulivi said. "When these levels are lower, you have less capability to produce ATP (adenosine triphosphate) to pay for cellular work. Even if this decrease is considered moderate, deficits in mitochondrial energy output do not have to be dismissed, for they could be exacerbated or evidenced during the perinatal period but appear subclinical in the adult years."

Reduced mitochondrial enzyme function proved widespread among the autistic children. Eighty percent had lower NADH oxidase activity than controls, while 60 percent, 40 percent and 30 percent had low activity in succinate oxidase, ATPase and cytochrome c oxidase, respectively. The researchers went on to isolate the origins of these defects by assessing the activity of each of the five enzyme complexes involved in mitochondrial respiration. Complex I was the site of the most common deficiency, found in 60 percent of autistic subjects, and occurred five out of six times in combination with Complex V. Other children had problems in Complexes III and IV.
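For readers who want the article's figures in one place, the short sketch below simply tabulates the reported rates of reduced enzyme activity and shows the arithmetic behind the "66 percent decrease" quoted earlier (activity at one third of control levels). The percentages are taken from the text above, not from the JAMA paper directly.

```python
# Rates of reduced enzyme-complex activity among the children with autism,
# as reported in the article's text.
reduced_activity = {
    "NADH oxidase": 0.80,          # 80% of the autistic children affected
    "succinate oxidase": 0.60,
    "ATPase": 0.40,
    "cytochrome c oxidase": 0.30,
}

# "Only a third" of control-level oxygen consumption, as a percent decrease:
autistic_over_control = 1 / 3
percent_decrease = int((1 - autistic_over_control) * 100)  # 66.7 truncated to 66
print(f"NADH oxidase: {percent_decrease}% decrease versus controls")
```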

Levels of pyruvate, the raw material mitochondria transform into cellular energy, also were elevated in the blood plasma of autistic children. This suggests the mitochondria of children with autism are unable to process pyruvate fast enough to keep up with the demand for energy, pointing to a novel deficiency at the level of an enzyme named pyruvate dehydrogenase.

Mitochondria also are the main intracellular source of oxygen free radicals. Free radicals are very reactive species that can harm cellular structures, including DNA. Cells are able to repair typical levels of such oxidative damage. Giulivi and her colleagues found that hydrogen peroxide levels in autistic children were twice as high as in normal children. As a result, the cells of children with autism were exposed to higher oxidative stress.

Mitochondria often respond to oxidative stress by making extra copies of their own DNA. The strategy helps ensure that some normal genes are present even if others have been damaged by oxidation. The researchers found higher mtDNA copy numbers in the lymphocytes of half of the children with autism. These children carried equally high numbers of mtDNA sets in their granulocytes, another type of immune cell, demonstrating that these effects were not limited to a specific cell type. Two of the five children also had deletions in their mtDNA genes, whereas none of the control children showed deletions.

Taken together, the various abnormalities, defects and levels of malfunction measured in the mitochondria of autistic children imply that oxidative stress in these organelles could be influencing autism's onset.

"The various dysfunctions we measured are probably even more extreme in brain cells, which rely exclusively on mitochondria for energy," said Isaac Pessah, director of the Center for Children's Environmental Health and Disease Prevention, a UC Davis MIND Institute researcher and professor of molecular biosciences at the UC Davis School of Veterinary Medicine.

Giulivi cautions that these findings do not amount to establishing a cause for autism.

"We took a snapshot of the mitochondrial dysfunction when the children were 2-to-5 years old. Whether this happened before they were born or after, this study can't tell us," she said. "However, the research furthers the understanding of autism on several fronts and may, if replicated, be used to help physicians diagnose the problem earlier."

"Pediatricians need to be aware of this issue so that they can ask the right questions to determine whether children with autism have vision or hearing problems or myopathies," Giulivi said. Exercise intolerance in the form of muscle cramps during intensive physical activity is one of the characteristics of mitochondrial myopathies.

The chemical fingerprints of mitochondrial dysfunction also may hold potential as a diagnostic tool. Giulivi and colleagues are now examining the mitochondrial DNA of their subjects more closely to pinpoint more precise differences between autistic and non-autistic children.

"If we find some kind of blood marker that is consistent with and unique to children with autism, maybe we can change the way we diagnose this difficult-to-assess condition," she said.

The study also helps refine the search for autism's origins.

"The real challenge now is to try and understand the role of mitochondrial dysfunction in children with autism," Pessah said. "For instance, many environmental stressors can cause mitochondrial damage. Depending on when a child was exposed, maternally or neonatally, and how severe that exposure was, it might explain the range of the symptoms of autism."

"This important exploratory research addresses in a rigorous way an emerging hypothesis about potential mitochondrial dysfunction and autism," said Cindy Lawler, program director at the National Institute of Environmental Health Sciences (NIEHS), which provided funding for the study. "Additional research in this area could ultimately lead to prevention or intervention efforts for this serious developmental disorder."

More information: JAMA. 2010;304(21):2389-2396.

Tuesday, November 30, 2010

Hormone oxytocin bolsters childhood memories of mom's affections



November 29th, 2010 in Medicine & Health / Research


Researchers have found that the naturally occurring hormone and neurotransmitter oxytocin intensifies men's memories of their mother's affections during childhood. The study was published today in Proceedings of the National Academy of Sciences.

Researchers at the Seaver Autism Center for Research and Treatment at Mount Sinai School of Medicine wanted to determine whether oxytocin, a hormone and neurotransmitter known to regulate attachment and social memory in animals, is also involved in human attachment memories. They conducted a randomized, double-blind, placebo-controlled, crossover trial, giving 31 healthy adult men oxytocin or a placebo delivered nasally on two occasions. Prior to administering the drug or placebo, the researchers measured the men's attachment style. About 90 minutes after administering the oxytocin or the placebo, the researchers assessed participants' recollection of their mother's care and closeness in childhood.

They found that men who were less anxious and more securely attached remembered their mothers as more caring and remembered being closer to their mothers in childhood when they received oxytocin, compared to when they received placebo. However, men who were more anxiously attached remembered their mothers as less caring and remembered being less close to their mothers in childhood when they received oxytocin, compared to when they received placebo. These results were not due to more general effects of oxytocin on mood or well-being.

"These results may seem surprising because researchers have assumed that the neuromodulator oxytocin has ubiquitous positive effects on social behavior and social perception in humans," said Jennifer Bartz, PhD, Assistant Professor, Psychiatry, Mount Sinai School of Medicine, and lead author of the study. "The fact that oxytocin did not make all participants remember their mother as more caring, but in fact intensified the positivity or negativity of the men's pre-existing memories, suggests that oxytocin plays a more specific role in these attachment representations. We believe that oxytocin may help people form memories about important social information in their environment and attach incentive value to those memories.

"However, we do not know whether oxytocin, when administered in drug form, increases a person's ability to accurately recall their mother's affections in childhood, or sets in motion a biased search for memories that support their more general beliefs about close relationships."

The ability to bond with our caregivers early in life has long been thought to be critical to survival because these bonds ensure caregiver protection for the otherwise defenseless infant.

"We know very little about the biological mechanisms that support human attachment bonds, but understand that oxytocin regulates attachment in animals, and plays a specific role in forming social memories," said Dr. Bartz. "Our study suggests that oxytocin may similarly play a key role in human attachment by modulating these early memories of mom."

Provided by The Mount Sinai Hospital

A molecular switch for memory and addiction

November 29th, 2010 in Medicine & Health / Research


Learning and memory formation are based on the creation of new connections between neurons in the brain. Behaviors such as nicotine addiction also manifest themselves in long-term changes of neuronal connectivity and can – at least in this respect – be viewed as a form of learning. A team led by Pierluigi Nicotera, scientific director of the German Center for Neurodegenerative Diseases (DZNE), together with collaborating laboratories at the MRC in the UK and the University of Modena in Italy, has now discovered a molecular switch that plays a crucial role in establishing addictive behavior and memory processes. These results may contribute to new strategies for preventing memory loss or treating addictive behavior. The study was published online in The EMBO Journal on November 26th.

Neuronal signals are passed from one nerve cell to the next in the form of chemical compounds called neurotransmitters. This signal transmission is the first step in, and a prerequisite for, any learning process in the brain. It induces a sequence of events in the downstream cell that eventually leads to changes in neuronal connectivity and thus to memory consolidation. Nicotine and cocaine can trigger the rearrangement of brain connections in an equivalent manner.

A first step in the induction of neuronal plasticity – the formation of new connections in the brain – involves calcium. As a response to neurotransmitters, nicotine or cocaine, calcium increases at the site of neuronal connection, the synapse. In a second step, this calcium increase will induce gene expression – the synthesis of proteins that will lead to new or reinforced synaptic connectivity. It has been generally accepted that the increase of calcium is only part of the first step in this process and does not depend on gene expression. Pierluigi Nicotera and his colleagues now challenge this idea. Their study shows that the expression of genes involved in calcium signaling is required to induce plasticity in nerve cells after repeated stimulation with nicotine or cocaine.

The scientists found that nicotine administration to mice induces the expression of a gene called type 2 ryanodine receptor (RyR2). The RyR2 protein releases calcium from the cell's internal calcium store, the endoplasmic reticulum, leading to a long-lasting, self-sustained reinforcement of calcium signaling. This sustained calcium increase then leads to neuronal plasticity. Specifically, RyR2 is expressed in a number of brain areas associated with cognition and addiction, such as the cortex and ventral midbrain, suggesting that RyR2 induction plays a pivotal role in these processes. This idea was confirmed in an additional experiment, in which the authors of the study demonstrate that a reduction of RyR2 activation in living animals abolishes behavior associated with learning, memory and addiction. This shows that RyR2 is absolutely required to develop the long-term changes in the brain that lead to addiction.

These results are a major step forward in understanding the molecular processes underlying memory and addiction. In the long run, the scientists hope that these insights will contribute to the development of therapies for the treatment of addictive disorders, or to strategies to counteract memory loss in neurodegenerative diseases such as Alzheimer's disease.

More information: Elena Ziviani, Giordano Lippi, Daniele Bano, Eliana Munarriz, Stefania Guiducci, Michele Zoli, Kenneth W Young and Pierluigi Nicotera. Ryanodine receptor-2 upregulation and nicotine-mediated plasticity. EMBO Journal, published online on 26th November 2010. doi: 10.1038/emboj.2010.279


Provided by Helmholtz Association of German Research Centres

Friday, November 26, 2010

Researchers identify a molecular switch that controls neuronal migration in the developing brain



November 25th, 2010 in Medicine & Health / Neuroscience
Research led by David Solecki, Ph.D., of St. Jude Children's Research Hospital identifies a molecular switch that controls neuronal migration in the developing brain. The study's findings offer insight into the origins of epilepsy, mental retardation and possibly brain tumor metastasis. Credit: St. Jude Children's Research Hospital

St. Jude Children's Research Hospital investigators have identified key components of a signaling pathway that controls the departure of neurons from the brain niche where they form and allows these cells to start migrating to their final destination. Defects in this system affect the architecture of the brain and are associated with epilepsy, mental retardation and perhaps malignant brain tumors.

The findings provide insight into brain development as well as clues about the mechanism at work in other developing tissues and organ systems, particularly the epithelial tissue that covers body surfaces. The report appears November 25 in the journal Science online at the Science Express website.

"Neurons are born in germinal zones in the brain, and the places they occupy in the mature brain are sometimes quite a distance away. The cells have to physically move to get to that final destination," said David Solecki, Ph.D., an assistant member of the St. Jude Department of Developmental Neurobiology and the paper's senior author. "If the process is compromised, the result is devastating disruption of brain circuitry that specifically targets children."

In this study, investigators identified not only the molecular complexes that work antagonistically to control departure of brain cells from germinal zones, but also the adhesion molecule that functions as the cells' exit ticket. Solecki and his colleagues showed that high levels of Siah E3 ubiquitin ligase block neuronal departure by tagging a critical part of the cell's migration machinery for degradation through a process known as ubiquitination. Siah's target is Pard3A, which is part of the PAR complex.

By manipulating levels of both Siah and Pard3A, researchers showed that only when neuronal production of Siah falls and Pard3A rises will the cells move out of the germinal zone. The change prompts the cells to alter their migratory path and move toward the location where they will incorporate into the brain's circuitry. The findings mark the first instance of PAR complex activity being regulated by a ubiquitin-targeting protein like Siah.

Investigators used a technique called time-lapse microscopy to directly observe and document the process in the developing cerebellum, the region responsible for balance and fine-tuning body movements. Neurons are the specialized cells that make up the nervous system.

Investigators went on to show that Siah-Pard3A regulates neuronal migration via the adhesion molecule JAM-C, which is short for junctional adhesion molecule C. Researchers demonstrated that silencing JAM-C production in the neurons or preventing JAM-C binding to Pard3A blocked neuronal migration out of the germinal zone.

A similar system at work in epithelial cells relies on JAM-C to keep cells together in a process that also requires the adhesion molecule to bind to the PAR complex, Solecki said. But this is the first report of such mechanisms at work in the developing brain.

Earlier work from the laboratory of Solecki and others showed neurons migrate to their final location by moving along thin fibers produced by brain cells known as glial cells. This study suggests that JAM-C expression on the surface of developing neurons allows the cells to interact with their environment to reach the glial cells. "Without JAM-C, neurons do not move to their final position," he explained.

The researchers developed a fluorescent probe that when combined with time-lapse microscopy made real-time viewing of cell-to-cell binding possible for the first time. "Until now, cell adhesion was difficult to detect and the techniques involved were laborious," Solecki said. "With this approach, it is almost as if the cells are telling us what they are doing. It was very exciting for me to look at a dish of living neurons and see adhesion occur for the first time."

The findings may also offer clues about the spread of malignant brain tumors. Solecki noted that some types of the most common pediatric brain tumor, medulloblastoma, share similarities with immature neurons and seemingly fail to depart the cerebellar germinal zone. Solecki said Siah and Pard3A might provide insight into the mechanisms involved.

Provided by St. Jude Children's Research Hospital

Wednesday, November 24, 2010

Method to erase traumatic memories may be on the horizon

November 23rd, 2010 in Medicine & Health / Neuroscience


Soldiers haunted by scenes of war and victims scarred by violence may wish they could wipe the memories from their minds. Researchers at the Johns Hopkins University say that may someday be possible.

A commercial drug remains far off - and its use would be subject to many ethical and practical questions. But scientists have laid a foundation with their discovery that proteins can be removed from the brain's fear center to erase memories forever.

"When a traumatic event occurs, it creates a fearful memory that can last a lifetime and have a debilitating effect on a person's life," says Richard L. Huganir, professor and chair of neuroscience in the Hopkins School of Medicine. He said his finding on the molecular process "raises the possibility of manipulating those mechanisms with drugs to enhance behavioral therapy for such conditions as post-traumatic stress disorder."

The research has drawn interest from some involved in mental health care, and some concern.

Kate Farinholt, executive director of the mental health support and information group NAMI Maryland, said many people suffering from a traumatic event might benefit from erasing a memory. But there are a lot of unanswered questions, she said.

"Erasing a memory and then everything bad built on that is an amazing idea, and I can see all sorts of potential," she said. "But completely deleting a memory, assuming it's one memory, is a little scary. How do you remove a memory without removing a whole part of someone's life, and is it best to do that, considering that people grow and learn from their experiences."

Past research already had shown that a specific form of behavior therapy seemed to erase painful memories. But relapse was possible because the memory wasn't necessarily gone.

By looking at that process, Huganir and postdoctoral fellow Roger L. Clem discovered a "window of vulnerability" when unique receptor proteins are created. The proteins mediate signals traveling within the brain as painful memories are made. Because the proteins are unstable, they can be easily removed with drugs or behavior therapy during the window, ensuring the memory is eliminated.

Researchers used mice to find the window, but believe the process would be the same in humans. They conditioned the rodents with electric shocks to fear a tone. The sound triggered creation of the proteins, called calcium-permeable AMPARs, which formed for a day or two in the fear center, or amygdala, of the mice's brains.

The researchers are working on ways to reopen the window down the road by recalling the painful memory, and using medication to eliminate the protein. That's important because doctors often don't see victims immediately after a traumatic event. PTSD, for example, can surface months later.

Huganir, whose report on erasing fear memories in rodents was published online last month by Science Express, also believes that the window may exist in other centers of learning and may eventually be used to treat pain or drug addiction.

Connie Walker, a Leonardtown, Md., mother of an Iraq war veteran suffering from PTSD, said there isn't enough attention given to the injuries of service members in general and she specifically supports research into PTSD-related therapy. But Walker, a 23-year-Navy veteran herself, said she wouldn't want her son to take a medication to erase what he witnessed.

She said her son began functioning well after he was finally able to get therapy, which she said should be more readily available to every wounded veteran.

"My gut reaction to a drug that erases memories forever is to be frightened," she said. "A person's memory is very much a part of who they are. I recognize we all have some bad memories, though I doubt they can compete with what's coming back from Iraq and Afghanistan. But how can a drug like that be controlled? What else gets eliminated accidentally?"

For now, there aren't yet drugs to erase memories. But there are medications also targeting the amygdala and used with behavior therapy that can lessen the emotional response to painful memories in those with PTSD, such as propranolol, a beta blocker commonly used to treat hypertension.

Paul Root Wolpe, director of the Center for Ethics at Emory University in Atlanta, says permanently erasing memories in humans, if it can be done, wouldn't be a lot different ethically than such behavior modification. Both are memory manipulation. But he said erasing memories is fraught with many more potential pitfalls.

He also said that PTSD sufferers, such as service members in Iraq and Afghanistan, frequently experience more than one traumatic event, and trying to eliminate all the memories could significantly alter a person's personality and history. So could forgetting a whole person after a painful loss or breakup, as depicted in the 2004 movie "Eternal Sunshine of the Spotless Mind."

Wolpe said it can be called dementia when someone forgets that much of their past.

"I don't know what it means to erase that much of a person's life," he said. "You'd leave a giant hole in a person's history. I tend to doubt you'd even be able to."

Further, he said, the safeguards necessary to protect the process from abuse would be difficult. Inmates or soldiers in danger of capture could be subjected to it, for example. Many questions should be decided before testing is pursued in humans, because its use may become "too tempting," he said.

Wolpe could see only limited uses for erasing a memory for now, such as for those suffering after a rape or single terrifying event.

"Certainly, there may be appropriate applications," he said. "But human identity is tied into memory. It creates our distinctive personalities. It's a troublesome idea to begin to be able to manipulate that, even if for the best of motives."

Saturday, November 20, 2010

Brain–machine interfaces: See what you want to see

Leonie Welberg

Abstract
Visual images that we associate with a familiar concept activate neurons in the medial temporal lobe (MTL) that encode that concept. Now, Cerf, Koch, Fried and colleagues show that when multiple images are viewed simultaneously, humans can use conscious thought to regulate the activity of MTL neurons encoding different concepts, indicating that internal, cognitive processes can override neuronal activation induced by sensory input.

Source: Nature Reviews Neuroscience
http://www.nature.com/nrn/journal/v11/n12/full/nrn2958.html

Friday, November 19, 2010

The science of decisions


November 18th, 2010 in Medicine & Health / Psychology & Psychiatry


You may not realize it, but you just made a decision: namely, to read (or at least start to read) this article. Why? What process just occurred in your brain to cause you to be reading this sentence right now? How and why did you make that decision at that moment? That's what Joe Kable, Assistant Professor of Psychology, wants to know. He studies the neurological and psychological workings of choice. "What are the processes that are going on in the brain while people are making decisions; what are the computations that are being performed in different areas of the brain during decision-making?" he asks. "That's something that neuroscientists can study using techniques of neuroscience."

One of those techniques is functional MRI (fMRI), which can show in real time the blood flow variations to different parts of the brain that are associated with increased or decreased activity in those areas. By placing test subjects in an MRI scanner and then presenting them with carefully constructed tasks involving decision making, Kable is able to observe the ensuing physical activity inside the brain. "fMRI gives us probably the best combination of spatial and temporal resolution in a human being to get a measure of the neural activity that's occurring during a psychological process," Kable says.

To ensure that his experimental subjects take things seriously, Kable ties the decisions they make to actual rewards (i.e., money)—which also helps him to study how the subjective value people place on the consequences and payoffs of their choices can vary among individuals. Kable explains, "One kind of decision that I study is an impulsive decision with regard to the future. I give people a choice: do you want $20 now or $30 in a month? Some people really want the money today, and other people are willing to wait for the larger amount of money in the future. Those differences between the people who are willing to wait and those who aren't are related to differences in how the striatum and the prefrontal cortex are active during these decisions."

Kable observes that the range of individual differences involving such choices can be huge. "There are some individuals who will take 21 dollars in two months over 20 dollars today, and there are others who will take 20 dollars today over 150 dollars in two months," he notes. There also seems to be at least some indication that personality type plays a factor in how someone makes such a decision. "Of the subjects that I've studied, the person who was most patient was a medical resident and was planning toward the future, and the subject who was most impatient was someone who sent me pictures of their skydiving expedition when they were done with the experiment."
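Choices like "$20 now or $30 in a month" are commonly modeled in the decision-science literature with hyperbolic discounting, V = A / (1 + kD), where a personal parameter k captures how steeply delayed rewards lose value. The article does not name this model; the sketch below is an illustration of how such individual differences are often quantified, with hypothetical k values:

```python
def discounted_value(amount: float, delay_days: float, k: float) -> float:
    """Subjective present value of a delayed reward under hyperbolic
    discounting: V = A / (1 + k * D). Larger k = more impatient."""
    return amount / (1.0 + k * delay_days)

def prefers_delayed(now: float, later: float, delay_days: float, k: float) -> bool:
    """True if the delayed reward's discounted value beats the immediate one."""
    return discounted_value(later, delay_days, k) > now

# A patient chooser (small k) waits for $30 in 30 days over $20 now,
# since 30 / (1 + 0.005 * 30) ≈ $26.09 > $20; an impatient chooser
# (large k) takes the $20, since 30 / (1 + 0.05 * 30) = $12.00 < $20.
print(prefers_delayed(20, 30, 30, k=0.005))  # → True
print(prefers_delayed(20, 30, 30, k=0.05))   # → False
```

Fitting k to a subject's pattern of choices gives a single number for the impatience that Kable relates to striatal and prefrontal activity.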

Sometimes posing a question in a different way or in a different context can affect a person's decision-making processes and change their minds. Kable says, "One of the things we want to know is, when you change your mind, does that change how those areas are active? We're running fMRI experiments to see how that reorganizes the brain network involved in decision making."

While fMRI can demonstrate associations between neural activity and behavior, it can't quite establish a direct cause-and-effect link. For that, Kable is using other techniques, such as transcranial direct current stimulation, in which a weak electrical current is applied between electrodes placed on the head. "You can make an area more excited or less excited, and if doing that alters the decisions that people make, you have evidence that that brain region is playing a causal role," says Kable.

Even with such powerful tools at his disposal and the insights they've already granted, Kable admits that much work remains to be done. "There's more than enough just to understand the decisions people make and the conditions that lead to those decisions, so adding the additional degree of difficulty of linking that to the underlying neurobiology means I'll be busy for a while," he says with a laugh. "I don't think that we'll have this problem solved completely anytime soon." Which means that the exact reasons for your choice of reading material are likely to remain somewhat mysterious—at least for now.

Provided by University of Pennsylvania

Thursday, November 18, 2010

Differences in brain development between males and females may hold clues to mental health disorders




November 17th, 2010 in Medicine & Health / Neuroscience


Many mental health disorders, such as autism and schizophrenia, produce changes in social behavior or interactions. The frequency and/or severity of these disorders is substantially greater in boys than girls, but the biological basis for this difference between the two sexes is unknown.

Researchers at the University of Maryland School of Medicine have discovered differences in the development of the amygdala region of the brain – which is critical to the expression of emotional and social behaviors – in animal models that may help to explain why some mental health disorders are more prevalent among boys. They also found a surprising variable – a difference between males and females in the level of endocannabinoid, a natural substance in the brain that affected their behavior, specifically how they played.

The study results have been published online this month in the Proceedings of the National Academy of Sciences.

Margaret M. McCarthy, Ph.D., the senior author and a professor of physiology and psychiatry at the University of Maryland School of Medicine, says, "Our findings help us to better understand the differences in brain development between males and females that may eventually provide the biologic basis for why some mental health conditions are more prevalent in males. We need to determine if these neural differences in the developing brain that we've seen in rats may cause similar behavioral effects in human babies."

Dr. McCarthy and her colleagues found that female rats have about 30 to 50 percent more glial cells in the amygdala region of the temporal lobe of the brain than their male litter mates. They also found that the females had lower amounts of endocannabinoids, which have been dubbed the brain's own marijuana because they activate cannabinoid receptors that are also stimulated by THC, the main psychoactive ingredient of cannabis.

The researchers also found that female rats played 30 to 40 percent less than male rats. However, when newborn female rats were given a cannabis-like compound to stimulate their natural endocannabinoid system, their glial cell production decreased and they displayed increased play behavior later as juveniles. In fact, the level of play exhibited by females treated with the cannabis-like compound was very similar to levels in male rats, the researchers found. Yet exposure to this compound did not appear to have any discernible effect on newborn male rats.

Dr. McCarthy, who is also associate dean for Graduate Studies and interim chair of the Department of Pharmacology & Experimental Therapeutics, notes, "We have never before seen a sex difference such as this in the developing brain involving cell proliferation in females that is regulated by endocannabinoids."

E. Albert Reece, M.D., Ph.D., M.B.A., vice president of medical affairs at the University of Maryland and dean of the University of Maryland School of Medicine, says, "The results of this study provide important clues to brain differences between males and females and may increase our knowledge about how these differences may affect both normal and aberrant brain development, thereby enhancing our understanding of many mental health disorders."

Provided by University of Maryland Medical Center