Sunday, October 31, 2010

Emotion processing in the brain is influenced by the color of ambient light

October 31st, 2010 in Medicine & Health / Neuroscience

We are all aware that a bright day may lift our mood. However, the brain mechanisms involved in such effects of light are largely unknown. Researchers at the Cyclotron Research Centre (University of Liege), the Geneva Center for Neuroscience and Swiss Center for Affective Sciences (University of Geneva), and the Surrey Sleep Research Centre (University of Surrey) investigated the immediate effect of light, and of its color composition, on emotional processing in the brain using functional magnetic resonance imaging. The results of their study (in Proceedings of the National Academy of Sciences of the USA) show that the color of light influences the way the brain processes emotional stimuli.

Brain activity of healthy volunteers was recorded while they listened to “angry voices” and “neutral voices” and were exposed to blue or green light. Blue light not only increased responses to emotional stimuli in the “voice area” of the brain and in the hippocampus, which is important for memory processes, but also led to a tighter interaction between the voice area, the amygdala, a key area in emotion regulation, and the hypothalamus, which is essential for the regulation of biological rhythms by light (see figure). This demonstrates that blue light affected the functional organisation of the brain.

The acute effects of ambient light on emotional processing might differ from its longer-lasting effects on mood, but the present findings in healthy subjects have important implications for our understanding of the mechanisms by which changes in the lighting environment could improve mood, not only in mood disorders treated with light therapy, but also in day-to-day life, through greater attention to our light environment at home and in the workplace.

Saturday, October 30, 2010

The unhealthy ego: What can neuroscience tell us about our 'self'?

October 28th, 2010 in Medicine & Health / Neuroscience

With Election Day right around the corner, political egos are on full display. One might even think that possessing a "big ego" is a prerequisite for success in politics, or in any position of leadership. High achievers–CEOs, top athletes, rock stars, prominent surgeons, or scientists–often seem to be well endowed in ego.

But when does a "healthy ego" cross the line into unhealthy territory? Where is the line between confident, positive self-image and grandiose self-importance, which might signal a personality disorder or other psychiatric illness? More fundamentally, what do we mean by ego, from a neural perspective? Is there a brain circuit or neurotransmitter system underlying ego that is different in some people, giving them too much or too little?

What is Ego?

What ego is depends largely on who you ask. Philosophical and psychological definitions abound. Popularly, ego is generally understood as one's sense of self-identity or how we view ourselves. It may encompass self-confidence, self-esteem, pride, and self-worth, and is therefore influenced by many factors, including genes, early upbringing, and stress.

The popular concept of ego is a far cry from what Sigmund Freud elaborated at the turn of the 20th century in his seminal work on psychoanalytical theory. Freud distinguished between primary (id) and secondary (ego) cognitive systems and proposed that the id, or unconscious, was characterized by a free exchange of neural energy and more primitive or animistic thinking. It was the job of the ego, the conscious mind, to minimize that free energy, to "bind" it and thereby regulate the impulses of the unconscious. It was Freud's attempt to "link the workings of the unconscious mind to behavior," says Joseph T. Coyle, M.D., chair of psychiatry and neuroscience at Harvard Medical School/McLean Hospital and a Dana Alliance for Brain Initiatives member.

Ego constructs continue to be used in some psychoanalytical therapies, but beyond that, the term seems to be falling out of favor in modern psychiatry. ("Ego is so last century," quips Coyle.) Dana Alliance member Jerome Kagan, Ph.D., professor emeritus of psychology at Harvard, says: "Ego is a terrible word. In Freudian theory, ego has a meaning–not a very precise one, but a meaning. But you can't take the word ego out of Freudian theory and apply it in non-Freudian ways. It just doesn't work."

According to psychiatrist John M. Oldham, M.D., chief of staff at Baylor College of Medicine's Menninger Clinic and President-elect of the American Psychiatric Association (APA), terms like sense of self or self-identity are more common today. The new diagnostic criteria for personality disorders being developed for the revised APA Diagnostic and Statistical Manual of Mental Disorders (DSM-5) will reflect this newer language, he says.

Where's the Ego in Neuroscience?

If ego is loosely defined in psychiatric circles, a neural definition is virtually nonexistent. "Ego doesn't exist in the brain," says Kagan. What does exist, he explains, is a brain circuit that controls the intrusiveness of feelings of self-doubt and anxiety, which can modulate self-confidence. But, Kagan says, "We are nowhere near naming the brain circuit that might mediate the feeling of 'God, I feel great; I can conquer the world.' I believe it's possible to do, but no one knows that chemistry or that anatomy."

Dana Alliance member Joseph LeDoux, Ph.D., a neurobiologist at New York University, has argued that psychological constructs such as ego are not incompatible with modern neuroscience; scientists just need to come up with better ways of thinking about the self and its relation to the brain. "For many people, the brain and the self are quite different," he writes in The Synaptic Self, a view the book then argues against. For LeDoux, it's a truism that our personality—who we are in totality—is represented in the brain as a complex pattern of synaptic connectivity, because synapses underlie everything the brain does. "We are our synapses," he says.

Researchers are increasingly applying the tools of modern neuroscience to try to understand how the brain represents self and other aspects of ego as popularly defined—they just don't call it ego. Brain-imaging studies have used self-reference experiments to investigate the neurobiology of self: for example, asking a subject to judge a self-referential statement, such as "I am a good friend," versus a self-neutral statement, such as "water is necessary for life." Others have looked at brain pathology in people with disorders of self. These studies have fairly consistently linked self-referential mental activity to the medial prefrontal cortex, a subregion of the frontal lobe where higher-order cognitive functions are processed.

The medial prefrontal cortex is the locus of the brain's "default mode" network, where metabolic activity is highest when the brain is not actively engaged in a task. During task performance, default mode activity decreases. Washington University neuroimaging pioneer and Dana Alliance member Marcus E. Raichle, M.D., first reported the default mode and has argued that default-state activity may hold clues to the neurobiology of self (Gusnard, 2001).

Could Raichle's default mode state be Freud's ego? Robin Carhart-Harris and Karl Friston of Imperial College London explored that question in a recent article in Brain (Carhart-Harris, 2010), where they proposed that the Freudian ideas of primary and secondary cognitive processes (corresponding to the id and the ego, respectively) "fit comfortably with modern notions of functional brain architecture, at both a computational and neurophysiological level." Acknowledging the "ambitious" nature of that thesis, the authors reviewed a large body of evidence to support it. Freud's theory that ego represses id is consistent, they argued, both with the default mode's characteristic ebb and flow of neuronal activity in opposition to neuronal firing in other brain areas and with theories about the hierarchy of brain systems (e.g., the cortical "thinking" brain is higher-order and therefore regulates the subcortical "primitive" brain).

The Disordered Self

Clues about the neurobiological underpinnings of self can also be seen in psychopathology. "There is a whole range of disorders in which self-identity is affected, in the sense of 'Who am I?' and 'How am I distinguished from those around me and things occurring around me?'" says Coyle.

The delusions of schizophrenia, for example, have been described as a loss of ego boundaries. Patients may interpret neutral events as being self-referential or may be unable to distinguish what's happening "in here" from "out there," as in the case of auditory hallucinations. These disruptions are thought to be linked to structural changes seen in the brains of people with schizophrenia, including smaller cortical neurons that have fewer connections than normal (van der Meer, 2010).

In frontotemporal dementia (FTD), a key feature is loss of self-awareness or self-identity, sometimes to the point of a complete shift in personality (Sturm, 2008). Imaging studies have revealed severe abnormalities in frontal regions among FTD patients with the most dramatic changes, further supporting the frontal lobe's role in mediating self (Butcher, 2001).

Narcissistic Personality Disorder is characterized by grandiose self-importance and such extreme preoccupation with self that "you lose the capacity to see things through other people's eyes," says Oldham. In contrast, people with Borderline Personality Disorder characteristically lack a strong sense of identity and sometimes get intrusively close to other people, "as if they're putting on the costume of somebody else's personality," he says. In autism, the representation of self may appear to be wholly absent or greatly exaggerated, to the extent that others are under-recognized (Lombardo, 2010).

The manic phase of bipolar disorder is often marked by grandiosity, which represents "the extreme of what we would call egocentricity, a logarithmic multiplication of extreme narcissism," says Oldham. Depression, conversely, often goes hand in hand with extremely low self-esteem.

All personality traits exist on a continuum, Oldham points out, with extremes at either end that sometimes cross the line into psychopathological behavior. The key determinant of whether that line has been crossed is the degree of disruption to interpersonal relationships and daily activities. Who goes over the line and who doesn't involves a complex interplay of genetic factors—comprising up to 50 percent of the risk—and environmental triggers, mostly related to stress. Beyond that, there are many more questions than answers.

"We're just beginning to understand this," says Kagan. "There are no firm facts yet. We have some hints, but at this point everything is up for grabs."

References:

(1) See for example: Gusnard DA, Akbudak E, Shulman GL, Raichle ME. Medial prefrontal cortex and self-referential mental activity: relation to a default mode of brain function. Proceedings of the National Academy of Sciences 2001; 98(7):4259-64.

(2) Carhart-Harris RL, Friston KJ. The default-mode, ego-functions and free-energy: a neurobiological account of Freudian ideas. Brain 2010;133:1265-83.

(3) For a review, see: van der Meer L, Costafreda S, David AS. Self-reflection and the brain: a theoretical review and meta-analysis of neuroimaging studies with implications for schizophrenia. Neuroscience and Biobehavioral Reviews 2010;34(6):935-40.

(4) For a review, see: Sturm VE, Ascher EA, Miller BL, Levenson RW. Diminished self-conscious emotional responding in frontotemporal lobar degeneration patients. Emotion 2008;8(6): 861-9.

(5) See for example: Butcher J. Self-image contained within right frontal lobe. The Lancet 2001;357(9267):1505.

(6) See for example: Lombardo MV, Baron-Cohen S. Unraveling the paradox of the autistic self. WIREs Cognitive Science 2010;1(3):393-403.

Friday, October 29, 2010

Dynamic epigenetic regulation in neurons

Nature Neuroscience 13, 1330 - 1337 (2010)
Published online: 26 October 2010 | doi:10.1038/nn.2671

Dynamic epigenetic regulation in neurons: enzymes, stimuli and signaling pathways
Antonella Riccio

The development and function of neurons require the regulated expression of large numbers of very specific gene sets. Epigenetic modifications of both DNA and histone proteins are now emerging as fundamental mechanisms by which neurons adapt their transcriptional response to developmental and environmental cues. In the nervous system, the mechanisms by which extracellular signals regulate the activity of chromatin-modifying enzymes have just begun to be characterized. In this Review, I discuss how extracellular cues, including synaptic activity and neurotrophic factors, influence epigenetic modifications and regulate the neuronal transcriptional response. I also summarize additional mechanisms that induce chromatin remodeling events by combinatorial assembly of multiprotein complexes on neuronal gene promoters.

Wednesday, October 27, 2010

First Torun Neuroculture Event 2011

An event on music and the brain, bringing together neuroscientists and cognitive scientists who play music and scientists who study music and the brain as a research area (symposium and concerts).

Blind people perceive touch faster than those with sight

October 26th, 2010 in Medicine & Health / Neuroscience

People who are blind from birth are able to detect tactile information faster than people with normal vision, according to a study in the Oct. 27 issue of The Journal of Neuroscience.

The brain requires a fraction of a second to register a sight, sound, or touch. In this study, a group of researchers led by Daniel Goldreich, PhD, of McMaster University explored whether people who have a special reliance on a particular sense — in the way blind people rely on touch — would process that sense faster.

"Our findings reveal that one way the brain adapts to the absence of vision is to accelerate the sense of touch," Goldreich said. "The ability to quickly process non-visual information probably enhances the quality of life of blind individuals who rely to an extraordinary degree on the non-visual senses."

The authors tested the tactile skills of 89 people with sight and 57 people with various levels of vision loss. The volunteers were asked to discern the movements of a small probe that was tapped against the tips of their index fingers. Both groups performed the same on simple tasks, such as distinguishing small taps versus stronger taps. But when a small tap was followed almost instantly by a larger and longer-lasting vibration, the vibration interfered with most participants' ability to detect the tap — a phenomenon called masking. However, the 22 people who had been blind since birth performed better than both people with vision and people who had become blind later in life.

"We think interference happens because the brain has not yet completed the neural processing required to fully perceive the tap before the vibration arrives and disrupts it," Goldreich said. "The more time between the tap and the vibration, the more formed the perception of the tap will be, and the less interference the vibration will cause."

The authors measured the minimum amount of time needed for participants to perceive sensory input by varying the period between the tap and the vibration. They found that congenitally blind people required shorter periods than anyone else. Those same individuals also read Braille fastest. The authors note that each blind person's perception time was approximately equal to the average time that person took to move a finger from one Braille character to the next as they read.
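
The logic of this interval-varying procedure can be sketched as a toy psychometric model. Everything below is illustrative rather than taken from the study (the logistic shape, the 60 ms midpoint, and the 75% detection criterion are assumptions): detection of the tap becomes more likely as the tap-to-vibration interval grows, and the "perception time" is read off as the interval at which detection reaches criterion.

```python
import math

def detection_prob(soa_ms, midpoint_ms=60.0, slope=0.1):
    """Toy psychometric function: probability of detecting the tap as a
    function of the tap-to-vibration interval (SOA), in milliseconds.
    The midpoint and slope values are invented for illustration."""
    return 1.0 / (1.0 + math.exp(-slope * (soa_ms - midpoint_ms)))

def estimate_perception_time(p_target=0.75, lo=0.0, hi=500.0, tol=0.01):
    """Bisect for the shortest SOA at which detection reaches p_target.
    Works because detection_prob increases monotonically with SOA."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if detection_prob(mid) < p_target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

On this model, a participant whose tap percept forms faster corresponds to a curve shifted left (a smaller midpoint), so the estimated interval shrinks, which is the pattern the study reports for congenitally blind readers.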

The findings suggest that early onset blindness leads to faster perception of touch. However, whether that advantage is due to the brain adapting to the absence of vision — a change called plasticity — or to a lifetime of practicing Braille is still unclear.

Richard Held, PhD, of Massachusetts Institute of Technology, an expert in the brain and visual development who was unaffiliated with the study, said the results suggest that a lack of visual experience changes how information acquired by touch is processed.

"The heightened skill of tactile integration seems to account for the remarkable speed of Braille-reading demonstrated by some congenitally blind individuals," Held said. "This work constitutes a solid step forward in our understanding of the interaction between senses."

Provided by Society for Neuroscience

Sunday, October 24, 2010

Tracking neuronal activity in the living brain

October 22nd, 2010 in Medicine & Health / Neuroscience

Refinements to a fluorescent calcium ion indicator give scientists a powerful tool for tracking neuronal activity in the living brain

As electrical signals travel along chains of neurons, each cell undergoes a dramatic shift in its internal calcium ion (Ca2+) concentration because specialized channels allow ions to flood into the cytoplasm. This shift provides a valuable indicator for tracking neural activity in real time, so scientists have developed several fluorescent protein-based Ca2+ indicators that are genetically encoded and can therefore be expressed directly in cells of interest.

Generally these indicators do not perform as well in live animals as in vitro. Takeharu Nagai of Hokkaido University and Katsuhiko Mikoshiba of the RIKEN Brain Science Institute in Wako suspected that indicators with higher affinity for Ca2+ might work better. However, their approach was risky. “It was generally believed that extremely high-affinity Ca2+ indicators would result in low cell viability due to disturbed Ca2+ homeostasis, and show no signal changes due to saturation by resting Ca2+,” say Nagai and Mikoshiba. “From this point of view, our attempt was totally against common sense.”

Nevertheless, the indicators, dubbed YC-Nano, developed by Nagai and his colleagues proved to be a remarkable success. The indicators were derived from yellow cameleon (YC), a genetically encoded indicator consisting of two fluorescent proteins, a ‘donor’ and an ‘acceptor’, connected by a Ca2+-binding domain. In the presence of Ca2+, the structure of YC rearranges such that the two come close together in a manner that allows energy from the excited donor to induce a readily detectable signal from the acceptor; in the absence of Ca2+, only a minimal signal is produced.

The researchers introduced various modifications that lengthened the Ca2+-binding segment between the two fluorescent domains, introducing additional flexibility that considerably improved indicator sensitivity. The best-performing versions exhibited five-fold greater Ca2+ affinity than YC and a high dynamic range. “We were quite surprised that we managed to systematically produce a series of indicator variants with different affinity by a very simple protein engineering trick,” says Nagai.
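
The affinity question can be made concrete with a standard Hill-equation model of a FRET ratio. The numbers below are invented for illustration (the Kd values, Hill coefficient, and ratio limits are not YC-Nano's measured parameters); the point is only that lowering the dissociation constant Kd shifts the response curve so the indicator reports much smaller Ca2+ concentrations.

```python
def fret_ratio(ca_nM, kd_nM, n=2.0, r_min=1.0, r_max=5.0):
    """Hill-equation model of a FRET acceptor/donor emission ratio as a
    function of free Ca2+ (nM). All parameter values are illustrative,
    not measured YC-Nano constants."""
    bound = ca_nM ** n / (kd_nM ** n + ca_nM ** n)  # fraction of indicator bound to Ca2+
    return r_min + (r_max - r_min) * bound

# At a resting-level Ca2+ of ~50 nM, a high-affinity variant (Kd = 50 nM)
# sits at the midpoint of its range, while a low-affinity variant
# (Kd = 250 nM) has barely begun to respond.
high = fret_ratio(50.0, kd_nM=50.0)
low = fret_ratio(50.0, kd_nM=250.0)
```

The same model also illustrates the saturation worry quoted above: an indicator with a very low Kd spends much of its dynamic range near saturation at resting Ca2+, which is why extremely high affinity was expected to leave little headroom for signal changes.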

YC-Nano accurately tracked the complex patterns of Ca2+ activation during the aggregation of the social amoeba Dictyostelium, revealing waves propagating through the aggregates in a rotating spiral. The indicators also performed well in monitoring neuronal activity in the brains of mice, and Mikoshiba foresees numerous experimental applications in the near future. “Since YC-Nano can be stably expressed in specific types of neurons for a long range of time,” he says, “we expect to perform chronic in vivo imaging and analyze the modifications of neuronal network activities underlying learning, development or diseases of the brain.”

More information: Horikawa, K., et al. Spontaneous network activity visualized by ultrasensitive Ca2+ indicators, yellow Cameleon-Nano. Nature Methods 7, 729–732 (2010).

Thursday, October 21, 2010

Introductory Seminar on Biosubjectivity, taught by Luc Delannoy (Belgium)

November 26, 2010

Dates: Fridays and Saturdays, November 26 to December 11, 2010

Venue: Centro Mexicano para la Música y las Artes (CMMAS)

Luc Delannoy (Brussels, Belgium). A philosopher and writer influenced by the hermeneutics of Hans-Georg Gadamer, Paul Ricoeur, and Jacques Derrida, as well as by the skeptics.
He has had a long career as a visiting professor at universities across the Americas. After a visiting professorship at the Universidad Nacional Autónoma de México, he founded in Mexico the Centro de Investigaciones en Neuroestética y Neuromusicología (CINNe) and the Instituto de Neuroartes.

He developed the academic program known as Neuroartes, based on the relationships between the neurosciences (neuroaesthetics and neuromusicology), the arts, and health.

Known chiefly for his studies of jazz, Delannoy is one of the few philosophers investigating musical consciousness. Since the beginning of 2009 he has directed the Seminario Internacional de Biosubjetividades.

Introduction to Biosubjectivity

I.- Introduction:

Biosubjectivity: the result of the intimate relationship between biology, philosophy, and psychology. Moving from the biopolitical reality of the body and of the population as a disciplinary technology that regulates life (Foucault) toward the government of oneself.

How can an aesthetics of existence be achieved through the matter of our own body and the matter of the other's body? The corporeal invention of the subject: its incarnation. Biosubjectivity implies the incarnation of the subject (the biosubject) in its matter: a conscious corporeal mutation. It also implies describing phenomenologically the lived experience of the new body.

II.- General objective:

To reflect on the biosubjective definition/constitution of the subject within the context of a philosophy of the body, and to establish that philosophy.

Purpose: to study the thinking, interactive body in its relations with the brain and the mind through a non-reductionist materialism and a biological phenomenology, with a view to practical applications in our various professions.

III.- Contexts:

Biohumanity: new relations between biology, the humanities, and society.

Use of somatechnics and biotechnologies.

A critical and constructive view of science.

IV.- Topics:

Theories of human perception.

o Indirect realism vs. direct realism.

o New critical realism.

o Intersubjectivity in perceived space.

o Mirror neurons and empathy.

o Subjectivity: the experience of musical listening.

Spatiality. The human body constructs space.

* Types of space: Space is not only cerebral, that is, created by the brain; it is also physically external. Our relationship with space passes through the body.

o Physical space. Internal / external.

o Perceived space. The body is spatial: a subject that perceives and is perceived.

o Lived space. Subjective space/body.

Space of the human body.

o Body schema.

o Interoceptive / proprioceptive. Spatiality of position, spatiality of situation.

o Movements as an activity of space.

o Body schema and illness.

o Paralysis. The body in action.

The extended human body and sensory mutations. The plasticities of living systems. The bioplastic body:

o Self-transformation into a subject.

o Technologies of the self.

o The challenges of plasticity: identity / mutability.

o Plasticity and alterity (art - music).

o Internal space: the biotechnological subject.

o Chemical mutations and non-invasive mutations (Neuroartes).

o Bioart. Living systems. Semi-living entities / partially living objects.

Practical applications in our professions.

Total duration: 30 hours

Friday, November 26, 4:00–8:00 p.m.

Saturday, November 27, 10:00 a.m.–2:00 p.m. and 4:00–6:00 p.m.

Friday, December 3, 4:00–8:00 p.m.

Saturday, December 4, 10:00 a.m.–2:00 p.m. and 4:00–6:00 p.m.

Friday, December 10, 4:00–8:00 p.m.

Saturday, December 11, 10:00 a.m.–2:00 p.m. and 4:00–6:00 p.m.


* Registration: $3,000

* CD with bibliographic materials: $200

* CMMAS offers up to 2 scholarships of $2,000 pesos (total course fee) for those who apply with a letter of motivation before Friday, October 21.

* CMMAS offers up to 2 scholarships of $1,500 pesos (total course fee) for those who apply with a letter of motivation before Friday, October 21.

* Payments can be made via PayPal, by credit card, by deposit to a Banamex account, or at the CMMAS offices (to register, call Silvana Casal at 01 443-3175679 or send an e-mail to

* Registration deadline: Wednesday, November 3, 2010.

* Each participant is responsible for their own travel, lodging, and meal expenses in Morelia.

* Certificates will be issued to participants with at least 90% attendance.

Centro Mexicano para la Música y las Artes (CMMAS)

Casa de la Cultura, upper floor

Morelos Norte No. 485

C.P. 58000

Morelia, Mich.

Telephone/fax: 01 (443) 317-56-79

More information:

Younger brains are easier to rewire

October 21st, 2010 in Medicine & Health / Neuroscience

About a decade ago, scientists studying the brains of blind people made a surprising discovery: A brain region normally devoted to processing images had been rewired to interpret tactile information, such as input from the fingertips as they trace Braille letters. Subsequent experiments revealed a similar phenomenon in other brain regions. However, these studies didn’t answer the question of whether the brain can rewire itself at any time, or only very early in life.

A new paper from MIT neuroscientists, in collaboration with Alvaro Pascual-Leone at Beth Israel Deaconess Medical Center, offers evidence that it is easier to rewire the brain early in life. The researchers found that a small part of the brain’s visual cortex that processes motion became reorganized only in the brains of subjects who had been born blind, not those who became blind later in life.

The new findings, described in the Oct. 14 issue of the journal Current Biology, shed light on how the brain wires itself during the first few years of life, and could help scientists understand how to optimize the brain’s ability to be rewired later in life. That could become increasingly important as medical advances make it possible for congenitally blind people to have their sight restored, said MIT postdoctoral associate Marina Bedny, lead author of the paper.

Brain rewiring

In the 1950s and ’60s, scientists began to think that certain brain functions develop normally only if an individual is exposed to relevant information, such as language or visual information, within a specific time period early in life. After that, they theorized, the brain loses the ability to change in response to new input.

Animal studies supported this theory. For example, cats blindfolded during the first months of life are unable to see normally after the blindfolds are removed. Similar periods of blindfolding in adulthood have no effect on vision.

However, there have been indications in recent years that there is more wiggle room than previously thought, said Bedny, who works in the laboratory of MIT assistant professor Rebecca Saxe, also an author of the Current Biology paper. Many neuroscientists now support the idea of a period early in life after which it is difficult, but not impossible, to rewire the brain.

Bedny, Saxe and their colleagues wanted to determine if a part of the brain known as the middle temporal complex (MT/MST) can be rewired at any time or only early in life. They chose to study MT/MST in part because it is one of the most studied visual areas. In sighted people, the MT region is specialized for motion vision.

In the rare cases where patients have lost MT function in both hemispheres of the brain, they have been unable to sense motion in a visual scene. For example, watching someone pour water into a glass, such a patient would see only a standing, frozen stream of water.

Previous studies have shown that in blind people, MT is taken over by sound processing, but those studies didn’t distinguish between people who became blind early and late in life.

Early versus late

In the new MIT study, the researchers studied three groups of subjects — sighted, congenitally blind, and those who became blind later in life (age nine or older). Using functional magnetic resonance imaging (fMRI), they tested whether MT in these subjects responded to moving sounds — for example, approaching footsteps.

The results were clear, said Bedny. MT reacted to moving sounds in congenitally blind people, but not in sighted people or people who became blind at a later age.

This suggests that in late-blind individuals, the visual input they received in early years allowed the MT complex to develop its typical visual function, and it couldn’t be remade to process sound after the person lost sight. Congenitally blind people never received any visual input, so the region was taken over by auditory input after birth.

“We need to think of early life as a window of opportunity to shape how the brain works,” said Bedny. “That’s not to say that later experience can’t alter things, but it’s easier to get organized early on.”

Another important aspect of the work is the finding that in the congenitally blind, there is enhanced communication between the MT complex and the brain’s prefrontal cortex, said Ione Fine, associate professor of psychology at the University of Washington. That enhanced connection could help explain how the brain remodels the MT region to process auditory information. Previous studies have looked for enlarged nerve bundles, with no success. “People have been looking for bigger roads, but what she’s seeing is more traffic on the same-size road,” said Fine, who was not involved in the study.

Although this work supports the idea that brain regions can switch functions early in a person’s development, Bedny believes that by better understanding how the brain is wired during this period, scientists may be able to learn how to rewire it later in life. There are now very few cases of sight restoration, but if it becomes more common, scientists will need to figure out how to retrain the patient’s brain so it can process the new visual input.

“The unresolved question is whether the brain can relearn, and how that learning differs in an adult brain versus a child’s brain,” said Bedny.

Bedny hopes to study the behavioral consequences of the MT switch in future studies. Those would include whether blind people have an advantage over sighted people in auditory motion processing, and if they have a disadvantage if sight is restored.

More information: "Sensitive period for a multi-modal response in human MT/MST" by Bedny, M., Konkle, T., Pelphrey, K., Saxe, R., and Pascual-Leone, A. Current Biology, October 14, 2010.


Tuesday, October 19, 2010

See no shape, touch no shape, hear a shape?

October 18th, 2010 in Medicine & Health / Neuroscience

Scientists at The Montreal Neurological Institute and Hospital – The Neuro, McGill University have discovered that our brains can determine the shape of an object simply by processing specially coded sounds, without any visual or tactile input. Not only does this new research tell us about the plasticity of the brain and how it perceives the world around us, it also opens important new possibilities for aiding people who are blind or visually impaired.

Shape is an inherent property of objects existing in both vision and touch but not sound. Researchers at The Neuro posed the question ‘can shape be represented by sound artificially?’ “The fact that a property of sound such as frequency can be used to convey shape information suggests that as long as the spatial relation is coded in a systematic way, shape can be preserved and made accessible - even if the medium via which space is coded is not spatial in its physical nature,” says Jung-Kyong Kim, PhD student in Dr. Robert Zatorre’s lab at The Neuro and lead investigator in the study.
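The coding scheme Kim describes can be pictured as scanning a shape column by column and voicing each filled cell at a pitch tied to its height, so spatial relations are preserved in a non-spatial medium. The sketch below is a simplified, hypothetical encoding in that spirit; the function name, grid format, and frequency values are illustrative assumptions, not the actual mapping used in the study.

```python
def shape_to_sound(grid, base_hz=220.0, step_hz=110.0):
    """Encode a binary 2-D shape as sound: each column becomes a time
    slice, and each filled row in that column contributes one frequency
    (higher rows -> higher pitch). A toy scheme in the spirit of the
    study, not the encoding actually used at The Neuro."""
    timeline = []
    n_rows = len(grid)
    for col in range(len(grid[0])):
        # Collect one frequency per filled cell in this column.
        freqs = [base_hz + step_hz * (n_rows - 1 - row)
                 for row in range(n_rows) if grid[row][col]]
        timeline.append(freqs)
    return timeline

# An "L" shape: a vertical stroke plus a horizontal foot.
L_shape = [
    [1, 0, 0],
    [1, 0, 0],
    [1, 1, 1],
]
sound = shape_to_sound(L_shape)
```

Here the first time slice carries three simultaneous pitches (the full vertical stroke), while the later slices carry only the low pitch of the foot, so the temporal pattern of frequencies systematically encodes the shape.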

In other words, much as dolphins use echolocation to explore their surroundings, our brains can be trained to recognize shapes represented by sound, and the hope is that people with impaired vision could be trained to use this as a tool. In the study, blindfolded sighted participants were trained to recognize tactile spatial information using sounds mapped from abstract shapes. Following training, the individuals were able to match auditory input to tactually discerned shapes and showed generalization to new auditory–tactile pairings.

“We live in a world where we perceive objects using information available from multiple sensory inputs,” says Dr. Zatorre, neuroscientist at The Neuro and co-director of the International Laboratory for Brain Music and Sound Research. “On one hand, this organization leads to unique sense-specific percepts, such as colour in vision or pitch in hearing. On the other hand our perceptual system can integrate information present across different senses and generate a unified representation of an object. We can perceive a multisensory object as a single entity because we can detect equivalent attributes or patterns across different senses.” Neuroimaging studies have identified brain areas that integrate information coming from different senses – combining input from across the senses to create a complete and comprehensive picture.

The results from The Neuro study strengthen the hypothesis that our perception of a coherent object or event ultimately occurs at an abstract level beyond the sensory input modes in which it is presented. This research provides important new insight into how our brains process the world as well as new possibilities for those with impaired senses.

The study was published in the journal Experimental Brain Research.

Provided by McGill University

Monday, October 18, 2010

Smelling the light: 'What if we make the nose act like a retina?'

October 17th, 2010 in Medicine & Health / Neuroscience

Harvard University neurobiologists have created mice that can "smell" light, providing a potent new tool that could help researchers better understand the neural basis of olfaction.

The work, described this week in the journal Nature Neuroscience, has implications for the future study of smell and of complex perception systems that do not lend themselves to easy study with traditional methods.

"It makes intuitive sense to use odors to study smell," says Venkatesh N. Murthy, professor of molecular and cellular biology at Harvard. "However, odors are so chemically complex that it is extremely difficult to isolate the neural circuits underlying smell that way."

Murthy and his colleagues at Harvard and Cold Spring Harbor Laboratory used light instead, applying the nascent field of optogenetics to the question of how cells in the brain differentiate between odors.

Optogenetic techniques integrate light-reactive proteins into systems that usually sense inputs other than light. Murthy and his colleagues integrated these proteins, called channelrhodopsins, into the olfactory systems of mice, creating animals in which smell pathways were activated not by odors, but rather by light.

"In order to tease apart how the brain perceives differences in odors, it seemed most reasonable to look at the patterns of activation in the brain," Murthy says. "But it is hard to trace these patterns using olfactory stimuli, since odors are very diverse and often quite subtle. So we asked: What if we make the nose act like a retina?"

With the optogenetically engineered animal, the scientists were able to characterize the patterns of activation in the olfactory bulb, the brain region that receives information directly from the nose. Because light input can easily be controlled, they were able to design a series of experiments stimulating specific sensory neurons in the nose and looking at the patterns of activation downstream in the olfactory bulb.

"The first question was how the processing is organized, and how similar inputs are processed by adjacent cells in the brain," Murthy says.

But it turns out that the spatial organization of olfactory information in the brain does not fully explain our ability to sense odors. The temporal organization of olfactory information sheds additional light on how we perceive odors. In addition to characterizing the spatial organization of the olfactory bulb, the new study shows how the timing of the "sniff" plays a large part in how odors are perceived. The paper has implications not only for future study of the olfactory system, but more generally for teasing out the underlying neural circuits of other systems.

Friday, October 15, 2010

Love takes up where pain leaves off, brain study shows

October 13th, 2010 in Medicine & Health / Neuroscience

Love-induced pain relief was associated with the activation of primitive brain structures that control rewarding experiences, such as the nucleus accumbens - shown here in color. Courtesy of Sean Mackey and Jarred Younger

Intense, passionate feelings of love can provide amazingly effective pain relief, similar to painkillers or such illicit drugs as cocaine, according to a new Stanford University School of Medicine study.

"When people are in this passionate, all-consuming phase of love, there are significant alterations in their mood that are impacting their experience of pain," said Sean Mackey, MD, PhD, chief of the Division of Pain Management, associate professor of anesthesia and senior author of the study, which will be published online Oct. 13 in PLoS ONE. "We're beginning to tease apart some of these reward systems in the brain and how they influence pain. These are very deep, old systems in our brain that involve dopamine — a primary neurotransmitter that influences mood, reward and motivation."

Scientists aren't yet ready to tell patients with chronic pain to throw out their painkillers and replace them with a passionate love affair; rather, the hope is that a better understanding of these neural reward pathways triggered by love could lead to new methods of producing pain relief.

"It turns out that the areas of the brain activated by intense love are the same areas that drugs use to reduce pain," said Arthur Aron, PhD, a professor of psychology at State University of New York at Stony Brook and one of the study's authors. Aron has been studying love for 30 years. "When thinking about your beloved, there is intense activation in the reward area of the brain — the same area that lights up when you take cocaine, the same area that lights up when you win a lot of money."

The concept for the study was sparked several years ago at a neuroscience conference when Aron, an expert in the study of love, met up with Mackey, an expert in the research of pain, and they began talking.

"Art was talking about love," Mackey said. "I was talking about pain. He was talking about the brain systems involved with love. I was talking about the brain systems involved with pain. We realized there was this tremendous overlapping system. We started wondering, 'Is it possible that the two modulate each other?'"

After the conference, Mackey returned to Stanford and collaborated with postdoctoral scholar Jarred Younger, PhD, now an assistant professor of anesthesia, who was also intrigued with the idea. Together the three set up a study that would entail examining the brain images of undergraduates who claimed to be "in that first phase of intense love."

"We posted fliers around Stanford University and within hours we had undergrads banging on our door," Mackey said. The fliers asked for couples who were in the first nine months of a romantic relationship.

"It was clearly the easiest study the pain center at Stanford has ever recruited for," Mackey said. "When you're in love you want to tell everybody about it.

"We intentionally focused on this early phase of passionate love," he added. "We specifically were not looking for longer-lasting, more mature phases of the relationship. We wanted subjects who were feeling euphoric, energetic, obsessively thinking about their beloved, craving their presence.

"When passionate love is described like this, it in some ways sounds like an addiction. We thought, 'Maybe this does involve similar brain systems as those involved in addictions which are heavily dopamine-related.' Dopamine is the neurotransmitter in our brain that is intimately involved with feeling good."

Researchers recruited 15 undergraduates (eight women and seven men) for the study. Each was asked to bring in photos of their beloved and photos of an equally attractive acquaintance. The researchers then successively flashed the pictures before the subjects, while heating up a computer-controlled thermal stimulator placed in the palm of their hand to cause mild pain. At the same time, their brains were scanned in a functional magnetic resonance imaging machine.

The undergraduates were also tested for levels of pain relief while being distracted with word-association tasks such as: "Think of sports that don't involve balls." Scientific evidence has shown in the past that distraction causes pain relief, and researchers wanted to make sure that love was not just working as a distraction from pain.

Results showed that love and distraction reduced pain equally, and both far more than concentrating on the photo of the attractive acquaintance did; interestingly, though, the two methods of pain reduction used very different brain pathways.

"With the distraction test, the brain pathways leading to pain relief were mostly cognitive," Younger said. "The reduction of pain was associated with higher, cortical parts of the brain. Love-induced analgesia is much more associated with the reward centers. It appears to involve more primitive aspects of the brain, activating deep structures that may block pain at a spinal level — similar to how opioid analgesics work.

"One of the key sites for love-induced analgesia is the nucleus accumbens, a key reward addiction center for opioids, cocaine and other drugs of abuse. The region tells the brain that you really need to keep doing this," Younger said.

"This tells us that you don't have to just rely on drugs for pain relief," Aron said. "People are feeling intense rewards without the side effects of drugs."

Provided by Stanford University Medical Center

Tuesday, October 12, 2010

Making decisions is the third way we learn, research shows

October 11th, 2010 in Medicine & Health / Psychology & Psychiatry

W. P. Carey School of Business associate professor Pierre Balthazard's brain research helps show a third way we learn and make decisions.
Experts have long believed there are two main ways our brains work: cognition, which is thinking or processing information, and affect, which is feeling or emotion. However, researchers have now made a breakthrough regarding a third faculty of the brain: conation.

“When people make ‘gut’ decisions or choices based on instinct, that’s really conation,” says Pierre Balthazard, associate professor in the W. P. Carey School of Business at Arizona State University, who worked on the new research. “Conation has always taken a back seat to the other two faculties of the brain, but we were able to discover some key things about how it works.”

Balthazard recently analyzed the brains of more than 100 healthy people and found evidence they were all operating in the area of conation. Moreover, the findings indicate that people can be trained to compensate for strengths and weaknesses in conation, so their brains keep functioning efficiently, even in stressful situations.

Balthazard already was well-known for doing research in the area of how to map the brain for leadership qualities, using advanced techniques to analyze brain signals. His research is funded by the U.S. Department of Defense. In this case, he also worked with a world-renowned expert in the field of conation, Kathy Kolbe of the Phoenix-based Kolbe Corp., who has been assessing behaviors related to conation for 30 years. She created the basis for the new study, using data from a half-million people who completed her widely used Kolbe A Index.

“My theory was that conation is the one human factor that’s equal among all people; we all start with instinct, but it’s how we use it that gives us our unique character,” she said. “You can manage your response to a situation, but ultimately, you do that based on various strengths hard-wired into your brain. That’s exactly what our research found.”

Balthazard tested Kolbe’s theory by having subjects perform simple tasks. For example, he got a group of mostly high-level executives together at a table with several common objects on it, such as pencils and paper clips. He asked the participants to take one minute to rank the objects in order of importance. People’s strengths and weaknesses in the area of conation determined whether they easily performed the task or whether they found it very daunting and stressful. Balthazard could tell from brain-mapping the subjects beforehand exactly which ones would react in each way.

“We can demonstrate conative stress naturally occurring in business environments, too,” Balthazard said. “What’s important is to be able to identify people’s strengths and weaknesses in this area to help them compensate for various situations, so they aren’t wasting brain power and can keep functioning in an optimal way.”

Provided by Arizona State University

Monday, October 11, 2010

Research discovers how the deaf have super vision

October 10th, 2010 in Medicine & Health / Neuroscience

Deaf or blind people often report enhanced abilities in their remaining senses, but up until now, no one has explained how and why that could be. Researchers at The University of Western Ontario, led by Stephen Lomber of The Centre for Brain and Mind have discovered there is a causal link between enhanced visual abilities and reorganization of the part of the brain that usually handles auditory input in congenitally deaf cats. The findings, published online in Nature Neuroscience, provide insight into the plasticity that may occur in the brains of deaf people.

Cats are the only animal besides humans that can be born deaf. Using congenitally deaf cats and hearing cats, Lomber and his team showed that only two specific visual abilities are enhanced in the deaf: visual localization in the peripheral field and visual motion detection. They found that the part of the auditory cortex that would normally pick up peripheral sound instead enhanced peripheral vision, leading the researchers to conclude that the function stays the same but the modality switches from auditory to visual.

"The brain is very efficient, and doesn't let unused space go to waste," says Lomber, an associate professor in the Department of Physiology and Pharmacology at the Schulich School of Medicine & Dentistry, and Department of Psychology in the Faculty of Social Science. "The brain wants to compensate for the lost sense with enhancements that are beneficial. For example, if you're deaf, you would benefit by seeing a car coming far off in your peripheral vision, because you can't hear that car approaching from the side; the same with being able to more accurately detect how fast something is moving."

Lomber and his team are trying to discover how a deaf brain differs from a hearing brain to better understand how the brain handles cochlear implants. If the brain has rewired itself to compensate for the loss of hearing, what happens when hearing is restored? "The analogy I use is, if you weren't using your cottage and lent it to a friend. That friend gets comfortable, maybe rearranges the furniture, and settles in. They may not want to leave just because you've come back," explains Lomber.

He also plans to conduct research to see if these changes in the brain also happen to those who could hear at one time, or if auditory experience prevents the changes from occurring.

Provided by University of Western Ontario

Saturday, October 9, 2010

Neurons cast votes to guide decision-making

Neurons cast votes to guide decision-making
October 8th, 2010 in Medicine & Health / Research

We know that casting a ballot in the voting booth involves politics, values and personalities. But before you ever push the button for your candidate, your brain has already carried out an election of its own to make that action possible. New research from Vanderbilt University reveals that our brain accumulates evidence when faced with a choice and triggers an action once that evidence reaches a tipping point.

The research was published in the October issue of Psychological Review.

"Psychological models of decision-making explain that humans gradually accumulate evidence for a particular choice over time, and execute that choice when evidence reaches a critical level. However, until recently there was little understanding of how this might actually be implemented in the brain," Braden Purcell, a doctoral student in the Department of Psychology and lead author of the new study, said. "We found that certain neurons seem to represent the accumulation of evidence to a threshold and others represent the evidence itself, and that these two types of neurons interact to drive decision-making."

The researchers presented monkeys with a simple visual task of finding a target on a screen that also included distracting items. The researchers found that neurons processing visual information from the screen fed that information to the neurons responsible for movement. These movement neurons served as gatekeepers, suppressing action until the information they received from the visual neurons was sufficiently clear. When that occurred, the movement neurons then proceeded to trigger the chosen movement.
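The gatekeeping described above resembles what modelers call a gated accumulator. The Python sketch below is an illustrative toy version, not the authors' actual model: two "movement" units integrate gated, noisy sensory evidence, inhibit each other, and trigger a choice once one crosses a threshold. All parameter values are made up for illustration.

```python
import random

def gated_accumulator(evidence_a, evidence_b, threshold=10.0, gate=0.5,
                      inhibition=0.3, noise=0.2, max_steps=10_000):
    """Toy gated-accumulator model of a two-alternative decision.
    Returns the chosen option and the number of steps taken."""
    acc_a = acc_b = 0.0
    for step in range(1, max_steps + 1):
        # Sensory evidence only passes the gate once it is strong enough,
        # so weak or ambiguous input does not drive the movement units.
        in_a = max(evidence_a + random.gauss(0.0, noise) - gate, 0.0)
        in_b = max(evidence_b + random.gauss(0.0, noise) - gate, 0.0)
        # Mutual inhibition: evidence for one option suppresses the other,
        # exaggerating the difference between the two "candidates".
        new_a = max(acc_a + in_a - inhibition * acc_b, 0.0)
        new_b = max(acc_b + in_b - inhibition * acc_a, 0.0)
        acc_a, acc_b = new_a, new_b
        # The movement is triggered only when evidence reaches threshold.
        if acc_a >= threshold or acc_b >= threshold:
            return ("A" if acc_a >= acc_b else "B"), step
    return None, max_steps  # no decision reached

# With clearly stronger evidence for A, the model chooses A; stronger
# evidence also yields a faster decision (shorter "reaction time").
choice, rt = gated_accumulator(evidence_a=1.0, evidence_b=0.6, noise=0.0)
```

The inhibition term implements the vote-suppression idea quoted below the gate, while the threshold plays the role of the tipping point at which the election is called.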

The researchers also found that the movement neurons mediated a competition between what was being seen—in this case, the target and distracting items—and ensured that the decision was made to look to the proper item.

"What the brain seems to do is for every vote it receives for one candidate, it suppresses a vote for the other candidate, exaggerating the differences between the two," Jeffrey Schall, E. Bronson Ingram Chair in Neuroscience and co-author of the study said. "The system that makes the response doesn't listen to the vote tally until it's clear that the election is going towards one particular candidate. At that point, the circuitry that makes the movement is triggered and the movement takes place."

The findings offer potential insights into some psychological disorders.

"Impairments in decision-making are at the core of a variety of psychological and neurological impairments. For example, previous work suggests that ADHD patients may suffer deficits in controlling evidence accumulation," Purcell said. "This work may help us to understand why these deficits occur at a neurobiological level."

An important piece of this research is the novel model the researchers used in the study. The new model combined a mathematical prediction of what they thought would transpire with actual data about what the neurons were doing.

"In a model, usually all the elements are defined by mathematical equations or computational expressions," Thomas Palmeri, associate professor of psychology and a co-author of the study, said. "In our work, rather than coming up with a mathematical expression for the inputs to the neural decision process, we defined those inputs with actual recordings from neurons. This hybrid model predicts both where and when the eyes move, and variability in the timing of those movements."

"This approach provides insight between psychological processes and what neurons are doing," Schall said. "If we want to understand the mind-brain problem, this is what solutions look like."

Provided by Vanderbilt University

Friday, October 8, 2010

When Music and Long-Term Memory Interact

When Music and Long-Term Memory Interact: Effects of Musical Expertise on Functional and Structural Plasticity in the Hippocampus

Mathilde Groussard 1, Renaud La Joie 1, Géraldine Rauchs 1, Brigitte Landeau 1, Gaël Chételat 1, Fausto Viader 1,2, Béatrice Desgranges 1, Francis Eustache 1, Hervé Platel 1

1 Inserm-EPHE-Université de Caen/Basse-Normandie, Unité U923, GIP Cyceron, CHU Côte de Nacre, Caen, France, 2 Département de Neurologie, CHU Côte de Nacre, Caen, France

The development of musical skills by musicians results in specific structural and functional modifications in the brain. Surprisingly, no functional magnetic resonance imaging (fMRI) study has investigated the impact of musical training on brain function during long-term memory retrieval, a faculty particularly important in music. Thus, using fMRI, we examined this process for the first time during a musical familiarity task (i.e., semantic memory for music). Musical expertise induced supplementary activations in the hippocampus, medial frontal gyrus, and superior temporal areas on both sides, suggesting a constant interaction between episodic and semantic memory during this task in musicians. In addition, a voxel-based morphometry (VBM) investigation was performed within these areas and revealed that gray matter density of the hippocampus was higher in musicians than in nonmusicians. Our data indicate that musical expertise critically modifies long-term memory processes and induces structural and functional plasticity in the hippocampus.

Source: PLoS One [Open Access]

Tuesday, October 5, 2010

Canadian helps severely disabled speak through music

October 4th, 2010 in Medicine & Health / Health

Children immured within their severely disabled bodies may soon be able to communicate thanks to a newly unveiled device that translates physiological signals into music.

As part of her doctoral studies at Holland Bloorview in Toronto, Canada's largest children's rehabilitation hospital, Stefanie Blain spent five years studying the interactions between children with severe disabilities and their parents.

"They can 'read' their children by observing minuscule movements of their lips, or changes in their breathing," she told a Technology, Entertainment, Design (TEDx) conference in Montreal.

By measuring those tiny physiological signals, Blain was able to show that Max, a 15-year-old who was seemingly in a vegetative state, became animated when he spotted his favorite toy.

Even completely paralyzed, the body continues to react by changes in body temperature and sweat levels, as well as heart and breathing rates, she explained.

Blain initially created a graphic translation of these physiological arousals.

"But my curves and 3D simulations didn't speak to anyone," she said with a smile. So, being a musician, she wrote algorithms to convert them into sounds.

"Another child, who we believed was always sleeping, started to emit a 'biological song' whenever clowns entered his room. It was the first time that his parents and hospital staff realized that he was conscious of the world around him," she said.

The software she created deciphers physiological signals and translates them into a range of tonalities, from soft low-pitched sounds when an individual is calm to high-pitched and more complicated tunes when they are thinking of pleasant things.
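The calm-to-low, aroused-to-high mapping can be sketched as a simple pitch function. The snippet below is a hypothetical illustration, not Blain's actual algorithm; the frequency range and the logarithmic interpolation are assumptions.

```python
def arousal_to_pitch(arousal, low_hz=110.0, high_hz=880.0):
    """Map a normalized arousal level (0 = calm, 1 = highly aroused)
    onto a frequency: calm states give soft low tones, aroused states
    give higher pitches. Values are illustrative, not Blain's."""
    arousal = min(max(arousal, 0.0), 1.0)  # clamp to [0, 1]
    # Interpolate on a logarithmic scale so equal arousal steps
    # sound like equal musical intervals.
    return low_hz * (high_hz / low_hz) ** arousal

# A rising arousal trace yields a rising "biological song":
trace = [0.0, 0.25, 0.5, 1.0]
pitches = [arousal_to_pitch(a) for a in trace]
```

In a real system, a trace like this would be streamed from sensors for temperature, sweat, heart rate, and breathing, with pitch updated continuously as the signals change.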

"Each 'song' is unique," she said.

"Imagine that when I arrived at the hospital, the hallways leading to the rooms of children who could neither move, nor speak, nor even make facial expressions were quiet," she said.

"Imagine these hallways now (full of music), imagine parents who can really get to know their children."

Currently, the research team led by Tom Chau is seeking to expand the technology to make it possible for severely disabled children to answer yes or no when prompted, and to use a computer.

(c) 2010 AFP