Pulitzer Center grantee Robin Hammond traveled to Liberia and Sierra Leone to investigate the psychological scars left by the brutal wars that plagued both nations. Former child soldiers who suffer from post-traumatic stress disorder receive no treatment, medical or psychological.
In his “Meet the Journalist” video, Hammond discusses his reporting and the challenge of illustrating the idea that signing a peace agreement does not necessarily bring a full end to suffering.
To learn more, check out Hammond’s full Pulitzer-supported project: “Condemned: Mental Health in Liberia and Sierra Leone”.
A single dose of the hormone oxytocin, delivered via nasal spray, has been shown to enhance brain activity while processing social information in children with autism spectrum disorders, Yale School of Medicine researchers report in a new study published in the Dec. 2 issue of Proceedings of the National Academy of Sciences.
“This is the first study to evaluate the impact of oxytocin on brain function in children with autism spectrum disorders,” said first author Ilanit Gordon, a Yale Child Study Center adjunct assistant professor. Her colleagues on the study included senior author Kevin Pelphrey, the Harris Professor in the Child Study Center and director of the Center for Translational Developmental Neuroscience at Yale.
Gordon, Pelphrey, and their colleagues conducted a double-blind, placebo-controlled study of 17 children and adolescents with autism spectrum disorders. The participants, between the ages of 8 and 16.5, were randomly given either oxytocin spray or a placebo nasal spray during a task involving social judgments. Oxytocin is a naturally occurring hormone produced in the brain and throughout the body.
“We found that brain centers associated with reward and emotion recognition responded more during social tasks when children received oxytocin instead of the placebo,” said Gordon. “Oxytocin temporarily normalized brain regions responsible for the social deficits seen in children with autism.”
Gordon said oxytocin facilitated social attunement, a process that makes the brain regions involved in social behavior and social cognition activate more for social stimuli (such as faces) and activate less for non-social stimuli (such as cars).
“Our results are particularly important considering the urgent need for treatments to target social dysfunction in autism spectrum disorders,” Gordon added.
Shifting the emphasis from gaze to hand, a study by Indiana University cognitive scientists provides compelling evidence for a new and possibly dominant way for social partners — in this case, 1-year-olds and their parents — to coordinate the process of joint attention, a key component of parent-child communication and early language learning.
Previous research involving joint visual attention between parents and toddlers has focused exclusively on the ability of each partner to follow the gaze of the other. In “Joint Attention Without Gaze Following: Human Infants and Their Parents Coordinate Visual Attention to Objects Through Eye-Hand Coordination,” published in the online journal PLOS ONE, the researchers demonstrate how hand-eye coordination is much more common, and the parent and toddler interact as equals, rather than one or the other taking the lead.
The findings open up new questions about language learning and the teaching of language. They could also have major implications for the treatment of children with early social-communication impairment, such as autism, where joint caregiver-child attention with respect to objects and events is a key issue.
"Currently, interventions consist of training children to look at the other’s face and gaze," said Chen Yu, associate professor in the Department of Psychological and Brain Sciences at IU Bloomington. "Now we know that typically developing children achieve joint attention with caregivers less through gaze following and more often through following the other’s hands. The daily lives of toddlers are filled with social contexts in which objects are handled, such as mealtime, toy play and getting dressed. In those contexts, it appears we need to look more at another’s hands to follow the other’s lead, not just gaze."
The new explanation solves some of the problems and inadequacies of the gaze-following theory. Gaze-following can be imprecise in the natural, cluttered environment outside the laboratory. It can be hard to tell precisely what someone is looking at when there are several objects together. It is easier and more precise to follow someone’s hands. In other situations, it may be more useful to follow the other’s gaze.
"Each of these pathways can be useful," Yu said. "A multi-pathway solution creates more options and gives us more robust solutions."
Researchers used innovative head-mounted eye-tracking technology that, like Google Glass, records the views of those wearing it, and that had never before been used with young children. While recording moment-to-moment, high-density data on what both parent and child visually attended to as they played together in the lab, the researchers also applied advanced data-mining techniques to discover fine-grained eye, head and hand movement patterns in the rich dataset they derived from the multimodal recordings. The results reported are based on 17 parent-infant pairs. However, over the course of a few years, Yu and Smith have looked at more than 100 kids, and those data confirm their results.
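To make the idea of mining synchronized gaze streams concrete, here is a minimal illustrative sketch, not the IU team's actual pipeline. It assumes each partner's eye-tracking data has already been reduced to a per-frame label of the object being fixated (`None` when no object is fixated), and counts a joint-attention episode whenever parent and child attend to the same object for a minimum run of frames:

```python
def joint_attention_episodes(parent_gaze, child_gaze, min_frames=15):
    """Find runs where both partners fixate the same object.

    parent_gaze, child_gaze: per-frame object labels (None = no object).
    min_frames: minimum run length to count as an episode
                (e.g. 15 frames ~ 0.5 s at 30 fps).
    Returns a list of (start_frame, end_frame, object) tuples.
    """
    episodes = []
    start = None
    for i, (p, c) in enumerate(zip(parent_gaze, child_gaze)):
        if p is not None and p == c:
            if start is None:
                start = i  # a shared-attention run begins
        else:
            if start is not None and i - start >= min_frames:
                episodes.append((start, i, parent_gaze[start]))
            start = None
    # close out a run that extends to the end of the recording
    if start is not None and len(parent_gaze) - start >= min_frames:
        episodes.append((start, len(parent_gaze), parent_gaze[start]))
    return episodes
```

For example, two 30 fps streams in which both partners look at a toy for 20 frames, then diverge, would yield a single episode for the toy. Real analyses would of course add smoothing, calibration and hand-position streams on top of something like this.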
"This really offers a new way to understand and teach joint attention skills," said co-author Linda Smith, Distinguished Professor in the Department of Psychological and Brain Sciences. Smith is well known for her pioneering research and theoretical work in the development of human cognition, particularly as it relates to children ages 1 to 3 acquiring their first language. "We know that although young children can follow eye gaze, it is not precise, cueing attention only generally to the left or right. Hand actions are spatially precise, so hand-following might actually teach more precise gaze-following."
Kura came highly recommended by friends whose opinions I trust, and it recently topped New York magazine’s list of new omakase restaurants. I was excited to try it out, and the sushi did not disappoint! Here’s a look…
Chef Norihiro Ishizuka at his counter…
We opted for the slightly…
*me having foodgasm*
This map shows values of the Regional Competitiveness Index in 2013 for subnational areas in the EU (more specifically, EU NUTS2 regions). The index, developed under the European Commission, incorporates a broad range of indicators including infrastructure, innovation, institutions, and human capital.
A study from Karolinska Institutet in Sweden has shown that neurons in our brain ‘mirror’ the space near others, just as if it were the space near ourselves. The study, published in the scientific journal Current Biology, sheds new light on a question that has long preoccupied psychologists and neuroscientists regarding the way in which the brain represents other people and the events that happen to those people.
"We usually experience others as clearly separated from us, occupying a very different portion of space," says Claudio Brozzoli, lead author of the study at the Department of Neuroscience. "However, what this study shows is that we perceive the space around other people in the same way as we perceive the space around our own body."
The new research revealed that visual events occurring near a person’s own hand and those occurring near another’s hand are represented by the same region of the frontal lobe (premotor cortex). In other words, the brain can estimate what happens near another person’s hand because the neurons that are activated are the same as those that are active when something happens close to our own hand. It is possible that this shared representation of space could help individuals to interact more efficiently — when shaking hands, for instance. It might also help us to understand intuitively when other people are at risk of getting hurt, for example when we see a friend about to be hit by a ball.
The study consists of a series of functional magnetic resonance imaging (fMRI) experiments in which a total of forty-six healthy volunteers participated. In the first experiment, participants observed a small ball attached to a stick moving first near their own hand, and then near another person’s hand. The authors discovered a region in the premotor cortex containing groups of neurons that responded to the object only if it was close to the individual’s own hand or close to the other person’s hand. In a second experiment, the authors replicated their finding before going on to show that the result did not depend on the order in which the stimulus was presented near the two hands.
"We know from earlier studies that our brains represent the actions of other people using the same groups of neurons that represent our own actions; the so called mirror neuron system", says Henrik Ehrsson, co-author of the study. "But here we found a new class of these kinds of neuronal populations that represent space near others just as they represent space near ourselves."
According to the scientists, this study provides a new perspective that could help facilitate the understanding of behavioural and emotional interactions between people, since — from the brain’s perspective — the space between us is shared.
How do you know what you’re seeing is real? The science behind these amazing illusions may shake your faith in ‘reality’.