In The News
New research finds that the brain regions that help optimize social functioning are also important for general intelligence and emotional intelligence.
This finding suggests general intelligence emerges from the emotional and social context of one’s life.
“We are trying to understand the nature of general intelligence and to what extent our intellectual abilities are grounded in social cognitive abilities,” said Aron Barbey, a University of Illinois professor of neuroscience.
As reported in the journal Brain, studies in social psychology indicate that human intellectual functions originate from the social context of everyday life.
“We depend at an early stage of our development on social relationships — those who love us care for us when we would otherwise be helpless,” says Barbey.
“Social interdependence continues into adulthood and remains important throughout the lifespan,” Barbey said.
“Our friends and family tell us when we could make bad mistakes and sometimes rescue us when we do,” he said.
“And so the idea is that the ability to establish social relationships and to navigate the social world is not secondary to a more general cognitive capacity for intellectual function, but that it may be the other way around.
“Intelligence may originate from the central role of relationships in human life and therefore may be tied to social and emotional capacities.”
In the study, researchers examined 144 Vietnam veterans injured by shrapnel or bullets that penetrated the skull, damaging distinct brain tissues while leaving neighboring tissues intact.
Using computerized tomography (CT) scans, the scientists painstakingly mapped the affected brain regions of each participant, then pooled the data to build a collective map of the brain.
The researchers used a battery of carefully designed tests to assess participants’ intellectual, emotional, and social capabilities.
They then looked for patterns that tied damage to specific brain regions to deficits in the participants’ ability to navigate the intellectual, emotional, or social realms. Social problem solving in this analysis primarily involved conflict resolution with friends, family, and peers at work.
As in their earlier studies of general intelligence and emotional intelligence, the researchers found that regions of the frontal cortex (at the front of the brain), the parietal cortex (further back near the top of the head) and the temporal lobes (on the sides of the head behind the ears) are all implicated in social problem solving.
The regions that contributed to social functioning in the parietal and temporal lobes were located only in the brain’s left hemisphere, while both left and right frontal lobes were involved.
“The brain networks found to be important to social adeptness were not identical to those that contribute to general intelligence or emotional intelligence, but there was significant overlap,” Barbey said.
“The evidence suggests that there’s an integrated information-processing architecture in the brain, that social problem solving depends upon mechanisms that are engaged for general intelligence and emotional intelligence,” he said.
“This is consistent with the idea that intelligence depends to a large extent on social and emotional abilities, and we should think about intelligence in an integrated fashion rather than making a clear distinction between cognition and emotion and social processing.
“This makes sense because our lives are fundamentally social — we direct most of our efforts to understanding others and resolving social conflict. And our study suggests that the architecture of intelligence in the brain may be fundamentally social, too.”
Source: University of Illinois
New research confirms that some people can handle stressful situations better than others.
The difference, according to scientists, is not a result of genes, as even identical twins show differences in how they respond to stress.
In a new study, researchers identified a specific electrical pattern in the brains of genetically identical mice that predicts how well individual animals will fare in stressful situations.
The findings, as published in Nature Communications, may eventually help researchers prevent potential consequences of chronic stress — such as post-traumatic stress disorder, depression, and other psychiatric disorders — in people who are prone to these problems.
“In soldiers, we have this dramatic, major stress exposure, and in some individuals it’s leading to major issues, such as problems sleeping or being around other people,” said senior author Kafui Dzirasa, M.D., Ph.D.
“If we can find that common trigger or common pathway and tune it, we may be able to prevent the emergence of a range of mental illnesses down the line.”
In the new study, Dzirasa’s team analyzed the interaction between two interconnected brain areas that control fear and stress responses in both mice and men: the prefrontal cortex and the amygdala.
The amygdala plays a role in the ‘fight-or-flight’ response. The prefrontal cortex is involved in planning and other higher-level functions.
It suppresses the amygdala’s reactivity to danger and helps people continue to function in stressful situations.
By implanting electrodes into the brains of the mice, the researchers could listen in on the tempo at which the prefrontal cortex and the amygdala were firing, and on how tightly the two areas were linked — with the ultimate goal of determining whether this pattern of electrical cross talk could predict how well an animal would respond when faced with an acute stressor.
Indeed, in mice that had been subjected to a chronically stressful situation — daily exposure to an aggressive male mouse for about two weeks — the degree to which the prefrontal cortex seemed to control amygdala activity was related to how well the animals coped with the stress, the group found.
Next the group looked at how the brain reacted to the first instance of stress, before the mice were put in a chronically stressful situation. The mice more sensitive to chronic stress showed greater activation of their prefrontal cortex-amygdala circuit, compared with resilient mice.
“We were really both surprised and excited to find that this signature was present in the animals before they were chronically stressed,” Dzirasa said. “You can find this signature the very first time they were ever exposed to this aggressive dangerous experience.”
Dzirasa hopes to use the signatures to come up with potential treatments for stress. “If we pair the signatures and treatments together, can we prevent symptoms from emerging, even when an animal is stressed? That’s the first question,” he said.
The group also hopes to delve further into the brain, to see whether the circuit-level patterns can interact with genetic variations that confer risk for psychiatric disorders such as schizophrenia.
Researchers anticipate the new study will help them separate stress-susceptible and resilient animals before they are subjected to stress, thus allowing them to identify molecular, cellular, and systemic differences.
Source: Duke University
A new study finds that running for only a few minutes a day, or at slow speeds, may significantly reduce a person’s risk of death from cardiovascular disease compared with not running at all.
While it is well known that exercise can improve health, authorities traditionally believed 75 minutes of vigorous-intensity exercise per week was necessary to improve cardiac function and convey health benefits.
In the study, researchers followed 55,137 adults between the ages of 18 and 100 over a 15-year period to determine whether there is a relationship between running and longevity.
Data was drawn from the Aerobics Center Longitudinal Study, where participants were asked to complete a questionnaire about their running habits.
In the study period, 3,413 participants died, including 1,217 whose deaths were related to cardiovascular disease.
In this population, 24 percent of the participants reported running as part of their leisure-time exercise.
As reported in the Journal of the American College of Cardiology, compared with non-runners, the runners had a 30 percent lower risk of death from all causes and a 45 percent lower risk of death from heart disease or stroke.
Runners on average lived three years longer compared to non-runners.
From a public health perspective, the authors concluded that promoting running is as important for reducing population-level mortality risk as preventing smoking, obesity, or hypertension.
The benefits were the same no matter how long, far, frequently, or fast participants reported running.
Benefits were also the same regardless of sex, age, body mass index, health conditions, smoking status, or alcohol use.
The study showed that participants who ran less than 51 minutes, fewer than six miles, slower than six miles per hour, or only one to two times per week had a lower risk of dying compared to those who did not run.
D.C. Lee, Ph.D., the study’s lead author, said the team found that runners who ran less than an hour per week had the same mortality benefits as runners who ran more than three hours per week.
Thus, it is possible that more may not be better in relation to running and longevity.
Researchers also looked at running behavior patterns and found that those who persistently ran over a period of six years on average had the most significant benefits, with a 29 percent lower risk of death for any reason and 50 percent lower risk of death from heart disease or stroke.
“Since time is one of the strongest barriers to participate in physical activity, the study may motivate more people to start running and continue to run as an attainable health goal for mortality benefits,” Lee said.
“Running may be a better exercise option than more moderate intensity exercises for healthy but sedentary people since it produces similar, if not greater, mortality benefits in five to 10 minutes compared to the 15 to 20 minutes per day of moderate intensity activity that many find too time consuming.”
Source: American College of Cardiology
New research presents evidence that while the impact of life’s stressors accumulates over time and accelerates cellular aging, a healthy lifestyle may counteract many of these effects.
In the new study, University of California, San Francisco researchers discovered maintaining a healthy diet, exercising, and sleeping well reduced the negative effects of life stress.
“The study participants who exercised, slept well, and ate well had less telomere shortening than the ones who didn’t maintain healthy lifestyles, even when they had similar levels of stress,” said lead author Eli Puterman, Ph.D.
“It’s very important that we promote healthy living, especially under circumstances of typical experiences of life stressors like death, caregiving, and job loss.”
The paper will be published in the journal Molecular Psychiatry.
Telomeres are the protective caps at the ends of chromosomes that affect how quickly cells age. They are combinations of DNA and proteins that protect the ends of chromosomes and help them remain stable.
As telomeres become shorter and their structural integrity weakens, cells age and die more quickly. Telomeres also shorten naturally with age.
In the study, researchers examined three healthy behaviors — physical activity, dietary intake, and sleep quality — over the course of one year in 239 post-menopausal, non-smoking women.
The women provided blood samples at the beginning and end of the year for telomere measurement and reported on stressful events that occurred during those 12 months.
In women who engaged in lower levels of healthy behaviors, there was a significantly greater decline in telomere length in their immune cells for every major life stressor that occurred during the year.
Yet women who maintained active lifestyles, healthy diets, and good quality sleep appeared protected when exposed to stress — accumulated life stressors did not appear to lead to greater shortening.
“This is the first study that supports the idea, at least observationally, that stressful events can accelerate immune cell aging in adults, even in the short period of one year.”
Researchers were excited to find that during times of high stress, keeping active, eating well, and sleeping well attenuated the accelerated aging of immune cells.
In recent years, shorter telomeres have become associated with a broad range of aging-related diseases, including stroke, vascular dementia, cardiovascular disease, obesity, osteoporosis, diabetes, and many forms of cancer.
Research on telomeres, and the enzyme that makes them, telomerase, was pioneered by three Americans, including University of California, San Francisco molecular biologist and co-author Elizabeth Blackburn, Ph.D. Blackburn co-discovered the telomerase enzyme in 1985.
The scientists received the Nobel Prize in Physiology or Medicine in 2009 for their work.
“These new results are exciting yet observational at this point.
“They do provide the impetus to move forward with interventions to modify lifestyle in those experiencing a lot of stress, to test whether telomere attrition can truly be slowed,” said Blackburn.
New research from the UK identifies a link between childhood obesity and an earlier onset of puberty.
In a new investigation, endocrinologists studied a protein called sex hormone-binding globulin (SHBG).
SHBG binds to the sex hormones androgen and estrogen; SHBG levels are initially high in childhood, then decline significantly before puberty — in essence ‘allowing’ puberty to happen.
The research team analyzed data from the EarlyBird longitudinal study of 347 schoolchildren in Plymouth, UK, aged five to 15 years.
The findings of this assessment showed that a child who is heavier at age five tends to have lower levels of SHBG throughout childhood and reaches puberty sooner.
The tendency was more striking in girls than in boys.
The study suggested that a combination of hormonal disturbances that are associated with weight gain and obesity, together with inflammation, might be the biological mechanism that explains the observed relationship between weight gain and the declining age of puberty.
It is not known why increasing body weight is associated with earlier puberty, especially in girls, but one possible explanation for this is that humans, like all mammals, require large amounts of energy to reproduce.
Throughout most of evolution, a well-nourished state would have greatly favored successful pregnancy in a world with high perinatal mortality.
Conversely, a state of poor nutrition and low body weight is disadvantageous to reproduction, slowing reproductive maturation or leading to infertility.
Deliberate weight control in female athletes and dancers, or the state of anorexia nervosa, still results in the same phenomenon of infertility.
Thus, the hormones that control appetite and body weight interact closely with those that allow fertility.
The new findings show that SHBG is part of an interaction between the body’s systems for controlling energy balance and reproduction.
The findings are of additional interest because they might go some way to answering the question of why, historically, the age of puberty has declined over the past century.
For example, the onset of puberty in girls in 1920 was 14.6 years; in 1950 13.1; in 1980 12.5; and in 2010 10.5.
In boys, puberty has always tended to occur a year or so later than in girls.
The findings also open a debate about the role of the worldwide obesity epidemic in the general lowering of the age of puberty.
The World Health Organization recognizes childhood obesity as one of the most serious global health challenges for the 21st century.
Figures from the UK’s 2012/2013 National Child Measurement Programme show that almost a third of 10- to 11-year-olds and over a fifth of four- to five-year-olds were either obese or overweight.
Alarmingly, childhood obesity also increases the risk of heart disease and diabetes in later life.
Professor Jonathan Pinkney comments, “There are critical windows early in life in which the die is cast for our long-term health. We know that weight gain often begins early and we wanted to investigate how early weight gain might be linked to earlier puberty.”
He added: “Here we have found compelling evidence that hormonal effects of obesity, and associated inflammation, affect levels of SHBG and hence the age when puberty commences. As a higher proportion of youngsters around the world have become obese, so has the age of puberty dropped. We now know that the relation between these issues is more than coincidental.”
He concluded: “These findings have significant implications for children’s development and public health around the world.
“Reduction in the age of puberty, as a result of early weight gain, expedites physical and psychosocial development at a younger age, and this potentially means an earlier ability to reproduce as well as poorer long term adult health.
“The observed effects on puberty are another reason to take action against childhood obesity”.
Source: University of Plymouth
Accurate or not, first impressions appear to stem from how a person looks.
Researchers in the Department of Psychology at the University of York determined a first impression is accurately predicted from measurements of physical features in everyday images of faces, such as those found on social media.
Investigators explain that when we look at a picture of a face we rapidly form judgments about a person’s character, for example whether they are friendly, trustworthy, or competent.
Even though it is not clear how accurate they are, these first impressions can influence our subsequent behavior (for example, judgments of competence based on facial images can predict election results).
The impressions we create through images of our faces (“avatars” or “selfies”) are becoming more and more important in a world where we increasingly get to know one another online rather than in the flesh.
Previous research has shown that many different judgments can be boiled down to three distinct “dimensions”: approachability (do they want to help or harm me?), dominance (can they help or harm me?), and youthful-attractiveness (perhaps representing whether they’d be a good romantic partner – or a rival!).
To investigate the basis for these judgments the research team took ordinary photographs from the web and analyzed physical features of the faces to develop a model that could accurately predict first impressions.
Each of 1,000 faces was described in terms of 65 different features such as “eye height”, “eyebrow width”, and so on. By combining these measures the model could explain more than half of the variation in human raters’ social judgments of the same faces.
Reversing the process it was also possible to create new cartoon-like faces that produced predictable first impressions in a new set of judges. These images also illustrate the features that are associated with particular social judgments.
The study, published in the Proceedings of the National Academy of Sciences (PNAS), shows how important faces and specific images of faces can be in creating a favorable or unfavorable first impression.
It provides a scientific insight into the processes that underlie these judgments and perhaps into the instinctive expertise of those (such as casting directors, portrait photographers, picture editors, and animators) who create and manipulate these impressions professionally.
Richard Vernon, a Ph.D. student who was part of the research team, said: “Showing that even supposedly arbitrary features in a face can influence people’s perceptions suggests that careful choice of a photo could make (or break) others’ first impressions of you.”
Fellow Ph.D. student, Clare Sutherland, said: “We make first impressions of others so intuitively that it seems effortless — I think it’s fascinating that we can pin this down with scientific models. I’m now looking at how these first impressions might change depending on different cultural or gender groups of perceivers or faces.”
Professor Andy Young, of the Department of Psychology at York, said, “Showing how these first impressions can be captured from very variable images of faces offers insight into how our brains achieve this seemingly remarkable perceptual feat.”
Dr Tom Hartley, who led the research with Professor Young, added, “In everyday life I am not conscious of the way faces and pictures of faces are influencing the way I interact with people.
“Whether in ‘real life’ or online, it feels as if a person’s character is something I can just sense. These results show how heavily these impressions are influenced by visual features of the face — it’s quite an eye opener!”
Source: University of York
Older women with mild cognitive impairment may benefit significantly from regular aerobic exercise, new findings show. Mild cognitive impairment (MCI) is an established risk factor for dementia and “represents a vital opportunity for intervening,” say Dr. Teresa Liu-Ambrose of the University of British Columbia, Canada, and colleagues in the British Journal of Sports Medicine.
Currently, 35.6 million people worldwide have dementia and this number is expected to increase to 115.4 million by the year 2050.
“Exercise is a promising strategy for combating cognitive decline by improving brain structure and function,” they write. Aerobic training in particular may benefit otherwise healthy community-dwelling older people.
They recruited 86 women aged 70 to 80 years with probable MCI. The women undertook either aerobic training (brisk walking), resistance training (lunges, squats, and weights), or balance and tone training twice a week, for six months. The balance and tone training was not strenuous exercise, and was considered the “control” group.
At the start and end of the trial, the women underwent MRI scans to measure hippocampal volume. The hippocampus plays important roles in short-term and long-term memory and in spatial navigation, and appears to be very sensitive to the effects of aging and neurological damage. Tests were also given to measure verbal memory and learning.
Compared with the balance and tone “control” group, aerobic training significantly improved left, right, and total hippocampal volumes, the team reports. “We observed a 5.6 percent increase in the left hippocampus, a 2.5 percent increase in the right hippocampus, and a four percent increase in the total hippocampus,” they write.
But they add that there was “some evidence” that increased left hippocampal volume was linked with poorer verbal memory. However, in earlier studies, increased left hippocampal volume has been linked to better performance on verbal memory tests.
“The relationship between brain volume and cognitive performance is complex, and requires further research,” say the authors.
“We might have assumed a one percent gain in hippocampal volume should improve verbal learning memory by one percent, but our results suggest that it may not be that simple,” said Dr. Liu-Ambrose. “There may be other factors we are not considering.”
Limitations of this study include the lack of male participants and of those aged below 70 or over 80. But the authors do recommend regular aerobic exercise to help prevent mild cognitive decline, in addition to its many other health benefits.
They conclude, “Aerobic training significantly increased hippocampal volume in older women with probable MCI. More research is needed to ascertain the relevance of exercise-induced changes in hippocampal volume on memory performance in older adults with MCI.”
“The degree of benefit in terms of brain structure might actually be greater in people with early functional complaints than in healthy older people,” the team adds. “Understanding the effect of exercise on the hippocampus will increase our appreciation of the role exercise may play in dementia prevention,” they conclude.
Our understanding of the impact of exercise on MCI would now benefit from studies with more participants, as well as a focus on the different MCI subtypes (single-domain versus multidomain MCI).
The intensity of aerobic exercise performed may not be crucial, according to a 2012 study. Dr. Silvia Varela of the University of Vigo, Spain, and colleagues looked at the effects of aerobic exercise at two different intensities on 48 elderly people with MCI living in care homes.
Aerobic exercise at 40 percent of heart rate reserve had effects after three months similar to those of aerobic exercise at 60 percent of heart rate reserve. Both led to “marginal improvements” in cognitive level, as measured by the Mini Mental State Examination, and in functional ability, measured by the Timed Up and Go test.
“No statistically significant differences were found at any time during the evaluation regarding cognitive level and functional autonomy,” write the researchers in the journal Clinical Rehabilitation. “Intensity does not seem to be a determining factor when aerobic exercise is performed by people with MCI.”
Potential mechanisms behind the cognition-enhancing effects of aerobic exercise have been investigated in animal research. They include beneficial effects on neuronal function, neuroinflammation, hormonal responses to stress, and the amount of amyloid in the brain. Amyloid deposits raise the risk of Alzheimer’s disease as well as brain hemorrhages.
Of course exercise also has positive effects on physiological processes such as cardiovascular health and glucose regulation that, when compromised, increase the risk of developing cognitive impairment and dementia.
Ten Brinke, L. F., et al. Aerobic exercise increases hippocampal volume in older women with probable mild cognitive impairment: a six-month randomised controlled trial. British Journal of Sports Medicine, 9 April 2014. doi:10.1136/bjsports-2013-093184
Varela, S., et al. Effects of two different intensities of aerobic exercise on elderly people with mild cognitive impairment: a randomized pilot study. Clinical Rehabilitation, May 2012. doi:10.1177/0269215511425835
Researchers have found the practice of educating children with special needs in regular classes helps to improve the language skills of preschoolers with disabilities.
Researchers found that the average language skills of a child’s classmates in the fall significantly predicted the child’s language skills in the spring — especially for children with disabilities.
The results support inclusion policies in schools that aim to have students with disabilities in the same classrooms alongside their typically developing peers.
“Students with disabilities are the ones who are affected most by the language skills of the other children in their class,” said Laura Justice, co-author of the study and professor of teaching and learning at Ohio State University.
“We found that children with disabilities get a big boost in their language scores over the course of a year when they can interact with other children who have good language skills.”
In fact, after one year of preschool, children with disabilities had language skills comparable to children without disabilities when surrounded by highly skilled peers in their classroom.
“The biggest problem comes when we have a classroom of children with disabilities with no highly skilled peers among them,” Justice said. “In that case, they have limited opportunity to improve their use of language.”
The study, which will appear in the journal Psychological Science, involved 670 preschool-aged children enrolled in 83 early childhood special education classrooms in Ohio.
About half of the children had an Individualized Education Plan, signaling the presence of a disability. Between 25 and 100 percent of children in each classroom had a disability.
All children’s language skills were measured in the fall and spring of the academic year with a commonly used test called the Descriptive Pragmatics Profile.
The average score of all children in an individual classroom was used to determine each child’s relative status in terms of language development, and whether their classmates were more highly skilled, less skilled, or average.
While all children’s language skills were affected somewhat by the skill levels of their classmates, the effect was strongest for those with disabilities, the study found.
For those children with disabilities who were in classrooms with the most highly skilled peers, language scores in the spring were about 40 percent better than those of children with disabilities who were placed with the lowest-ranked peers.
Students who had no disabilities showed about a 27 percent difference in scores between those with the highest-ranked peers and the lowest-ranked peers.
“This study, like others, finds that the most highly skilled students are the ones whose language improvement is least affected by the skill of their classmates,” Justice said.
“The highly skilled children aren’t hurt by being in classrooms with children who have disabilities,” she said.
“But children with disabilities are vulnerable if they aren’t placed with more highly skilled peers.”
Justice said she and her colleagues are currently doing research that directly compares the effects teachers have on language development versus the effect of peers.
Early results suggest teachers matter most, “but peers definitely have an impact on language development,” she said.
Peers help because they spend more time one-on-one with their fellow classmates than teachers can. Children with disabilities have the opportunity to observe, imitate, and model the language use of their peers who do not have disabilities.
“In a sense, the typically developing children act as experts who can help their classmates who have disabilities,” Justice said.
Statistics from the U.S. Department of Education show that more than half of preschoolers with disabilities are enrolled in early childhood classrooms with typically developing peers.
Justice said these results suggest that all preschoolers with disabilities would benefit from inclusion policies.
“We have to give serious thought to how we organize our classrooms to give students with disabilities the best chance to succeed,” she said.
Source: Ohio State University
An emerging theory may help to explain why older adults show declining cognitive ability with age, but don’t necessarily show declines in the workplace or daily life.
Dr. Tom Hess, a psychology researcher at North Carolina State University, believes older adults are good at prioritizing their attention and use this skill when dealing with tasks that they consider meaningful.
“My research team and I wanted to explain the difference we see in cognitive performance in different settings,” says Hess.
“For example, laboratory tests almost universally show that cognitive ability declines with age, so you would expect older adults to perform worse in situations that rely on such abilities, such as job performance — but you don’t.
“Why is that? That’s what this theoretical framework attempts to address.”
Hess developed the framework — “selective engagement” — based on years of study on the psychology of aging.
Hess’ findings, “Selective Engagement of Cognitive Resources: Motivational Influences on Older Adults’ Cognitive Functioning,” are published online in the journal Perspectives on Psychological Science.
Hess believes the issue is best discussed from a cognitive performance versus cognitive functioning perspective.
Both views deal with cognition, which is an individual’s ability to focus on complex mental tasks, switch between tasks, tune out distractions, and retain a good working memory.
However, cognitive performance generally refers to how people fare under test conditions, whereas cognitive functioning usually refers to an individual’s ability to deal with mental tasks in daily life.
“There’s a body of work in psychology research indicating that performing complex mental tasks is more taxing for older adults,” Hess says.
“This means older adults have to work harder to perform these tasks. In addition, it takes older adults longer to recover from this sort of exertion.
“As a result, I argue that older adults have to make decisions about how to prioritize their efforts.”
This is where selective engagement comes in.
The idea behind the theory is that older adults are more likely to fully commit their mental resources to a task if they can identify with the task or consider it personally meaningful.
This would explain the disparity between cognitive performance in experimental settings and cognitive functioning in the real world.
“This first occurred to me when my research team saw that cognitive performance seemed to be influenced by how we framed the tasks in our experiments,” Hess says.
“Tasks that people found personally relevant garnered higher levels of cognitive performance than more abstract tasks.”
Hess next hopes to explore the extent to which selective engagement is reflected in the daily life of older adults and the types of activities they choose to engage in.
“This would not only further our understanding of cognition and aging, it may also help researchers identify possible interventions to slow declines in cognitive functioning,” Hess says.
Source: North Carolina State University
New research suggests that the emotional connections and desires established on a first date determine the fate of a potential relationship.
Responsiveness, or the support for another’s needs and goals, may be one of those initial “sparks” necessary to fuel sexual desire and land a second date.
However, it may not be equally desirable to men and women on a first date.
A study published in Personality and Social Psychology Bulletin investigates whether responsiveness increases sexual desire in the other person, and whether that perception varies by gender.
Researchers from the Interdisciplinary Center (IDC) Herzliya, the University of Rochester, and the University of Illinois at Urbana-Champaign, collaborated on three studies to observe people’s perceptions of responsiveness.
People often say that they seek a partner who is “responsive to their needs,” and that such a partner would arouse their sexual interest.
“Sexual desire thrives on rising intimacy and being responsive is one of the best ways to instill this elusive sensation over time,” lead researcher Gurit Birnbaum explains.
“Our findings show that this does not necessarily hold true in an initial encounter, because a responsive potential partner may convey opposite meanings to different people.”
In the first study, the researchers examined whether responsiveness is perceived as feminine or masculine, and whether men or women perceived a responsive person of the opposite sex as sexually desirable.
Men who perceived female partners as more responsive also perceived them as more feminine, and more attractive.
However, the association between responsiveness and a male partner’s masculinity was not significant for women.
Women’s perceptions of partner responsiveness were marginally and negatively associated with perceptions of partner attractiveness.
Participants in the second study were asked to interact with a responsive or non-responsive individual of the opposite sex, and view that individual’s photo (the same photo was given to each participant).
They were then asked to interact online with this individual, and discuss details on a current problem in their life.
The responsiveness of the virtual individual was manipulated, for example, “You must have gone through a very difficult time” as a responsive reply, versus “Doesn’t sound so bad to me” as a non-responsive reply.
Men who interacted with a responsive female individual perceived her as more feminine and as more sexually attractive than did men in the unresponsive condition.
Women are more cautious than men when interpreting a stranger’s expressions of responsiveness. Their perceptions of the stranger, which were seemingly unaffected by perceived responsiveness, may reflect conflicting trends among different women.
“Some women, for example, may interpret responsiveness negatively and feel uncomfortable about a new acquaintance who seems to want to be close.
Such feelings may impair sexual attraction to this responsive stranger. Other women may perceive a responsive stranger as warm and caring and therefore as a desirable long-term partner,” Dr. Birnbaum elaborates.
The third and final study tested the possibility that responsiveness may activate motivational mechanisms for men that fuel pursuit of either short-term or long-term sexual relationship opportunities.
A female partner’s actual responsiveness led men to perceive her as more feminine, and consequently to feel more sexually aroused.
Heightened sexual arousal, in turn, was linked to both increased perception of partner attractiveness and greater desire for a long-term relationship with that partner.
The findings of the study imply that whether a responsive partner will be seen as sexually desirable or not depends on the context and meaning assigned to responsiveness.
In early dating, the meaning of responsiveness is likely shaped by gender-specific expectations.
Women did not perceive a responsive man as less masculine, but even so, women did not find a responsive man more attractive, either.
The study helps to explain why men find responsive women sexually attractive, but does not reveal the mechanism that underlies women’s desire for new acquaintanceships.
“We still do not know why women are less sexually attracted to responsive strangers; it may not necessarily have to do with ‘being nice.’
“Women may perceive a responsive stranger as less desirable for different reasons,” Prof. Birnbaum cautions.
“Women may perceive this person as inappropriately nice and manipulative (i.e., trying to obtain sexual favors) or eager to please, perhaps even as desperate, and therefore less sexually appealing.
Alternatively, women may perceive a responsive man as vulnerable and less dominant. Regardless of the reasons, perhaps men should slow down if their goal is to instill sexual desire.”
A new study demonstrates that childhood abuse affects the way genes are activated, thereby influencing a child’s long-term development.
Previous studies focused on how a particular child’s individual characteristics and genetics interacted with that child’s experiences in an effort to understand how health problems emerge.
In the new study, researchers were able to measure the degree to which genes were turned “on” or “off” through a biochemical process called methylation.
This new technique reveals the ways that nurture changes nature — that is, how our social experiences can change the underlying biology of our genes.
The study is found in the journal Child Development.
Sadly, nearly one million children in the United States are neglected or abused every year.
Researchers at the University of Wisconsin, Madison found an association between the kind of parenting children had and a particular gene (called the glucocorticoid receptor gene) that’s responsible for crucial aspects of social functioning and health.
Not all genes are active at all times. DNA methylation is one of several biochemical mechanisms that cells use to control whether genes are turned on or off. The researchers examined DNA methylation in the blood of 56 children ages 11 to 14.
Half of the children had been physically abused.
They found that compared to the children who hadn’t been maltreated, the maltreated children had increased methylation on several sites of the glucocorticoid receptor gene, also known as NR3C1, echoing the findings of earlier studies of rodents.
In this study, the effect occurred on the section of the gene that’s critical for nerve growth factor, which is an important part of healthy brain development.
There were no differences in the genes that the children were born with, the study found. Instead, the differences were seen in the extent to which the genes had been turned on or off.
“This link between early life stress and changes in genes may uncover how early childhood experiences get under the skin and confer lifelong risk,” notes Seth D. Pollak, professor of psychology and pediatrics at the University of Wisconsin, Madison, who directed the study.
Previous studies have shown that children who have experienced physical abuse, sexual abuse, and neglect are more likely to develop mood, anxiety, and aggressive disorders, as well as to have problems regulating their emotions.
These problems, in turn, can disrupt relationships and affect school performance. Maltreated children are also at risk for chronic health problems such as cardiac disease and cancer. The current study helps explain why these childhood experiences can affect health years later.
The gene identified by the researchers affects the hypothalamic-pituitary-adrenal (HPA) axis in rodents.
Disruptions of this system in the brain would make it difficult for people to regulate their emotional behavior and stress levels. Through hormones circulating in the blood, the gene also affects the immune system, leaving individuals less able to fight off germs and more vulnerable to illness.
“Our finding that children who were physically maltreated display a specific change to the glucocorticoid receptor gene could explain why abused children have more emotional difficulties as they age,” according to Pollak.
“They may have fewer glucocorticoid receptors in their brains, which would impair the brain’s stress-response system and result in problems regulating stress.”
The findings have implications for designing more effective interventions for children, especially since studies of animals indicate that the effects of poor parenting on gene methylation may be reversible if caregiving improves.
New research finds a key element in the treatment of overweight and obese preschoolers is parental involvement.
Investigators discovered that traditional approaches to overweight prevention and treatment, which focus only on the child, are outdated; interventions targeting both parent and child are more effective.
The research, conducted at the University at Buffalo and Women and Children’s Hospital of Buffalo, is published in the journal Pediatrics.
Children enrolled in the study were overweight or obese and had one parent who participated in the study who also was overweight or obese.
During the course of the study, children who were treated concurrently with a parent experienced more appropriate weight gain while growing normally in height.
Children in the intervention group gained an average of 12 pounds over 24 months compared to children in the control group who gained almost 16 pounds.
This more appropriate weight accrual resulted in a decrease of 0.21 in percent-over-body mass index (BMI) from baseline to 24 months.
Parents in the intervention group lost an average of 14 pounds, resulting in a BMI decrease of over two units while the weight of parents in the control group was essentially unchanged.
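As a rough sketch of the arithmetic behind the parents’ result: the 14-pound loss is from the article, but the height and starting weight below are assumptions for illustration, not figures from the study.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

LB_PER_KG = 2.20462       # pounds per kilogram

height_m = 1.68           # assumed adult height (about 5'6"), not from the study
start_kg = 90.0           # assumed starting weight, not from the study
loss_kg = 14 / LB_PER_KG  # the study's average 14-pound parental loss

# BMI change from the weight loss alone (height is unchanged)
drop = bmi(start_kg, height_m) - bmi(start_kg - loss_kg, height_m)
print(round(drop, 2))  # a decrease of just over two BMI units
```

For an adult of roughly average height, a 14-pound loss does indeed work out to a BMI decrease of a bit over two units, consistent with the figure reported for parents in the intervention group.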
“This study is important because while we know that it is critical to begin treating overweight or obese children early, there has been limited data on what works best in preschool-aged children,” says Teresa A. Quattrin, MD, senior author.
The research was part of Buffalo Healthy Tots, a novel family-based, weight control intervention in preschool children that Quattrin directed in urban and suburban pediatric practices in Western New York.
The intervention was the first of its kind in the U.S. to compare traditional approaches where only the child is treated to family-based, behavioral treatment implemented in pediatric primary care practices.
The study of 96 children ages two to five found that when overweight and obese youth and their parents were treated in a primary care setting with behavioral intervention, parents and children experienced greater decreases in BMI than did the children who received the traditional treatment, focusing only on the child.
Weight loss for both parent and child was sustained after a 12-month follow-up.
Quattrin notes that an important feature of the study was the use of practice enhancement assistants, trained in psychology, nutrition, or exercise science.
These assistants worked with the families both during treatment and education sessions and afterward by phone.
The intervention was delivered through the parents, who were instructed about the appropriate number of food servings for children and appropriate calorie values.
They were taught to avoid “high-energy” foods, such as those with high sugar content, more than five grams of fat per serving, or artificial sweeteners.
Parents monitored the number of servings in each food category, using a simple diary to cross off icons pertaining to the food consumed or type of physical activity performed.
Parents also were taught to record their own and their child’s weight on a simple graph.
Weight loss goals were 0.5 to one pound per week for children and at least one pound per week for parents.
Quattrin says that the study results suggest that overweight or obese children and their parents can be successfully treated in the primary care setting with the assistance of practice enhancers.
“Instead of the more traditional approach of referring these patients to a specialty clinic, the patient-centered medical home in the pediatrician’s office may be an ideal setting for implementing these family-based treatments,” she says.
“We have entered a new era where students, trainees, and specialists have to learn how to better interact with primary care providers and implement care coordination.
This paper suggests that, indeed, family-based strategies for any chronic disorder, including obesity, can be successful in primary care. The pediatrician’s office can become a ‘family-centered medical home.’”
Source: University at Buffalo
New research suggests material items designed to create or enhance an experience can make shoppers just as happy as life experiences themselves.
San Francisco State University researchers discovered the products satisfy a different, but equally powerful, psychological need than experiential purchases.
While life experiences help consumers feel closer to others, products such as books, sporting goods, video games, or musical instruments allow them to utilize and develop new skills and knowledge, resulting in similar levels of happiness.
The study sheds additional light on how consumers can best spend their discretionary income to improve their well-being and fills a crucial gap in previous research, which had not examined the effects of experiential products on happiness.
“This is sort of good news for materialists,” said Ryan Howell, an associate professor of psychology at San Francisco State University and co-author of the study.
“If your goal is to make yourself happier but you’re a person who likes stuff, then you should buy things that are going to engage your senses. You’re going to be just as happy as if you buy a life experience, because in some sense this product is going to give you a life experience.”
Years of research consistently have shown that purchasing life experiences, such as tickets to a play or a vacation, will make shoppers happier than material products such as clothes, jewelry, or accessories.
“But by focusing on those two extremes,” Howell said, “psychologists have ignored the middle of the buying spectrum, leaving out a large number of items that are tangible but are nevertheless designed to engage users in some way.”
Howell and lead author Darwin Guevarra asked consumers about a recent purchase and how happy that purchase made them.
Expecting that material items would provide the smallest happiness boost and life experiences the largest, with experiential products falling in the middle, they were surprised to find that experiential products actually provided the same level of happiness as experiences.
To learn why, they next looked at whether the purchases satisfied any of three key psychological needs: identity expression (the purchase reflects the consumer’s true values); competence (the purchase allows the consumer to utilize skills and knowledge); and relatedness (the purchase brings the consumer closer to others).
The results showed that, while experiential products and life experiences offered similar levels of identity expression, the former were best at providing competence and the latter best at providing relatedness.
“They are essentially two different routes to the same well-being,” Howell said.
“If you’re not feeling very competent, the best way to alleviate that deprivation would be through the use of experiential products.
On the other hand, if you’re feeling lonely, you should buy life experiences and do things with others.”
“The ideal products for happiness,” he added, “may be those that simultaneously satisfy both needs, such as a board game you play with others or going to the museum with friends.”
Because increased happiness is linked to a variety of individual and societal benefits, including better health and longer life, Howell hopes to develop intervention methods that can help researchers steer individuals who have materialistic buying tendencies toward purchases that improve their happiness.
Source: San Francisco State University
Babies in the womb begin to respond to the rhythm of a familiar nursery rhyme by 34 weeks gestational age and are able to remember a set rhyme just before birth, according to new research by the University of Florida. The study also highlights the important role of the mother’s voice in the baby’s learning capabilities.
For the study, published in the journal Infant Behavior and Development, pregnant women recited a rhyme to their unborn babies three times a day for six weeks, starting at 28 weeks of pregnancy, the beginning of the third trimester.
“The mother’s voice is the predominant source of sensory stimulation in the developing fetus,” said nursing researcher Charlene Krueger, an associate professor in the UF College of Nursing.
“This research highlights just how sophisticated the third trimester fetus really is and suggests that a mother’s voice is involved in the development of early learning and memory capabilities. This could potentially affect how we approach the care and stimulation of the preterm infant.”
The researchers recruited 32 women (ages 18 to 39) who were in the 28th week of their first pregnancy. Overall, 68 percent of the women were white, 28 percent were black, and four percent were of another race or ethnicity. The participants were randomly assigned to either an experimental or a control group.
From 28 to 34 weeks of pregnancy, all mothers in the study recited a particular passage or nursery rhyme three times a day and then came in for testing at 28, 32, 33, and 34 weeks’ gestation. To determine whether the fetus could remember the pattern of speech, all mothers were asked to stop speaking the passage at 34 weeks, and the fetuses were tested again at 36 and 38 weeks.
To test the babies’ responses, researchers used a fetal heart monitor to record heart rate and detect any changes. A small heart rate deceleration in the fetus is considered a sign that the baby is familiar with a stimulus.
During the experiment, the fetuses were played a recording of the same rhyme their mother had been reciting at home but spoken by a female stranger. Those in the control group heard an unfamiliar rhyme spoken by a stranger. This was to determine if the fetus was responding simply to its mother’s voice or to a familiar pattern of speech, said Krueger.
The findings showed that the fetus’ heart rate began to respond to the familiar rhyme recited by a stranger’s voice by 34 weeks of gestational age — at this point the mother had spoken the rhyme out loud at home for six weeks. The babies continued to respond with a small cardiac deceleration for up to four weeks after the mothers had stopped reciting the rhyme, until about 38 weeks.
At 38 weeks, there was a statistically significant difference between the two fetus groups — the experimental group who heard the original rhyme responded with a deeper and more sustained cardiac deceleration, whereas the control group who heard a new rhyme experienced a cardiac acceleration.
“This study helped us understand more about how early a fetus could learn a passage of speech and whether the passage could be remembered weeks later even without daily exposure to it,” Krueger said.
“This could have implications to those preterm infants who are born before 37 weeks of age and the impact an intervention such as their mother’s voice may have on influencing better outcomes in this high-risk population.”
Source: University of Florida
A simple test that measures how fast people walk and whether they have any cognitive complaints can predict the likelihood of developing dementia, according to a new study.
The study, involving about 27,000 older adults on five continents, found that nearly one in 10 met the criteria for pre-dementia based on this simple test. Those who tested positive for pre-dementia were twice as likely to develop dementia within 12 years, according to researchers at the Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center in the Bronx.
The test diagnoses motoric cognitive risk syndrome (MCR). Testing for the newly described syndrome involves measuring gait speed — our manner of walking — as well as asking a few simple questions about a patient’s cognitive abilities, both of which take just seconds, the researchers explained.
Because the test doesn’t rely on technology, it can be easily done in a clinical setting, allowing a diagnosis in the early stages of dementia, according to the researchers.
Early diagnosis is critical because it allows time to identify and possibly treat the underlying causes of the disease, which may delay or even prevent the onset of dementia in some cases, researchers noted.
“In many clinical and community settings, people don’t have access to the sophisticated tests — biomarker assays, cognitive tests or neuroimaging studies — used to diagnose people at risk for developing dementia,” said Joe Verghese, M.B.B.S., a professor in the Saul R. Korey Department of Neurology and of medicine at Einstein and chief of geriatrics at Einstein and Montefiore.
“Our assessment method could enable many more people to learn if they’re at risk for dementia, since it avoids the need for complex testing and doesn’t require that the test be administered by a neurologist.
“The potential payoff could be tremendous — not only for individuals and their families, but also in terms of healthcare savings for society. All that’s needed to assess MCR is a stopwatch and a few questions, so primary care physicians could easily incorporate it into examinations of their older patients.”
According to the Centers for Disease Control and Prevention, up to 5.3 million Americans — about one in nine people 65 years and over — have Alzheimer’s disease, the most common type of dementia. That number is expected to more than double by 2050 due to the population aging, researchers said.
“As a young researcher, I examined hundreds of patients and noticed that if an older person was walking slowly, there was a good chance that his cognitive tests were also abnormal,” said Verghese. “This gave me the idea that perhaps we could use this simple clinical sign — how fast someone walks — to predict who would develop dementia.”
“In a 2002 New England Journal of Medicine study, we reported that abnormal gait patterns accurately predict whether people will go on to develop dementia. MCR improves on the slow gait concept by evaluating not only patients’ gait speed, but also whether they have cognitive complaints.”
The new study, published in Neurology, the medical journal of the American Academy of Neurology, reported on the prevalence of MCR among 26,802 adults, 60 and older, without dementia or disability enrolled in 22 studies in 17 countries. About 9.7 percent met the criteria for MCR, including an abnormally slow gait and cognitive complaints.
While the syndrome was equally common in men and women, highly educated people were less likely to test positive for MCR compared with less-educated individuals, the researchers noted.
A slow gait is a walking speed slower than about one meter per second, which is about 2.2 miles per hour. Less than 0.6 meters per second (or 1.3 mph) is “clearly abnormal,” according to Verghese.
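Sketched in code, the unit conversion and the article’s approximate cut-offs look like this (the function names and category labels are illustrative, not part of any clinical tool):

```python
def ms_to_mph(speed_ms):
    """Convert walking speed from meters per second to miles per hour."""
    return speed_ms * 3600 / 1609.344  # 1 mile = 1609.344 meters

def gait_category(speed_ms):
    """Classify gait speed using the article's approximate cut-offs."""
    if speed_ms < 0.6:   # below roughly 1.3 mph
        return "clearly abnormal"
    if speed_ms < 1.0:   # below roughly 2.2 mph
        return "slow"
    return "normal"

print(round(ms_to_mph(1.0), 1))  # ~2.2 mph, matching the article's figure
```

Note that one meter per second converts to about 2.24 mph exactly; the article rounds it to 2.2.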
To test whether MCR predicts future dementia, the researchers focused on four of the 22 studies. These tested 4,812 people for MCR and then evaluated them annually over an average follow-up period of 12 years to see which ones developed dementia. Those who met the criteria for MCR were nearly twice as likely to develop dementia over the following 12 years compared with people who did not, the researchers discovered.
Verghese emphasizes that a slow gait alone is not sufficient for a diagnosis of MCR.
“Walking slowly could be due to conditions such as arthritis or an inner ear problem that affects balance, which would not increase risk for dementia,” he explained. “To meet the criteria for MCR requires having a slow gait and cognitive problems. An example would be answering ‘yes’ to the question, ‘Do you think you have more memory problems than other people?’”
“For patients meeting MCR criteria, the next step is to look for the causes of their slow gait and cognitive complaints,” Verghese said. “The search may reveal underlying — and controllable — problems,” he noted.
“Evidence increasingly suggests that brain health is closely tied to cardiovascular health — meaning that treatable conditions such as hypertension, smoking, high cholesterol, obesity, and diabetes can interfere with blood flow to the brain and thereby increase a person’s risk for developing Alzheimer’s and other dementias,” he said.
But what if an underlying problem can’t be found?
“Even in the absence of a specific cause, we know that most healthy lifestyle factors, such as exercising and eating healthier, have been shown to reduce the rate of cognitive decline,” said Verghese.
“In addition, our group has shown that cognitively stimulating activities — playing board games, card games, reading, writing, and also dancing — can delay dementia’s onset. Knowing they’re at high risk for dementia can also help people and their families make arrangements for the future, which is an aspect of MCR testing that I’ve found is very important in my own clinical practice.”
Early life experiences, such as childhood socioeconomic status and literacy, may have greater influence on the risk of cognitive impairment later in life than demographic characteristics such as race and ethnicity, according to new research.
The new study from researchers at the University of California Davis Alzheimer’s Disease Center and the University of Victoria, Canada, challenges earlier research that suggests a link between race and ethnicity, particularly among Latinos, and an increased risk of cognitive impairment and dementia later in life.
“Declining cognitive function in older adults is a major personal and public health concern,” said Bruce Reed, a professor of neurology and associate director of the University of California Davis Alzheimer’s Disease Center.
“But not all people lose cognitive function, and understanding the remarkable variability in cognitive trajectories as people age is of critical importance for prevention, treatment, and planning to promote successful cognitive aging and minimize problems associated with cognitive decline.”
For their research, the scientists recruited more than 300 men and women, all 60 years old or older. Recruited from senior citizen recreational and residential centers, as well as churches and health-care settings, the seniors had no major psychiatric illnesses or life threatening medical illnesses. Participants were Caucasian, African-American, or Hispanic and spoke either English or Spanish.
Testing included multidisciplinary diagnostic evaluations through the University of California Davis Alzheimer’s Disease Center in either English or Spanish, according to the researchers.
Consistent with previous research, the study found that non-Latino Caucasians scored 20 to 25 percent higher on tests of semantic memory — general knowledge — and 13 to 15 percent higher on tests of executive functioning compared to the other ethnic groups.
However, ethnic differences in executive functioning disappeared and differences in semantic memory were reduced by 20 to 30 percent when group differences in childhood socioeconomic status, adult literacy, and the extent of physical activity during adulthood were considered, the researchers discovered.
“This study is unusual in that it examines how many different life experiences affect cognitive decline in late life,” said Dan Mungas, professor of neurology and associate director of the University of California Davis Alzheimer’s Disease Research Center.
“It shows that variables like ethnicity and years of education that influence cognitive test scores in a single evaluation are not associated with rate of cognitive decline, but that specific life experiences like level of reading attainment and intellectually stimulating activities are predictive of the rate of late-life cognitive decline. This suggests that intellectual stimulation throughout the life span can reduce cognitive decline in old age.”
Regardless of ethnicity, advanced age and apolipoprotein-E (APOE) genotype were associated with increased cognitive decline over the four years the participants were followed. APOE is the largest known genetic risk factor for late-onset Alzheimer’s, according to the researchers.
Less decline was experienced by people who reported more engagement in recreational activities in late life and who maintained their levels of activity from middle age to old age, the researchers found.
Single-word reading — the ability to decode a word on sight, which often is considered an indication of quality of educational experience — was also associated with less cognitive decline, a finding that was true for both English and Spanish readers, irrespective of their race or ethnicity, according to the study. These findings suggest that early life experiences affect late-life cognition indirectly, through literacy and late-life recreational pursuits, the researchers said.
“These findings are important because they challenge earlier research that suggests associations between race and ethnicity, particularly among Latinos, and an increased risk of late-life cognitive impairment and dementia,” explained Paul Brewster, lead author of the study, a doctoral student at the University of Victoria, Canada, and a pre-doctoral psychology intern at the University of California San Diego Department of Psychiatry.
“Our findings suggest that the influences of demographic factors on late-life cognition may be reflective of broader socioeconomic factors, such as educational opportunity and related differences in physical and mental activity across the life span.”
The study, “Life Experiences and Demographic Influences on Cognitive Function in Older Adults,” was published in Neuropsychology, a journal of the American Psychological Association.
New research has found that when people have to choose between two or more equally positive outcomes, they often experience paradoxical feelings of pleasure and anxiety — feelings associated with activity in different regions of the brain.
A series of experiments led by Amitai Shenhav, an associate research scholar at the Princeton Neuroscience Institute at Princeton University, found evidence of parallel brain activity in people asked to make decisions on a variety of products.
In one experiment, for example, 42 people were asked to rate the desirability of more than 300 products using an auction-like procedure. They then looked at images of paired products with different or similar values and were asked to choose between them.
Their brain activity was scanned using functional magnetic resonance imaging (fMRI). After the scan, they were asked to report their feelings before and during each choice. They then received one of their choices at the end of the study.
The study found that choices between two highly valued items, such as a digital camera and a camcorder, were associated with the most positive feelings and the greatest anxiety, compared with choices between items of low value, like a desk lamp and a water bottle, or between items of different values.
Functional MRI scans showed activity in two regions of the brain, the striatum and the prefrontal cortex, both known to be involved in decision-making.
According to the findings, lower parts of both regions were more active when subjects felt excited about being offered the choice, while activity in upper parts was strongly tied to feelings of anxiety.
This evidence that parallel brain circuits are associated with opposing emotional reactions helps to answer a puzzling question, according to Shenhav. “Why isn’t our positivity quelled by our anxiety, or our anxiety quelled by the fact that we’re getting this really good thing at the end?
“This suggests that it’s because these circuits evolved for two different reasons. One of them is about evaluating the thing we’re going to get, and the other is about guiding our actions and working out how difficult the choice will be.”
A second fMRI experiment showed that the same patterns of emotional reactions and brain activity persisted even when the participants were told before each choice how similarly they had valued the items. Their anxiety didn’t abate, despite knowing how little they stood to lose by making a “wrong” choice, he noted.
In a third experiment, Shenhav and Randy Buckner, a professor of psychology and neuroscience at Harvard University and the study’s senior author, tested whether giving people more than two choices increased their levels of anxiety.
It did — the researchers found that providing six options led to higher levels of anxiety than two options, particularly when all six of the options were highly valued items. But positive feelings about being presented with the choice were similar for two or six options, they noted.
This suggests that the anxiety stems from the conflict of making the decision, rather than the opportunity cost of the choice — an economic concept that refers to the lost value of the second-best option. The opportunity cost should be the same, regardless of the number of choices, the researchers noted.
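The researchers' reasoning can be illustrated with a rough sketch (the numbers here are hypothetical desirability ratings, not data from the study): the opportunity cost of a choice is the value of the second-best option you give up, so it stays the same whether two or six options are on the table, while the conflict of deciding grows with the number of comparably valued alternatives.

```python
def opportunity_cost(values):
    """Value of the second-best option, i.e. what is forgone
    by choosing the best one."""
    ranked = sorted(values, reverse=True)
    return ranked[1]

# Hypothetical ratings: two high-value items vs. six items
# whose top two are the same high-value pair.
two_options = [9, 8]
six_options = [9, 8, 3, 2, 2, 1]

# The forgone value is identical in both cases, so by an
# opportunity-cost account, anxiety should not differ --
# yet the study found it rose with six options.
print(opportunity_cost(two_options))   # 8
print(opportunity_cost(six_options))   # 8
```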
In addition, subjects in this final study were given an unlimited amount of time to make a decision, compared with 1.5 seconds in the first two studies. The results showed that time pressure was not the main source of anxiety during the choices, according to the researchers.
At the end of each study, participants had a surprise opportunity to reverse their earlier choices. Higher activity in a part of the brain called the anterior cingulate cortex around the time of an initial choice predicted whether that decision would later be reversed, according to the study’s findings.
Previous work has shown that this brain region is involved in assessing how conflicted an individual feels over a particular choice. This suggests that some choices may continue to elicit conflict even after the participant has made a decision, Shenhav said.
The researchers also found that people who reported more anxiety in their daily lives were more likely to change their minds.
According to Shenhav, this research could shed light on the neural processes that can make momentous choices so paralyzing for some people — for instance, deciding where to go to college or which job offer to take.
But he admits that even more trivial decisions can be tough for him.
“I probably experience more win-win choice anxiety than the average person,” he said. “I’m even terrible at choosing where to eat dinner.”
The study was published in the Proceedings of the National Academy of Sciences.
Source: Princeton University
Leaving the television on when no one is watching negatively affects children’s learning and development, according to a new study that investigated the effects of television on children’s social and emotional development.
Specifically, the University of Iowa (UI) researchers found that the background noise of TV diverts a child’s attention away from playing and learning. The researchers also found that watching non-educational programs hinders children’s cognitive development as well.
“Kids are going to learn from whatever you put in front of them,” says lead author Deborah Linebarger, associate professor in education at UI. “So what kinds of messages, what kinds of things do you want them to learn? That would be the kinds of media you’d purposefully expose them to.”
The findings, published in the Journal of Developmental & Behavioral Pediatrics, are based on a national survey of more than 1,150 families with children between two and eight years old. The researchers examined family demographics, parenting styles, media use, and how those factors could impact kids’ future success.
The team found a connection between the content children are exposed to and their executive cognitive functioning. This was especially strong among children in families identified as high risk — such as families living in poverty or families whose parents have little education.
Yet even children from high-risk families who watched educational television saw increases in executive function, the researchers found.
“Regardless of family demographics, parenting can act as a buffer against the impacts of background TV,” said the researchers.
“Children whose parents create a home environment that is loving and nurturing and where rules and expectations are the same from one time to another are better able to control their behavior, display more empathy, and do better academically,” Linebarger says.
In particular, she suggests that parents be mindful of what shows their children are watching, especially the content of the programs.
“Sit down to watch a particular show and when it’s done, turn it off,” she says.
In a previous study, Linebarger and another team found that children, on average, are exposed to nearly four hours of background TV each day. One of the sneaky effects of background TV is that it pulls children’s attention away from other activities, such as playing and learning.
Source: University of Iowa
In the largest genome study ever conducted on any psychiatric disorder, an international team of researchers identified more than 100 genes linked to the development of schizophrenia. The findings, published online in the journal Nature, could lead to new approaches to treating the disease, which has seen little improvement in drug development in more than 60 years.
Current schizophrenia drugs treat only one of the symptoms of the disorder (psychosis), and do not help ease the devastating cognitive symptoms. In part, treatment options are limited because the biological mechanisms underlying the illness are not well understood.
Research on schizophrenia has focused on genes because of the disorder’s high heritability. Previous studies have shown the complexity of the disease (it is potentially caused by the combined effects of many genes), and roughly two dozen genomic regions have been linked to the disorder. The new study confirms those earlier findings and sheds even more light on the genetic basis of schizophrenia and its underlying biology.
“By studying the genome, we are getting a better handle on the genetic variations that are making people vulnerable to psychiatric disease,” said National Institute of Mental Health Director Thomas Insel.
“Through the wonders of genomic technology, we are in a period in which, for the first time, we are beginning to understand many of the players at the molecular and cellular level.”
In the genome-wide association study (GWAS), the researchers analyzed more than 80,000 genetic samples from schizophrenia patients and healthy volunteers and found 108 specific locations in the human genome associated with risk for schizophrenia. Eighty-three of those loci (specific locations of genes) had not been linked previously to the disorder.
“In just a few short years, by analyzing tens of thousands of samples, our consortium has moved from identifying only a handful of loci associated with schizophrenia, to finding so many that we can see patterns among them,” said first author Stephan Ripke, a scientist at the Broad’s Stanley Center for Psychiatric Research.
“We can group them into identifiable pathways — which genes are known to work together to perform specific functions in the brain. This is helping us to understand the biology of schizophrenia.”
For the most part, the study points to genes expressed in brain tissue. The researchers also found a smaller number of schizophrenia genes that are active in the immune system — a finding that offers support for a previously hypothesized link between schizophrenia and immunological processes.
The study also found a link between schizophrenia and the region of the genome that holds the gene known as DRD2. This gene produces the dopamine receptor targeted by all approved medications for schizophrenia. This finding suggests that the new gene locations may also become therapeutic targets.
“The fact that we were able to detect genetic risk factors on this massive scale shows that schizophrenia can be tackled by the same approaches that have already transformed our understanding of other diseases,” said the paper’s senior author, Michael O’Donovan, deputy director of the MRC Centre for Neuropsychiatric Genetics and Genomics at Cardiff University School of Medicine.
“The wealth of new findings has the potential to kick-start the development of new treatments for schizophrenia, a process which has stalled for the last 60 years.”
Source: Harvard University
A new study has found that when minority teens from low-income families began attending high-performing public charter high schools, they were much less likely to engage in risky health behaviors, compared to peers who were not admitted into those schools. They also scored significantly higher on state standardized math and English tests.
The new study, led by researchers at the University of California, Los Angeles (UCLA), is the first to examine whether the quality of education influences students’ risky health behaviors.
“These students’ higher cognitive skills may lead them to better health literacy and decision-making. They may be exposed to less negative peer pressure, and the school environment may promote the resilience that steers them away from these risky behaviors,” said lead investigator Dr. Mitchell Wong, a professor of medicine in the division of general internal medicine and health services research.
“In addition, in a better academic environment students spent more time studying, leaving them less time to engage in risky behaviors.”
The researchers divided risky behaviors into two categories: risky and very risky. “Risky behavior” was defined as any use of tobacco, alcohol, or marijuana within the past 30 days.
“Very risky behavior” included the following: binge drinking, alcohol use in school, use of any drug other than marijuana, carrying a weapon to school, gang membership, pregnancy, multiple sex partners, sex under the influence of drugs or alcohol, and sex without the use of contraceptives.
For the study, the researchers compared two groups of high school students from low-income neighborhoods in Los Angeles: the first group included 521 students who were offered admission to high-performing public charter schools through the district lottery, and the second group included 409 students who were not. The researchers compared the students’ health behaviors and standardized test scores.
Students who attended the high-performing schools performed much better on standardized tests. Furthermore, significantly fewer charter school students (36 percent versus 42 percent of those who did not attend charter schools) engaged in “very risky behaviors.”
Students who changed schools or dropped out were more likely to engage in the “very risky behaviors.” There was no significant difference found between the two groups for “risky behaviors,” such as alcohol and marijuana use.
The researchers believe that putting successful public charter high schools in low-income neighborhoods can have beneficial health effects and could help close the growing academic achievement gap between wealthy and low-income students.
The study is published in the journal Pediatrics.