Tri-County Services

In The News

Psychology, psychiatry and mental health news and research findings, every weekday.

Emotional Impact of Being Unfriended

Wed, 04/23/2014 - 8:30am

Emerging research from the University of Colorado Denver explores the relatively new experience of being unfriended on Facebook.

One study clarifies the most common type of “friend” to be unfriended while another looks at the emotional impact of the action.

The studies, published earlier this year, show that the most likely person to be unfriended is a high school acquaintance.

“The most common reason for unfriending someone from high school is that the person posted polarizing comments, often about religion or politics,” said Christopher Sibona, a doctoral student at the University of Colorado Denver Business School. “The other big reason for unfriending was frequent, uninteresting posts.”

Both studies were based on a survey of 1,077 people conducted on Twitter.

The first study found that the top five kinds of people respondents unfriended were:

  • high school friends;
  • other;
  • friend of a friend;
  • work friends;
  • common interest friends.

“We found that people often unfriend co-workers for their actions in the real world rather than anything they post on Facebook,” Sibona said.

One reason he believes high school friends are top targets for unfriending is that their political and religious beliefs may not have been as strong when they were younger.

Accordingly, if those beliefs have grown more strident over time, it becomes easier to offend others.

“Your high school friends may not know your current political or religious beliefs and you may be quite vocal about them,” Sibona said. “And one thing about social media is that online disagreements escalate much more quickly.”

The second study looked at the emotional impact of being unfriended.

Sibona found a range of emotions connected to unfriending, from being bothered to being amused. The most common responses to being unfriended were:

  • “I was surprised;”
  • “It bothered me;”
  • “I was amused;”
  • “I felt sad.”

“The strongest predictor is how close you were at the peak of your friendship when the unfriending happened,” said Sibona, who has studied the real-world consequences of Facebook unfriending since 2010.

“You may be more bothered and saddened if your best friend unfriends you.”

The study found four core factors can be used to predict someone’s emotional response to being unfriended.

Two factors predicted that a user would be negatively affected: if the unfriended person had once been a close friend of the one who unfriended them; and how closely the person monitored their own friends list.

Two other factors predicted that a user would be less negatively affected: if difficulties were discussed between the friends before the unfriending; and if the person unfriended talked about it with others after the unfriending.

Interestingly, the research showed that unfriending happens more often to friends who were once close than to those who are acquaintances.

“Despite the preponderance of weak ties throughout online social networks, these findings help to place unfriending within the greater context of relationship dissolution,” the study said.

Sibona said that the “one size fits all” method of ending digital relationships is unique, but it has real-world consequences that warrant additional research.

“If you have a lot of friends on Facebook, the cost of maintaining those friendships is pretty low,” he said. “So if you make a conscious effort to push a button to get rid of someone, that can hurt.”

Source: University of Colorado, Denver

Risk of Autism Goes Up With Older Parents (Especially Moms)

Wed, 04/23/2014 - 7:00am

Swedish researchers have created a new autism model that predicts older parents are more likely to have a child who develops an autism spectrum disorder (ASD) than are younger parents.

The model shows that autism risk grows steadily with fathers’ increasing age, but accelerates with mothers’ age after 30.

The new study, by researchers at the Drexel University School of Public Health in Philadelphia and the Karolinska Institute in Sweden, provides more insight into how the risk associated with parental age varies between mothers’ and fathers’ ages.

Investigators found that the risk of having a child with both ASD and intellectual disability is larger for older parents.

In the study, published in the International Journal of Epidemiology, researchers report that fathers’ and mothers’ advancing ages have different impacts on their child’s risk.

The rise in ASD risk with parental age was greater for older mothers as compared to older fathers.

“The open question at hand really is, what biological mechanisms underlie these age effects?” said Brian K. Lee, Ph.D., senior author of the study.

“The observed differences in risk based on mothers’ and fathers’ ages point to a need to continue investigating underlying mechanisms of ASD that may be influenced by a mother’s age,” Lee said, “even though much recent discussion has focused on fathers’ and even grandfathers’ ages.”

The risk of having a child with ASD had a more complicated relationship to age in women than in men, whose risk of fathering a child with ASD increased linearly with age across their lifespan.

Among women giving birth before the age of 30, the risk of ASD in the child showed no association with age; it was simply very low. But for babies born to mothers aged 30 and older, the chance of developing ASD rose rapidly with the mother’s age.

Lee noted that the non-linear maternal age effect that is relatively stronger than the paternal age effect on ASD risk has been observed in previous studies, but has not received much attention.

Multiple mechanisms could be in play to account for the different patterns of risk, including environmental risk factors occurring in women after age 30. Factors such as complications in pregnancy could also underlie the effect of mothers’ ages on a child’s ASD risk but not a paternal age effect.

“The linear, steady increase in risk associated with fathers’ ages is consistent with the hypothesis of increased genomic alterations over the father’s lifespan that can increase risk of ASD,” Lee said.

In this study, Lee and colleagues analyzed a large population registry sample of 417,303 children born in Sweden between 1984 and 2003, adjusted for numerous possible factors that could vary with parental age and also influence risk, such as family income and each parent’s psychiatric history.

The study also used a particularly comprehensive case-finding approach to identify more ASD cases than other studies might, based on all pathways to care in a socialized health system.

A goal was to study these parental age effects in more detail by looking at possible differing risks of ASD with and without intellectual disability — one of the most serious comorbid diagnoses with ASD, with a significant impact on functional status in life.

This was the first population-based study with an ASD sample large enough to study ASD risk in populations of children with and without intellectual disability.

“When considering risk factors, we can’t necessarily lump all ASD cases together, even though they fall under a broad umbrella of autism,” Lee said.

“We need to keep an open mind in case intellectual disability might be a marker of a different underlying mechanism.”

The finding that ASD with intellectual disability had a stronger association with older parents, compared to ASD without intellectual disability, supports continued investigation of possible different mechanisms.

Lee noted that, although age effects are important indicators of risk at the population level that could eventually help researchers identify preventable causes of disability, they aren’t very significant for a couple’s family planning because the overall risk remains low.

“The absolute risk of having a child with ASD is still approximately one in 100 in the overall sample, and less than two in 100 even for mothers up to age 45,” he said.

Source: Drexel

Nearly 1 in 3 Canadian Adults Reports Abuse

Wed, 04/23/2014 - 7:00am

A sobering new study finds that almost one-third of adults in Canada experienced abuse in their home as children.

The abuse may have taken the form of physical abuse, sexual abuse, or exposure to intimate partner violence between parents, stepparents, or guardians.

Also, as published in the CMAJ (Canadian Medical Association Journal), investigators found that child abuse is linked to mental disorders and suicidal ideation (thoughts) or suicide attempts.

“From a public health standpoint, these findings highlight the urgent need to make prevention of child abuse a priority in Canada,” writes Dr. Tracie Afifi of the departments of Community Health Sciences and Psychiatry at the University of Manitoba, with coauthors.

Although the link between child abuse and mental health is known, in Canada there is a lack of recent, comprehensive information on the prevalence of child abuse and the link between different types of abuse and mental conditions in adults.

The research is the first nationally representative study of child abuse and mental disorders in Canada.

Researchers looked at data from 23,395 people from across Canada who participated in the 2012 Canadian Community Health Survey: Mental Health.

The participants were 18 years or older and were representative of people living in the 10 provinces. The study excluded residents in the three territories, residents in indigenous communities, full-time members of the Canadian Forces, and people living in institutions.

According to the study, 32 percent of adult Canadians experienced child abuse, with physical abuse the most common (26 percent), followed by sexual abuse (10 percent), and exposure to intimate partner violence (8 percent).

Men were more likely to have been physically abused (31 percent v. 21 percent in women) and had a higher rate of any abuse (34 percent v. 30 percent). Sexual abuse was more common in women (14 percent v. 6 percent in men), as was childhood exposure to intimate partner violence (9 percent v. 7 percent).

People between 35 and 64 years of age were more likely than those aged 18 to 34 years to report having been abused as a child.

“All three types of child abuse were associated with all types of interview-diagnosed mental disorders, self-reported mental conditions, suicide ideation [thoughts of suicide], and suicide attempts in models adjusting for sociodemographic variables,” write the authors.

Drug abuse or dependence, suicidal thoughts and suicide attempts remained associated with all types of child abuse even in the most adjusted models.

The least severe type of physical abuse (being slapped on the face, head or ears or hit or spanked with something hard) showed a strong association with all mental conditions in models adjusting for sociodemographic variables.

Exposure to more than one type of abuse increased the odds of having a mental condition.

Canada’s western provinces had the highest rates of child abuse, with Manitoba first (40 percent), followed by British Columbia and Alberta (36 percent). Newfoundland and Labrador had the lowest rates of abuse at 21 percent.

“All health care providers should be aware of the relation between specific types of child abuse and certain mental conditions.

“Clinicians working in the mental health field should be skilled in assessing patients for exposure to abuse and should understand the implications for treatment,” the authors conclude.

Source: Canadian Medical Association Journal

Neuroticism Marked by Low Motivation to Action

Wed, 04/23/2014 - 6:00am

A new study on neurotic behavior finds that individuals are often powerless to take action on their negativity.

Researchers studied nearly 4,000 college students in 19 countries and uncovered new details about why neurotic people may avoid making decisions and moving forward with life.

Investigators learned that when neurotic individuals are asked whether action is positive, favorable, or good, they simply don’t rate it as highly as non-neurotics do.

Therefore, persuasive communications and other interventions may be useful if they simply alter neurotics’ attitudes toward inaction.

Dolores Albarracín, Ph.D., from the Annenberg School for Communication at the University of Pennsylvania served as principal investigator.

The researchers explain that although “neurotic” is a common descriptor, the personality trait “neuroticism” is a complex condition defined by the experience of chronic negative affect — including sadness, anxiety, irritability and self-consciousness. Moreover, it is easily triggered but difficult to control.

Neurotic people tend to avoid acting when confronted with major and minor life stressors, leading to negative life consequences.

The researchers sought to determine whether and under what conditions neuroticism is associated with favorable or unfavorable representations of action and inaction.

They investigated whether depression and anxiety would decrease proactive behavior among neurotic individuals, and whether a person’s collectivistic tendencies — considering the social consequences of one’s behavior before acting — would moderate the negative associations between neuroticism and action/inaction.

The study found neurotics look at action less favorably and inaction more favorably than emotionally stable people do.

“People who are less emotionally stable have less positive attitudes towards action and more positive attitudes toward inaction,” the authors wrote.

“Furthermore, anxiety was primarily responsible for neurotic individuals’ less positive attitudes toward action.

“The link between neuroticism and less positive attitudes toward action was strongest among individuals who endorsed more collectivistic than individualistic beliefs.”

People who are interested in reducing the harmful consequences of neuroticism in their own lives should think about how their attitudes toward action might be affecting their behavior, the authors noted.

“By learning to value action, they may be able to change many of the negative behaviors associated with neuroticism and anxiety — such as freezing when they should act, or withdrawing from stress instead of dealing proactively with it,” they said, suggesting that attitudes about action and inaction goals have broad consequences for behavior across diverse contexts and cultures.

“These findings lay the groundwork for finding new methods of studying and ultimately preventing the negative consequence of neurotic action avoidance. Specifically, increasing exposure to action may be sufficient to combat tendencies to avoid proactive behavior.”

The study is published in the Journal of Personality.

Source: University of Pennsylvania

Specific Sleep Disorder Associated with Brain Diseases

Wed, 04/23/2014 - 5:30am

A new study suggests a sleep disorder that causes people to act out their dreams is the best current predictor of brain diseases like Parkinson’s and Alzheimer’s.

Researchers at the University of Toronto say that rapid-eye-movement sleep behavior disorder (RBD) is not just a precursor but also a critical warning sign of neurodegeneration that can lead to brain disease.

“In fact, as many as 80 to 90 percent of people with RBD will develop a brain disease,” said associate professor and lead author John Peever, Ph.D.

As its name suggests, the disorder occurs during the rapid-eye-movement (REM) stage of sleep and causes people to act out their dreams, often resulting in injury to themselves and/or their bed partners.

In healthy brains, muscles are temporarily paralyzed during REM sleep to prevent this from happening.

“It’s important for clinicians to recognize RBD as a potential indication of brain disease in order to diagnose patients at an earlier stage,” said Peever.

“This is important because there are drugs that can reduce the severity of degenerative disorders.”

His research examines the idea that neurodegeneration might first affect areas of the brain that control sleep before attacking the brain areas involved in more common brain diseases like Alzheimer’s.

Peever said he hopes the results of his study lead to earlier and more effective treatment of neurodegenerative diseases.

The research findings have been published in the journal Trends in Neurosciences.

Source: University of Toronto

Improve Dementia Management – Without More Drugs

Tue, 04/22/2014 - 8:30am

A panel of specialists in senior mental health has developed a new approach for handling agitation, aggression, and other unwanted behaviors by people with dementia.

Researchers believe the strategy may help reduce the use of antipsychotics and other psychiatric drugs in this population, and make life easier for them and their caregivers.

Experts believe the new guidelines will improve teamwork among those who care for dementia patients at home, in residential facilities, and in hospitals and clinics.

The approach, termed DICE, has been incorporated by the federal agency that runs Medicare as an official part of its toolkit for reducing the use of antipsychotic drugs and other mental health medications in people with dementia.

Though these drugs may still help some patients, the new paper in the Journal of the American Geriatrics Society suggests that many non-medication approaches could also help reduce unwanted behaviors — known as neuropsychiatric symptoms of dementia.

However, researchers and experts warn that it will take teamwork and communication to do it.

Most people with Alzheimer’s disease and other memory-affecting conditions also become aggressive, agitated, depressed, or anxious from time to time, said senior author Helen C. Kales, M.D. Or they might have delusions or hallucinations, or lose their inhibitions.

“Often more than memory loss, behavioral symptoms of dementia are among the most difficult aspects of caring for people with dementia. These symptoms are experienced almost universally, across dementia stages and causes,” she said.

“Sadly, these symptoms are often associated with poor outcomes including early nursing home placement, hospital stays, caregiver stress and depression, and reduced caregiver employment.”

Doctors often prescribe these patients medications typically used for mental health disorders, despite little hard evidence that they work well and despite the risks they can pose — including hastening death.

Meanwhile, studies have shown promise from non-medication approaches to changing dementia patients’ behavior and reducing triggers for behavioral issues in their environment and daily life. But too few health teams are trained in their use.

The DICE Approach

Kales and colleagues Laura N. Gitlin, Ph.D., and Constantine G. Lyketsos, M.D., from Johns Hopkins University authored the new paper on behalf of a group of experts, called the Detroit Expert Panel on the Assessment and Management of the Neuropsychiatric Symptoms of Dementia, who developed the DICE approach.

Sponsored by Kales’ program, the national multidisciplinary panel of experts met in Michigan to create a comprehensive approach to behavioral management.

Dubbed “DICE” for Describe, Investigate, Create, and Evaluate, it details key patient, caregiver, and environmental considerations at each step of the approach and describes the “go-to” behavioral and environmental interventions that should be considered.

Briefly described, the components are:

• D: Describe – Asking the caregiver, and the patient if possible, to describe the “who, what, when and where” of situations where problem behaviors occur and the physical and social context for them. Caregivers could take notes about the situations that led to behavior issues, to share with health professionals during visits;
• I: Investigate – Having the health provider look into all the aspects of the patient’s health, dementia symptoms, current medications, and sleep habits, that might be combining with physical, social, and caregiver-related factors to produce the behavior;
• C: Create – Working together, the patient’s caregiver and health providers develop a plan to prevent and respond to behavioral issues in the patient, including everything from changing the patient’s activities and environment, to educating and supporting the caregiver;
• E: Evaluate – Giving the provider responsibility for assessing how well the plan is being followed and how it’s working, or what might need to be changed.

The researchers believe doctors should prescribe psychotropic drugs only after they and the patient and caregiver have made significant efforts to change dementia patients’ behavior through environmental modifications and other interventions.

Exceptions to this policy are situations related to severe depression, psychosis, or aggression that present risk to the patient or others.

The authors believe all health providers as well as spouses, adult children, and others who care for dementia patients should familiarize themselves with the DICE approach.

“Innovative approaches are needed to support and train the front-line providers for the burgeoning older population with behavioral symptoms of dementia,” said Kales.

“We believe that the DICE approach offers clinicians an evidence-informed structured clinical reasoning process that can be integrated into diverse practice settings.”

Gitlin, who directs the Center for Innovative Care in Aging at the Johns Hopkins School of Nursing, added, “The DICE approach is inherently patient- and caregiver-centered because the concerns of individuals with dementia and their caregivers are integral to each step of the process.

“DICE also enables clinicians to consider the roles of nonpharmacologic, medical and pharmacologic treatments concurrently.”

Lyketsos stresses that the approach “has tremendous utility in clinical trials of treatments for behavioral symptoms, particularly in testing new medications.

“DICE can be used to better subtype behaviors, or focus on particular behaviors at randomization coupled with systematic treatment approaches.”

Source: University of Michigan

 

Brain Risk from Fluctuating Blood Pressure

Tue, 04/22/2014 - 7:45am

Older people whose blood pressure fluctuates more than average are at a higher risk of impaired cognitive function, recent research suggests.

Dr. Simon Mooijaart of Leiden University Medical Centre, the Netherlands, and colleagues explain that elements of the vascular system (arteries, veins and capillaries) may contribute to the development and progression of dementia.

They point out that some cardiovascular risk factors may be reversible, potentially cutting the risk of cognitive decline and dementia.

The team investigated the link between variability in blood pressure and cognitive function in 5,461 people aged 70 to 82 years living in Ireland, Scotland and the Netherlands. All were taking part in a study on the impact of statin drugs on blood vessel health, as they were at risk of cardiovascular disease. Blood pressure was measured every three months.

The well-known mini mental state examination was used to test cognitive function at the start of the study to exclude those with poor cognitive function.

After a follow-up of roughly three years, participants were tested on four areas of cognitive function: attention, processing speed, immediate memory, and delayed memory.

“Participants with higher visit-to-visit variability in systolic blood pressure had worse performance on all cognitive tests, independent of average blood pressure,” the team reported in the British Medical Journal.

A subgroup of 553 participants underwent magnetic resonance imaging (MRI) of the brain. This showed that greater variability in blood pressure was linked to a smaller hippocampus, which is vital for memory, as well as a higher rate of cortical infarcts, a type of stroke causing difficulties reading, writing or speaking, and visual field defects.

Greater variability in blood pressure was also linked to cerebral microbleeds (small hemorrhages).

The team concluded: “Higher visit-to-visit variability in blood pressure, independent of average blood pressure and cardiovascular disease, was associated with impaired cognitive function in old age.”

It is already established that variability in blood pressure is linked to cerebrovascular damage; that is, damage affecting the blood vessels to the brain. Variability raises the risk of stroke, and possibly microvascular damage, or changes in the small blood vessels of the cerebral cortex due to damage to artery walls.

Mooijaart and colleagues say that disruption of the blood-brain barrier due to microvascular damage “results in neuronal injury and accelerates neuronal loss and brain atrophy.” In this way, higher variability in blood pressure might lead to cognitive impairment via changes in the brain structures and development of cerebral small vessel disease.

“Our findings may suggest that decreased hippocampal volume, cerebral microbleeds, and cortical infarcts are potential pathogenic mechanisms behind the association between variability in blood pressure and cognitive impairment,” they write.

In terms of preventative medication, they add, “Calcium channel blockers, the most effective drug class to reduce variability in blood pressure, show significant efficacy in lowering the risk of vascular cognitive impairment.”

They call for further work to determine whether reducing variability in blood pressure really can decrease the risk of cognitive impairment in old age.

These findings are consistent with an earlier study on 201 elderly people with a mean age of 80 years. All were at high risk of cardiovascular disease.

The researchers showed that high visit-to-visit variability in blood pressure over the course of 12 months was linked to worse performance in the mini mental state examination and global deterioration scale, widely used to measure the progression of dementia.

A separate study has looked specifically at cerebral microbleeds, which are common among the elderly population. Their impact on cognitive function is unclear, so a team from Erasmus MC University Medical Center in the Netherlands investigated further.

They tested performance over “multiple cognitive domains” in 3,979 people aged around 60 years, without dementia.

A higher number of microbleeds was linked to worse performance on the mini mental state examination, particularly in tests of information processing speed and motor speed. Having five or more microbleeds was linked with worse performance in all cognitive domains except memory.

All of these findings emphasize the connection between healthy blood vessels and better cognitive functioning. “Once the correct treatment is in place for variation in blood pressure, this will eventually contribute to a further decline in the risk of dementia,” believes Dr. Mooijaart.

References

Sabayan, B. et al. Association of visit-to-visit variability in blood pressure with cognitive function in old age: prospective cohort study. BMJ, 31 July 2013, doi: 10.1136/bmj.f4600

Nagai, M. et al. Visit-to-visit blood pressure variations: new independent determinants for cognitive function in the elderly at high risk of cardiovascular disease. Journal of Hypertension, August 2012, Volume 30, pp. 1556-63, doi: 10.1097/HJH.0b013e3283552735

Poels, M. M. et al. Cerebral microbleeds are associated with worse cognitive function: the Rotterdam Scan Study. Neurology, 31 January 2012, Volume 78, Issue 5, pp. 326-33, doi: 10.1212/WNL.0b013e3182452928

Selective Memories of Atrocities Linked to Social Group

Tue, 04/22/2014 - 7:00am

The memory of wartime atrocities committed by American soldiers while on duty in Iraq and Afghanistan is often incomplete.

Researchers now believe the omission of details can lead people to have different memories for the event depending on social group membership.

“We started thinking about this project around the time when stories began to emerge in the popular media about atrocities committed by American soldiers in Iraq and Afghanistan,” said lead researcher Alin Coman, Ph.D., psychological scientist at Princeton University.

“We wanted to scientifically investigate the effect of hearing about these incidents at the level of the American public,” Coman said.

“How will people remember these atrocities? Will they tend to suppress the memory to preserve the positive view of their in-group? Will they conjure potential pieces of information to justify the atrocities?”

The research is found in Psychological Science, a journal of the Association for Psychological Science.

As people discuss events, such as abuses at Abu Ghraib and Guantanamo, the stories are often reworked over time. Coman and colleagues wondered whether this reworking might alter people’s memories for the events.

Blending work on moral psychology and cognitive psychology, the researchers hypothesized that listeners would more easily forget unrepeated justifications for atrocities that are supposedly perpetrated by someone from an outside group.

However, listeners would be motivated to remember the unrepeated justifications when the perpetrators are members of their own group — the memory process possibly serving as a way of shielding in-group members from moral responsibility.

To test their hypothesis, the researchers asked 72 American participants to read stories about perpetrators of war atrocities who were either American soldiers (in-group) or Afghan soldiers (out-group).

The stories were drawn from or constructed to resemble actual media reports, and the atrocities in the stories were accompanied by a justifying action. For example, the perpetrator submerged an insurgent’s head in cold water because he had withheld information about an upcoming attack.

Participants studied the stories and, after a 10-minute distractor task, they watched a video of another person recounting the atrocities — but not repeating the justifications — from two of the four stories that had been presented.

After another distractor task, participants were asked to recall as much as they could about each of the four stories they had studied.

The results showed that participants were more likely to forget justifications for atrocities committed by Afghan soldiers when those atrocities had been recounted in the videos than when they hadn’t been recounted.

The results indicate that hearing the stories repeated without the original justifications led participants to forget those justifications, just as the researchers expected.

But participants showed no memory impairment for unrepeated justifications when the perpetrator was American.

That is, in-group membership made participants more likely to remember the reasons why the soldier committed the act, even though they had not been reminded of those reasons in the video.

“What we learn from this research is that moral disengagement strategies are fundamentally altering our memories,” said Coman.

“More specifically, these strategies affect the degree to which our memories are influenced by the conversations we have with one another.”

These findings are important, the researchers argue, because the ways in which people recall justifications could “influence attitudes and beliefs, the willingness to pay reparations and the level of aggression toward out-groups.”

Source: Association for Psychological Science

Crime Rarely Associated with Mental Illness

Tue, 04/22/2014 - 6:15am

Despite high-profile crimes committed by mentally ill suspects, such as the Sandy Hook school shooting in Newtown, Connecticut, new research finds that less than 10 percent of crimes are directly related to symptoms of mental illness.

“When we hear about crimes committed by people with mental illness, they tend to be big headline-making crimes so they get stuck in people’s heads,” said Jillian Peterson, Ph.D., lead researcher of the study published online in the journal Law and Human Behavior.

“The vast majority of people with mental illness are not violent, not criminal and not dangerous.”

In the study, investigators reviewed 429 crimes committed by 143 offenders with three major types of mental illness. They found that three percent of their crimes were directly related to symptoms of major depression, four percent to symptoms of schizophrenia disorders, and 10 percent to symptoms of bipolar disorder.

The study was conducted with former defendants of a mental health court in Minneapolis. The participants completed a two-hour interview about their criminal history and mental health symptoms, covering an average of 15 years.

“The study may be the first to analyze the connection between crime and mental illness symptoms for offenders over an extended period of their lives,” said Peterson.

The study didn’t find any predictable patterns linking criminal conduct and mental illness symptoms over time.

Two-thirds of the offenders who had committed crimes directly related to their mental illness symptoms also had committed unrelated crimes for other reasons, such as poverty, unemployment, homelessness, and substance abuse, according to the research.

“Is there a small group of people with mental illness committing crimes again and again because of their symptoms? We didn’t find that in this study,” Peterson said.

In the United States, more than 1.2 million people with mental illness are incarcerated in jails or prisons, according to the federal Bureau of Justice Statistics.

People with mental illnesses also are on probation or parole at two to four times the rate for the general population.

In addition to interviews with offenders, the researchers reviewed criminal history and social worker files to help rate crimes based on their association with symptoms of schizophrenia disorders (hallucinations and delusions), bipolar disorder (impulsivity and risk-taking behavior), or major depression (hopelessness and suicidal thoughts).

The ratings were: no relationship between mental illness symptoms and the crime, mostly unrelated, mostly related, or directly related.

A crime could be rated as mostly unrelated or mostly related to mental illness symptoms if those symptoms contributed to the cause of the crime but weren’t solely responsible for it.

For example, an offender with schizophrenia got into a bar fight after being agitated by voices he had heard earlier in the day; because he wasn’t hearing voices at the time of the altercation, the crime was categorized as mostly related.

When the directly related and mostly related categories were combined, the percentage of crimes attributed to mental illness symptoms increased from 7.5 percent to 18 percent, or less than one in five of the crimes analyzed in the study.
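The category arithmetic above is just a tally over per-crime ratings. A minimal sketch of that tally in Python, using invented ratings rather than the study's data (the function name is ours):

```python
from collections import Counter

def symptom_related_shares(ratings):
    """Given one rating per crime, return the share rated 'directly related'
    and the share rated 'directly related' or 'mostly related' combined."""
    counts = Counter(ratings)
    total = len(ratings)
    direct = counts["directly related"] / total
    combined = (counts["directly related"] + counts["mostly related"]) / total
    return direct, combined

# Hypothetical ratings for 8 crimes (illustrative only, not the study's data)
ratings = (["no relationship"] * 4 + ["mostly unrelated"] * 2
           + ["mostly related"] + ["directly related"])
direct, combined = symptom_related_shares(ratings)
print(round(direct, 3), round(combined, 3))  # → 0.125 0.25
```

As in the study, combining the two related categories can more than double the share attributed to symptoms.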

Of crimes committed by participants with bipolar disorder, 62 percent were directly or mostly related to symptoms, compared with 23 percent for schizophrenia and 15 percent for depression.

“In some cases participants may have described their mood as ‘manic’ during a crime even though they could have just been angry or abusing drugs or alcohol, so the percentage of crimes attributed to bipolar disorder may be inflated,” Peterson said.

Almost two-thirds of the study participants were male, with an average age of 40. They were evenly divided between white and black offenders (42 percent each, 16 percent other races), and 85 percent had substance abuse disorders.

The study did not include offenders with serious violent offenses because the mental health court did not adjudicate those crimes, but the participants did describe other violent crimes they had committed.

The study also did not examine how substance abuse interacted with mental illness to influence criminal behavior.

The researchers said programs designed to reduce recidivism for mentally ill offenders should be expanded beyond mental health treatment to include cognitive-behavioral treatment about criminal thinking, anger management, and other behavioral issues.

“Programs to address basic needs also are essential to reduce recidivism for all offenders after incarceration, including drug treatment and housing and employment support,” Peterson said.

Source: American Psychological Association

 

Teachers’ Tough Talk on Exams May Do More Harm Than Good

Tue, 04/22/2014 - 5:30am

As the end of the school year approaches, some teachers try to motivate students by reminding them of the negative consequences of failing an exam.

New research finds that this can be the wrong approach: students may fixate on failure and actually become less motivated.

“Teachers are desperately keen to motivate their students in the best possible way but may not be aware of how messages they communicate to students around the importance of performing well in exams can be interpreted in different ways,” said lead author David Putwain, Ph.D., of Edge Hill University in Lancashire, England.

The study, published in the journal School Psychology Quarterly, involved 347 students, average age 15, of whom 174 were male.

The students attended two schools that offer an 18-month study program for the exam leading to a General Certificate of Secondary Education, the equivalent of a high school diploma in the U.S.

Students who said they felt threatened by teachers’ messages that frequently focused on failure reported feeling less motivated and scored worse on the exam than students whose teachers used fewer, less threatening fear tactics, the study found.

A message such as, “If you fail the exam, you will never be able to get a good job or go to college. You need to work hard in order to avoid failure,” was an example of attempting to motivate by fear.

Messages focusing on success might include, “The exam is really important as most jobs that pay well require that you pass and if you want to go to college you will also need to pass the exam,” according to the study.

“Both messages highlight to students the importance of effort and provide a reason for striving,” said Putwain.

“Where these messages differ is some focus on the possibility of success while others stress the need to avoid failure.”

Twice over the 18 months, students answered questions from a teacher at the school, who read from a provided script while other information was collected for registration and administration.

The teachers asking questions were not the students’ exam-preparatory instructors.

The first set of questions asked how frequently their teachers attempted to motivate them with fear of failure, such as, “How often do your teachers tell you that unless you work hard you will fail your exam?”

Students’ level of feeling threatened was measured with questions such as, “Do you feel worried when your teachers tell you that your exam is getting nearer?” The teachers asked students to rate each item on a scale of one to five, with one being “never” and five being “most of the time.”

Three months later, students completed a questionnaire with the base question, “What is the reason for doing your schoolwork?”

The students had several answer options representing different types of motivation, including motivation arising from within and motivation imposed from an external source. At the end of the 18-month program, researchers collected the students’ final grades.

“Psychologists who work in or with schools can help teachers consider the types of messages they use in the classroom by emphasizing how their messages influence students in both positive and negative ways and by recommending they consider the messages they currently use and their possible consequences,” Putwain said.

“Teachers should plan what types of messages would be the most effective and how they could be incorporated into the lesson plans.”

Source: American Psychological Association

 

Brain Appears Hardwired for Some Aspects of Language

Mon, 04/21/2014 - 7:00am

A new study discovers that human brains share common linguistic restrictions on the sound pattern of language.

The understanding that language is hardwired helps to explain why language is so constrained. For example, people blog, they don’t lbog, and they schmooze, not mshooze.

The groundbreaking study is published in PLOS ONE by psychologist Dr. Iris Berent of Northeastern University and researchers at Harvard Medical School.

Investigators discovered that the brains of individual speakers are sensitive to language universals: syllables that are frequent across languages are recognized more readily than infrequent ones.

Experts say that language universals have been the subject of intense research, but their basis had remained elusive. Indeed, the similarities between human languages could result from a host of reasons that are tangential to the language system itself.

Syllables like lbog, for instance, might be rare due to sheer historical forces, or because they are just harder to hear and articulate.

A more interesting possibility, however, is that these facts could stem from the biology of the language system.

Could the unpopularity of lbogs result from universal linguistic principles that are active in every human brain? To address this question, Berent and her colleagues examined the response of human brains to distinct syllable types — either ones that are frequent across languages (e.g., blif, bnif), or infrequent (e.g., bdif, lbif).

In the experiment, participants heard one auditory stimulus at a time (e.g., lbif) and were asked to determine whether the stimulus included one syllable or two while their brains were simultaneously imaged.

Results showed the syllables that were infrequent and ill-formed, as determined by their linguistic structure, were harder for people to process.

Remarkably, a similar pattern emerged in participants’ brain responses: worse-formed syllables (e.g., lbif) exerted different demands on the brain than syllables that are well-formed (e.g., blif).

The localization of these patterns in the brain further sheds light on their origin.

If the difficulty in processing syllables like lbif were due solely to unfamiliarity, failures of acoustic processing, or articulation, then such syllables would be expected to exact a cost only on regions of the brain associated with memory for familiar words, audition, and motor control.

In contrast, if the dislike of lbif reflects its linguistic structure, then the syllable hierarchy is expected to engage traditional language areas in the brain.

While syllables like lbif did, in fact, tax auditory brain areas, they exerted no measurable costs with respect to either articulation or lexical processing.

Instead, it was Broca’s area — a primary language center of the brain — that was sensitive to the syllable hierarchy.

These results show for the first time that the brains of individual speakers are sensitive to language universals: the brain responds differently to syllables that are frequent across languages (e.g., bnif) relative to syllables that are infrequent (e.g., lbif).

Researchers say that this was a remarkable finding given that participants (English speakers) had never encountered most of those syllables before.

Therefore, the result shows that language universals are encoded in human brains.

The fact that the brain activity engaged Broca’s area — a traditional language area — suggests that this brain response might be due to a linguistic principle.

This result opens up the possibility that human brains share common linguistic restrictions on the sound pattern of language.

This proposal is further supported by a second study that recently appeared in the Proceedings of the National Academy of Sciences, also co-authored by Berent.

This study shows that, like their adult counterparts, newborns are sensitive to the universal syllable hierarchy.

The findings from newborns are particularly striking because they have little to no experience with any such syllable.

Together, these results demonstrate that the sound patterns of human language reflect shared linguistic constraints that are hardwired in the human brain already at birth.

Source: Northeastern University

 

Depression May Lessen Effectiveness of Parkinson’s Drugs

Mon, 04/21/2014 - 6:15am

In an unexpected finding, new research suggests depression can hamper cognitive function among Parkinson’s patients receiving traditional dopamine replacement therapy.

Scientists from the University of Kentucky College of Medicine found that the dopamine replacement therapy commonly used to treat motor symptoms of Parkinson’s disease (PD) was associated with a decline in cognitive performance among depressed Parkinson patients.

In the study, published in the journal Psychiatry Research, investigators found that, in contrast, non-depressed Parkinson’s patients’ cognitive function improved on dopamine replacement therapy.

The study also found that mood in depressed Parkinson’s patients was actually worse while on dopaminergic medications.

“This was a surprise,” said Lee Blonder, Ph.D., the study’s principal investigator.

“It is the opposite of our original hypothesis that both groups of PD patients would improve in cognitive performance on dopaminergic medications, and that mood in the depressed PD group would also improve.”

A cohort of 28 patients with PD — 18 nondepressed and 10 depressed — were given a baseline series of tests to assess cognitive function and the incidence and severity of depression. They were then re-tested with and without their dopamine replacement therapy.

Results revealed a statistically significant interaction between depression and medication status on three measures of verbal memory and a facial affect naming task.

In all cases, depressed Parkinson’s patients performed significantly more poorly while on dopaminergic medication than while off. The opposite pattern emerged for the non-depressed Parkinson’s group.

Depression is a common and serious comorbidity in patients with Parkinson’s; studies suggest that approximately 40 percent of PD patients suffer from depression.

Blonder cautions that these results are to some extent preliminary due to the small cohort of 28 participants.

“Additional studies are required before these results should be used to alter treatment plans,” Blonder says.

But, “future research should ultimately focus on investigating treatment options for patients with Parkinson’s and depression to maximize patient function without compromising their mental health.”

Source: University of Kentucky

Online Searches for ‘Health’ Information Occur Early in the Week

Mon, 04/21/2014 - 5:30am

An interesting new study discovers that Americans are more apt to search for “health” content at the beginning of the week.

The pattern could be used to improve health promotion strategies.

As published in the American Journal of Preventive Medicine, researchers analyzed weekly patterns in health-related Google searches.

Investigators from San Diego State University (SDSU), the Santa Fe Institute, Johns Hopkins University, and the Monday Campaigns, analyzed “healthy” Google searches (searches that included the term healthy and were indeed health-related, e.g., “healthy diet”) originating in the U.S. from 2005 to 2012.

The Monday Campaigns is a nonprofit organization in association with Johns Hopkins Bloomberg School of Public Health, Columbia University Mailman School of Public Health, and the Maxwell School of Syracuse University.

The organization dedicates the first day of every week to health, aiming to create a movement of individuals and organizations that join together every Monday to commit to healthy behaviors that can help end preventable chronic diseases.

In the new study, researchers found that on average, searches for health topics were 30 percent more frequent at the beginning of the week than on days later in the week, with the lowest average number of searches on Saturday.

This pattern was consistent year after year, week after week, using a daily measure to represent the proportion of healthy searches to the total number of searches each day.

“Many illnesses have a weekly clock with spikes early in the week,” said SDSU’s John W. Ayers, Ph.D., lead author of the study.

“This research indicates that a similar rhythm exists for positive health behaviors, motivating a new research agenda to understand why this pattern exists and how such a pattern can be utilized to improve the public’s health.”

Joanna Cohen, Ph.D., a co-author of the study and professor at the Johns Hopkins Bloomberg School of Public Health, added, “We could be seeing this effect because of the perception that Monday is a fresh start, akin to a mini New Year’s Day.

“People tend to indulge in less healthy behaviors on the weekend, so Monday can serve as a ‘health reset’ to get back on track with their health regimens.”

“It’s interesting to see such a consistent and similar rhythm emerging from search data,” added Benjamin Althouse, study co-author and Omidyar Fellow at the Santa Fe Institute.

“These consistent rhythms in healthy searches likely reflect something about our collective mindset, and understanding these rhythms could lead to insights about the nature of health behavior change.”

Results showed that search volumes on Monday and Tuesday were three percent greater than on Wednesday, 15 percent greater than on Thursday, 49 percent greater than on Friday, 80 percent greater than on Saturday, and 29 percent greater than on Sunday.
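Those day-to-day comparisons are simple ratios of each day's healthy-search proportion to a reference day's. A sketch of the arithmetic, with daily figures invented only to mirror the reported ordering (not the study's numbers):

```python
def percent_greater(day, reference):
    """Percent by which one day's healthy-search proportion exceeds another's."""
    return (day / reference - 1) * 100

# Invented daily healthy-search proportions (illustrative only)
share = {"Mon": 1.03, "Wed": 1.00, "Fri": 0.69, "Sat": 0.57}

print(round(percent_greater(share["Mon"], share["Wed"])))  # → 3
print(round(percent_greater(share["Mon"], share["Fri"])))  # → 49
print(round(percent_greater(share["Mon"], share["Sat"])))
```

Note that "80 percent greater than Saturday" means Monday's share is nearly double Saturday's, not that it differs by 80 percentage points.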

The team also examined whether media exposure could be driving this weekly pattern.

Co-author Mark Dredze from Johns Hopkins said, “We tested this hypothesis by monitoring the daily frequency of news stories encouraging healthy lifestyles, but those stories actually peaked on Wednesdays and were statistically independent of healthy searches.”

According to the published paper, “Understanding circaseptan rhythms around health behaviors can yield critical public health gains.

“For instance, government-funded health promotion programs spend $76.2 billion annually and their cost-effectiveness can be improved by targeting the population on weekday(s) when more individuals are contemplating health habits.”

Morgan Johnson, MPH, from The Monday Campaigns and study co-author, noted that leveraging Monday is a simple, cost-effective way to nudge people towards healthier behavior.

“The challenge we face in public health is to help people sustain healthy behaviors over time. Since Monday comes around every seven days when people are ‘open to buy’ health, it can be used as a cue to help create healthy habits for life.”

Source: San Diego State University

Babies More Likely to Tolerate Unfairness When It Benefits a Group Member

Sun, 04/20/2014 - 8:15am

In a new study, developed to test how race and fairness influence babies’ selection of a playmate, researchers found that 15-month-old babies mostly value fairness, unless an adult distributes toys in a way that benefits someone of their own race.

“Babies are sensitive to how people of the same ethnicity as the infant, versus a different ethnicity, are treated — they weren’t just interested in who was being fair or unfair,” said Monica Burns, co-author of the study and a former University of Washington psychology undergraduate student.

“It’s interesting how infants integrate information before choosing who to interact with, they’re not just choosing based on a single dimension,” she said.

For the study, published in the online journal Frontiers in Psychology, 40 Caucasian 15-month-old babies sat on their parents’ laps and watched two Caucasian experimenters divide toys between recipients. One experimenter divided the toys equally, and the other divided them unequally.

Later, when the babies were given the chance to choose one of the experimenters as a playmate, 70 percent of the time infants chose the experimenter who distributed the toys fairly. This suggests that when individuals are the same race as the infant, babies choose fair over unfair individuals as playmates.

Next, the researchers investigated a more complex question: What would happen when an individual (of the same race as the infant) could benefit from unfairness?

In the next experiment, 80 Caucasian 15-month-old infants witnessed a fair experimenter and an unfair experimenter distribute toys to a white and an Asian recipient. Half of the babies saw the unfair experimenter give more to the Asian recipient; and the other half of the babies saw the experimenter give more to the white recipient.

When it was time to choose a playmate, babies seemed more tolerant of unfairness when the white recipient benefited from it. They picked the fair experimenter less often when the unfair experimenter gave more toys to the white recipient instead of the Asian recipient.

“It’s surprising to see these pro-social traits of valuing fairness so early on, but at the same time, we’re also seeing that babies have self-motivated concerns too,” said Jessica Sommerville, Ph.D., University of Washington associate professor of psychology.

Sommerville is quick to point out that her findings do not mean that babies are racist. “Racism connotes hostility,” she said, “and that’s not what we studied.”

“If all babies care about is fairness, then they would always pick the fair distributor, but we’re also seeing that they’re interested in consequences for their own group members,” Sommerville said.

The results suggest that infants can take into account both race and social history (how a person treats someone else) when choosing the best playmate.

What the study does show is that babies use basic distinctions, including race, to start to “cleave the world apart by groups of what they are and aren’t a part of,” Sommerville said.

Source: University of Washington

Scientists Pinpoint Brain’s Anti-Distraction Mechanism

Sun, 04/20/2014 - 7:30am

Researchers from Simon Fraser University have discovered that environmental and/or genetic factors may hinder or suppress a particular brain activity that helps keep us from being distracted. The discovery could revolutionize how doctors understand and treat attention-deficit disorders.

The study, published in the Journal of Neuroscience, is the first to reveal that our brains rely on an active suppression mechanism to avoid being distracted by irrelevant information when we want to focus on a particular item or task.

John McDonald, Ph.D., an associate professor of psychology and Canada Research Chair in Cognitive Neuroscience, and other scientists first discovered the existence of the mechanism in his lab in 2009. But, until now, it was unknown how it helps us ignore visual distractions.

The study involved three experiments in which 47 students (mean age of 21) performed an attention-demanding visual search task. The researchers studied their neural processes related to attention, distraction, and suppression by recording electrical brain signals from sensors embedded in a cap.

“This is an important discovery for neuroscientists and psychologists because most contemporary ideas of attention highlight brain processes that are involved in picking out relevant objects from the visual field. It’s like finding Waldo in a Where’s Waldo illustration,” said John Gaspar, the study’s lead author.

“Our results show clearly that this is only one part of the equation and that active suppression of the irrelevant objects is another important part.”

Because of the increase in distracting consumer devices in our technology-driven, fast-paced society, the psychologists say their discovery could help scientists and clinicians better treat patients with distraction-related attention deficits.

“Distraction is a leading cause of injury and death in driving and other high-stakes environments,” notes senior author McDonald. “There are individual differences in the ability to deal with distraction. New electronic products are designed to grab attention. Suppressing such signals takes effort, and sometimes people can’t seem to do it.”

“Moreover, disorders associated with attention deficits, such as attention deficit hyperactivity disorder and schizophrenia, may turn out to be due to difficulties in suppressing irrelevant objects rather than difficulty selecting relevant ones.”

The researchers are now studying how we deal with distraction. They’re looking at when and why we can’t suppress potentially distracting objects, and why some of us are better at this than others.

“There’s evidence that attentional abilities decline with age and that women are better than men at certain visual attentional tasks,” Gaspar said.

Source: Simon Fraser University

Religious Music Benefits Seniors’ Mental Health

Sun, 04/20/2014 - 6:45am

Older Christians who listen to religious music are less anxious about death and report increases in life satisfaction, self-esteem, and a sense of control over their lives, according to a new study.

“Religion is an important socioemotional resource that has been linked with desirable mental health outcomes among older U.S. adults,” the researchers stated in the study, which was published in The Gerontologist. “This study shows that listening to religious music may promote psychological well-being in later life.”

The study, “Listening to Religious Music and Mental Health in Later Life,” was conducted by Matt Bradshaw, Ph.D., of Baylor University; Christopher G. Ellison, Ph.D., of the University of Texas-San Antonio; Qijan Fang, MA, of Bowling Green State University; and Collin Mueller, MA, of Duke University.

The researchers analyzed data collected in 2001 and 2004 as part of the nationwide Religion, Aging, and Health Survey.

The data included 1,024 people at least 65 years old who were either black or white and English speaking. Responses were collected from practicing Christians, those who identified as Christians in the past but no longer practice any religion, and those not affiliated with any faith at any point in their lifetime.

The respondents were asked how often they listened to both religious music and gospel music on a scale ranging from “never” to “several times a day.”

Anxiety about death, life satisfaction, self-esteem, and sense of control were measured by how strongly a respondent agreed with a series of statements, such as “I find it hard to face up to the fact that I will die,” “These are the best years of my life,” “I take a positive attitude toward myself,” and “I have a lot of influence over most things that happen in my life.”

“Given that religious music is available to most individuals — even those with health problems or physical limitations that might preclude participating in more formal aspects of religious life — it might be a valuable resource for promoting mental health later in the life course,” the researchers wrote in the study.

Source: The Gerontological Society of America

 

Inhibited Babies More Likely to Become Anxious Adults

Sun, 04/20/2014 - 6:00am

Are nervous and inhibited babies more likely to become anxious adults? New research says yes. By following babies into their teens and beyond, researchers have been able to confirm the link between behavioral inhibition in young children and anxiety later in life.

“The inhibited child will sit and watch, but she doesn’t play alone or with others. The idea of being included appears to terrify her,” said developmental psychologist Koraly Pérez-Edgar, Ph.D., associate professor of psychology at Penn State.

Her research over the years has shown that this kind of extreme shyness is often a predictor of anxiety later in life. She notes that shy children’s behavior will evolve as they grow up, “but they can remain uncomfortable in their own skin in new social situations.”

It’s rare for a child to be clinically diagnosed with an anxiety disorder before adolescence. “Kids aren’t yet anxious, but can have the temperament that may predispose them to become anxious,” said Pérez-Edgar.

She is careful to note the distinction between normal separation anxiety, a common experience among two- and three-year-olds, and what might be called an anxious temperament.

“When [a behaviorally inhibited] baby is exposed to novel sensory information — it can be something as benign as one of those mobiles you put over the crib or a normal jack-in-the-box — a lot of babies giggle and laugh, they think it’s funny. But these babies are terrified, they cry and arch their backs — their systems have just said ‘danger, danger, danger,’” she said.

Later in life, this might translate into having a difficult time building relationships and socializing with peers.

Once a behavioral link had been established, researchers began to speculate about the neurology involved. Could extreme shyness be traced to differences in the brain? Developmental psychologist Jerome Kagan predicted that behaviorally inhibited babies might have an overly sensitive limbic system, and in particular an overly sensitive amygdala.

The amygdala is the seat of what is known as the fight-or-flight reaction. When the amygdala is overly sensitive, it can cause anxiety. After the babies in the study became teenagers and were able to undergo magnetic resonance imaging (MRI) scans, Pérez-Edgar reports, “We were able to show that yes indeed, teenagers who as babies looked so fearful in the face of novelty, in fact their amygdalae did respond more vigorously.”

At this point, however, the direction of causation is still unknown. “Here we have a chicken versus egg situation,” says Pérez-Edgar. “Is it because you’re temperamentally reactive that your amygdala is overactive, or vice versa?”

Pérez-Edgar is currently conducting a study with children ages nine to 12 to observe how attention and temperament are linked to social behavior. As she points out, the amygdala is not only activated by fear, but is also known to be responsive to other social stimuli.

One way that the researchers are trying to help anxious children is through behavioral therapy: directing the children’s attention away from the source of anxiety. They hypothesize that by training the brain of a child to not seek out things that cause anxiety and by focusing attention elsewhere, their anxiety will lessen.

Source: Penn State

 


Compassion Helps Promote Better Health in Older Adults

Sat, 04/19/2014 - 8:15am

Older women, individuals who have initiative, and those who have suffered a recent loss are more likely to be compassionate toward strangers, according to new research from the University of California, San Diego School of Medicine.

Researchers note that compassionate behaviors are associated with better health and well-being as we age. They say that their findings offer insights into ways to help people with “deficits in compassion,” which puts them at risk for becoming lonely and isolated later in life.

“We are interested in anything that can help older people age more successfully,” said Lisa Eyler, Ph.D, a professor of psychiatry.

“We know that social connections are important to health and well-being, and we know that people who want to be kind to others garner greater social support. If we can foster compassion in people, we can improve their health and well-being, and maybe even longevity.”

The study is based on a survey of 1,006 randomly selected adults in San Diego County, aged 50 and over.

It identified three factors that were predictive of a person’s self-reported compassion: gender, recent suffering, and high mental resiliency.

Women, independent of their age, income, education, race, marital status, or mental health status, scored higher on the compassion test, on average, than men.

Higher levels of compassion were also observed among both men and women who had experienced a personal loss, such as a death in the family or illness, in the last year.

Finally, people who reported confidence in their ability to bounce back from hard times also reported more empathy toward strangers and joy from helping those in need.

“What is exciting is that we are identifying aspects of successful aging that we can foster in both men and women,” said Dilip Jeste, MD, Distinguished Professor of Psychiatry and Neurosciences, and director of the Sam and Rose Stein Institute for Research on Aging.

“Mental resiliency can be developed through meditation, mindfulness, and stress reduction practices. We can also teach people that the silver lining to adversity is an opportunity for personal growth.”

The study, funded in part by the National Institutes of Health, John A. Hartford Foundation, and Sam and Rose Stein Institute for Research on Aging, was published in the International Journal of Geriatric Psychiatry.

Source: University of California, San Diego


Effects of Childhood Bullying Still Evident 40 Years Later

Sat, 04/19/2014 - 7:30am

The negative effects of childhood bullying are still evident nearly 40 years later, according to new findings by researchers at King’s College London. The study, published in the American Journal of Psychiatry, is the first to investigate the repercussions of childhood bullying beyond early adulthood.

“Our study shows that the effects of bullying are still visible nearly four decades later. The impact of bullying is persistent and pervasive with health, social, and economic consequences lasting well into adulthood,” said lead author Dr. Ryu Takizawa from the Institute of Psychiatry at King’s College London.

Bullying is defined as repeated harmful actions by children of a similar age, in which the victims find it difficult to defend themselves. The harmful results of bullying were consistent even when other factors were taken into account, including IQ, emotional and behavioral problems, parents’ socioeconomic status, and low parental involvement.

The study involved 7,771 children whose parents reported on their child’s experiences with bullying at ages seven and 11. The data was pulled from the British National Child Development Study which includes information on all children born in England, Scotland, and Wales during one week in 1958. The children were then followed up until the age of 50.

Just over a quarter of the children in the study (28 percent) had been bullied occasionally, and 15 percent had been bullied frequently — similar to today’s statistics.

Victims of childhood bullying were more likely to have worse physical and mental health and lower cognitive functioning at age 50. Sufferers of frequent bullying were also at an increased risk of depression, anxiety disorders, and suicidal thoughts.

Furthermore, those who had been bullied were more likely to have lower educational levels, with males being more likely to be unemployed and earn less. Bullied children were less likely to be in relationships as adults, less likely to have good social support, and more likely to report lower quality of life and overall life satisfaction.

“We need to move away from any perception that bullying is just an inevitable part of growing-up. Teachers, parents, and policy-makers should be aware that what happens in the school playground can have long-term repercussions for children,” said senior author Professor Louise Arseneault from the Institute of Psychiatry at King’s College.  

“Programs to stop bullying are extremely important, but we also need to focus our efforts on early intervention to prevent potential problems persisting into adolescence and adulthood.”

“Forty years is a long time, so there will no doubt be additional experiences during the course of these young people’s lives which may either protect them against the effects of bullying, or make things worse. Our next step is to investigate what these are,” said Arseneault.

Source:  King’s College London


Apathy Could Be Indicator of Brain Disease in Older People

Sat, 04/19/2014 - 6:45am

Older people who are apathetic, but not depressed, may have smaller brain volumes than those without apathy, according to a new study.

“Just as signs of memory loss may signal brain changes related to brain disease, apathy may indicate underlying changes,” said Lenore J. Launer, Ph.D., with the National Institute on Aging at the National Institutes of Health in Bethesda, MD, and a member of the American Academy of Neurology.

“Apathy symptoms are common in older people without dementia. And the fact that participants in our study had apathy without depression should turn our attention to how apathy alone could indicate brain disease.”

For the study, 4,354 people, with an average age of 76 and without dementia, underwent a magnetic resonance imaging (MRI) scan. They were also asked questions that measure apathy symptoms, which include lack of interest, lack of emotion, dropping activities and interests, preferring to stay at home, and lack of energy.

The researchers used brain volume as a measure of accelerated brain aging. “Brain volume losses occur during normal aging, but in this study, larger amounts of brain volume loss could indicate brain diseases,” the scientists explained.

The study found that people with two or more apathy symptoms had 1.4 percent smaller gray matter volume and 1.6 percent less white matter volume compared to those who had fewer than two symptoms of apathy. Excluding people with depression symptoms did not change the results, the researchers noted.

Gray matter is where learning takes place and memories are stored in the brain, the scientists explained. White matter acts as the “communication cables” that connect different parts of the brain.

“If these findings are confirmed, identifying people with apathy earlier may be one way to target an at-risk group,” Launer said.

The study, supported by the National Institutes of Health, the National Institute on Aging, the Icelandic Heart Association, and the Icelandic Parliament, was published in Neurology, the medical journal of the American Academy of Neurology.

Source: American Academy of Neurology
