Tri-County Services

In The News

Psychology, psychiatry and mental health news and research findings, every weekday.

Rat Study: High Fructose Diet Slows Brain Injury Recovery

Sat, 10/03/2015 - 8:45am

A diet high in processed fructose may impair the brain’s ability to heal after head trauma, according to a new rat study by neuroscientists at the University of California, Los Angeles (UCLA).

“Americans consume most of their fructose from processed foods sweetened with high-fructose corn syrup,” said Dr. Fernando Gomez-Pinilla, a professor of neurosurgery and integrative biology and physiology at UCLA’s David Geffen School of Medicine. “We found that processed fructose inflicts surprisingly harmful effects on the brain’s ability to repair itself after a head trauma.”

Although fructose occurs naturally in fruit, the inherent antioxidants, fiber, and other nutrients in the whole fruit prevent the same damage.

The findings add to the mounting evidence of a direct connection between nutrition and brain health. According to the Centers for Disease Control and Prevention, an estimated 1.7 million people sustain a traumatic brain injury (TBI) each year, resulting in 52,000 deaths annually.

For the study, laboratory rats were fed standard rat food and trained for five days to navigate a maze. Then they were randomly assigned to a group that was fed plain water or a group that was fed fructose-infused water for six weeks. The fructose was crystallized from corn in a dose simulating a human diet high in foods and drinks sweetened with high-fructose corn syrup.

A week later, the rats were anesthetized and underwent a brief pulse of fluid to the head to mimic the effects of human traumatic brain injury. After an additional six weeks, the researchers retested all the rats’ ability to recall the route and escape the maze.

The results were significant: the rats on the fructose diet took 30 percent longer to find the exit compared to those who drank plain water.

The fructose altered a wealth of biological processes in the animals’ brains after trauma. The sweetener interfered with the ability of neurons to communicate with each other, rewire connections after injury, record memories and produce enough energy to fuel basic functions.

“Our findings suggest that fructose disrupts plasticity — the creation of fresh pathways between brain cells that occurs when we learn or experience something new,” said Gomez-Pinilla, a member of the UCLA Brain Injury Research Center.

“That’s a huge obstacle for anyone to overcome — but especially for a TBI patient, who is often struggling to relearn daily routines and how to care for himself or herself.”

Prior studies have shown how fructose harms the body through its role in contributing to cancer, diabetes, obesity, and fatty liver. Gomez-Pinilla’s research is the latest in a UCLA body of work revealing the effects of fructose on brain function. Previously, his team was also the first to identify the negative impact fructose has on learning and memory.

“Our take-home message can be boiled down to this: Reduce fructose in your diet if you want to protect your brain,” Gomez-Pinilla stressed.

Made from corn starch, high-fructose corn syrup is widely added as a sweetener and preservative to processed foods, soft drinks, condiments, applesauce, and baby food.

The average American consumed roughly 27 pounds of high-fructose corn syrup in 2014 — or just under eight teaspoons per day, according to the U.S. Department of Agriculture. That’s a drop from a decade ago, when Americans consumed more than 36 pounds of the syrup per year.
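
As a rough arithmetic check on that per-day figure (assuming about 454 grams per pound and roughly 4.2 grams of syrup per teaspoon, a conversion borrowed from granulated sugar rather than an official USDA figure), the numbers line up:

    # Rough sanity check of the USDA figure: 27 pounds per year of high-fructose
    # corn syrup expressed as teaspoons per day. The grams-per-teaspoon value is
    # an assumption borrowed from granulated sugar, not an official conversion.
    POUNDS_PER_YEAR = 27
    GRAMS_PER_POUND = 453.6
    GRAMS_PER_TEASPOON = 4.2   # assumed; actual syrup density may differ slightly
    DAYS_PER_YEAR = 365

    grams_per_day = POUNDS_PER_YEAR * GRAMS_PER_POUND / DAYS_PER_YEAR
    teaspoons_per_day = grams_per_day / GRAMS_PER_TEASPOON
    print(f"{grams_per_day:.1f} g/day, about {teaspoons_per_day:.1f} teaspoons/day")
    # -> 33.6 g/day, about 8.0 teaspoons/day, i.e., just under eight teaspoons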

The findings are published in the Journal of Cerebral Blood Flow and Metabolism.

Source: UCLA


Study Probes the Pain of Internet Infidelity

Sat, 10/03/2015 - 8:00am

A new study explores the impact of Internet infidelity among couples. The findings reveal strong gender differences in how online cheating is perceived, with women more strongly associating Internet activities with infidelity and finding these activities far more distressing.

The study, conducted by researchers from the Open University in the U.K., was based on an anonymous online survey of 20- to 73-year-olds who had direct experience with Internet infidelity, either having engaged in it themselves or having found out that their partner had done so.

The aim of the research was to improve understanding and awareness among both the public and marriage counselors at a time when there are growing opportunities to engage in online activity that could lead to infidelity.

The findings confirmed that many participants think that the Internet makes infidelity more likely. For example, the Internet makes covert contact with another person easy and has a disinhibiting effect, making it easier to engage in behavior that might be avoided in real life.

“I have a deep mistrust in the Internet, and feel it massively facilitates infidelity,” said one participant whose husband at the time had had an online affair. “My ex-husband is inherently a very shy man, but online he is able to act much more confidently and attract the attention of other women. I strongly believe he would not have had so many affairs without the Internet.”

The findings also showed that online cheating can be extremely addictive.

“I tried to stop but neither of us could, it would start again and since so easy, with all the technology we carry around it was an amazingly comforting and sexy thing to have,” said another participant. “With long working hours an online relationship is like fast food, ready when we are, naughty, cheap, very often eaten alone without the exhaustion of social niceties.”

Another participant wrote, “Probably — if we hadn’t have established & maintained any sort of contact online — the affair would not have started — as we very rarely bumped into each other.”

The findings also revealed that the effects of Internet infidelity can be as traumatic and wounding as face-to-face adultery, with many participants detailing their ongoing distress and describing the online infidelity as a relationship-ending event.

“What our research has revealed is that men and women do see Internet infidelity differently. But it is not just a gender divide — what is experienced as infidelity online can vary from person to person. What might be seen as casual chatting by one partner, is hurtful and disloyal to the other for instance,” said researcher Dr. Andreas Vossler.

“With the Internet and social media now being part of everyday life in the Western world, there are growing opportunities for partners to engage in online behaviors and activities that may be considered unfaithful in the context of a committed relationship (including e.g. cybersex, exchanging sexual self-images, online flirting, and dating).”

“This matters because infidelity commonly causes significant relationship distress and can have a negative and deteriorating effect on marriages and families,” said Vossler.

The evidence shows that couples in a committed relationship may now need to express their attitudes toward social media and keep it a topic of ongoing discussion in order to prevent future misunderstandings — just as a couple might negotiate an agreement on the desire for children or marriage, noted researcher Dr. Naomi Moller.

Source: Open University



Opioid Use Among Older Adults with Breathing Problems Raises Concerns

Sat, 10/03/2015 - 7:15am

Experts are becoming concerned about the increasing use of opioid medications among older adults with chronic obstructive pulmonary disease (COPD), according to a new study published today in the British Journal of Clinical Pharmacology. COPD is a progressive lung disease that makes it difficult to breathe.

“The new use of opioids was remarkably high among adults with COPD living in the community,” said Dr. Nicholas Vozoris, a respirologist at St. Michael’s Hospital. “The amount of opioid use is concerning given this is an older population, and older adults are more sensitive to narcotic side effects.”

For the study, researchers analyzed the records of more than 120,000 adults in Ontario age 66 and older with COPD. Multiple provincial healthcare administrative databases were analyzed at the Institute for Clinical Evaluative Sciences.

Between 2003 and 2012, 70 percent of those who were living in their own home were given a new opioid prescription, while about 55 percent of those living in long-term care homes received a new opioid prescription.

The findings also showed that older adults with COPD, especially those living in long-term care homes, were potentially using opioids excessively, meaning they were given multiple opioid prescriptions, early refills, and prescriptions that lasted more than 30 days.

Opioids, such as codeine, oxycodone, and morphine, might be prescribed more frequently among older adults with COPD to treat chronic muscle pain, breathlessness and insomnia, said Vozoris. Common side effects of opioids include falls and fractures, confusion, memory impairment, fatigue, constipation, nausea, vomiting, and abdominal pain.

“Sometimes patients are looking for what they think are quick fixes to chronic pain and chronic breathing problems,” said Vozoris. “And physicians sometimes believe that narcotics may be a quick fix to COPD symptoms.”

Vozoris said there was some evidence to suggest that opioids may harm lung health by reducing breathing rates and volume, which can result in decreased blood oxygen levels and higher carbon dioxide levels.

“This is a population that has chronic lung disease, and this drug class may also adversely affect breathing and lung health in people who already have chronically compromised lungs,” said Vozoris.

The majority of opioid medications were prescribed by family physicians, with about 88 percent of new prescriptions being combination products that pair an opioid with a non-opioid analgesic, such as Percocet, Endocet, and Lenoltec.

“Patients and prescribers should reflect on the way narcotics are being used in this older and respiratory-vulnerable population,” said Vozoris. “They should be more careful about when narcotics are used and how they’re being used.”

Source: St. Michael’s Hospital



New Study Shows Beauty Really Is In the Eye of the Beholder

Sat, 10/03/2015 - 6:30am

A new study in twins shows that differences of opinion about attractiveness are the result of personal experiences unique to the individual.

Of course, there are some aspects of attractiveness that are pretty universal and may even be coded into our genes, according to the researchers. For example, they note that people tend to prefer faces that are symmetric.

But beyond such limited shared preferences, people really do have different “types,” according to the study, which was published in the Cell Press journal Current Biology.

“We estimate that an individual’s aesthetic preferences for faces agree about 50 percent, and disagree about 50 percent, with others,” write joint leaders of the study, Drs. Laura Germine of Massachusetts General Hospital and Harvard University and Jeremy Wilmer of Wellesley College.

“This fits with the common intuition that on the one hand, fashion models can make a fortune with their good looks, while on the other hand, friends can endlessly debate about who is attractive and who is not.”

While past research on the way people respond to faces has focused primarily on universal features of attraction, this new study focuses on where disagreements over facial attractiveness come from.

To tackle this question, the researchers first studied the face preferences of more than 35,000 volunteers who visited their science website. They then used the insights gained to develop a test of the uniqueness of an individual’s face preferences.

They then tested the preferences of 547 pairs of identical twins and 214 pairs of same-sex, non-identical twins by having them rate the attractiveness of 200 faces.

Comparisons between identical and non-identical twins allowed the researchers to estimate the relative contribution of genes and environments to face preferences.
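
The logic of that twin comparison can be illustrated with Falconer's classic formula, which splits trait variance into genetic, shared-environment, and unique-environment components from the two twin correlations. This is a simplified sketch of the general approach with made-up correlations, not the model or data from the Current Biology paper:

    # Falconer-style decomposition used in classic twin designs.
    # r_mz: trait correlation between identical (monozygotic) twin pairs
    # r_dz: trait correlation between non-identical (dizygotic) twin pairs
    def ace_estimates(r_mz: float, r_dz: float) -> dict:
        a2 = 2 * (r_mz - r_dz)   # additive genetic variance (heritability)
        c2 = 2 * r_dz - r_mz     # shared (family) environment
        e2 = 1 - r_mz            # unique environment plus measurement error
        return {"genetic": round(a2, 2), "shared_env": round(c2, 2), "unique_env": round(e2, 2)}

    # Illustrative values only: if identical twins' preference scores correlate
    # barely more than non-identical twins', most of the variance is attributed
    # to unique, individual experience.
    print(ace_estimates(r_mz=0.25, r_dz=0.20))
    # -> {'genetic': 0.1, 'shared_env': 0.15, 'unique_env': 0.75}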

Prior studies of twins and families have shown that virtually every human trait — from personality to ability to interests — is, to some degree, genetically passed down from one generation to the next. In fact, the researchers even found this in an earlier study for another aspect of face processing: the ability to recognize faces.

In contrast, the new study shows that the origin of the “eye of the beholder” — the uniqueness of an individual’s face preferences — is mostly based on experiences, not genes. Those experiences, moreover, are highly specific to each individual, the researchers noted.

“The types of environments that are important are not those that are shared by those who grow up in the same family, but are much more subtle and individual, potentially including things such as one’s unique, highly personal experiences with friends or peers, as well as social and popular media,” Germine said.

In other words, it’s not about the school you went to, how much money your parents made, or who lived next door. That pretty face you see has a lot more to do with those experiences that are truly unique to you — the faces you’ve seen in the media, the unique social interactions you have every day of your life, or the face of your first boyfriend or girlfriend.

The researchers say that the large impact of personal experience on individual face preferences “provides a novel window into the evolution and architecture of the social brain.”

They add that future studies could look more closely at which aspects of the environment are most important in shaping our preferences for certain faces and for understanding where our preferences for other things — like art or music — come from.

Source: Cell Press

Infants with Drug Withdrawal Syndrome More Likely to Be Readmitted

Fri, 10/02/2015 - 8:30am

Infants born with neonatal abstinence syndrome (NAS) — a drug withdrawal syndrome caused by opioid exposure in the womb — are nearly two and a half times as likely to be readmitted to the hospital within the first month after discharge compared to healthy full-term infants, according to a new study by Vanderbilt University.

Drug withdrawal symptoms can occur shortly after delivery in infants whose mothers had been taking opioid pain relievers such as hydrocodone. Compared to other infants, those with drug withdrawal are more likely to experience respiratory complications, feeding difficulty, seizures, and low birth weight.

The research is part of an ongoing series of Vanderbilt studies that are investigating the far-reaching implications, short-term and long-term, of drug exposure and withdrawal in newborns. Previous findings have shown that rates of NAS have increased nearly fivefold in the past decade across the U.S. Still, little is known about infants with NAS after their initial hospitalization following birth.

In this latest study, the researchers wanted to find out whether infants with NAS are at an increased risk for hospital readmission within 30 days from discharge compared with uncomplicated term and late preterm newborns.

“The recent rise of neonatal abstinence syndrome led to efforts in many hospital systems to improve hospital care being delivered to infants with the syndrome. Our findings suggest that these improvements need to extend beyond the initial birth hospitalization to ensure a safe discharge home,” said lead investigator Stephen Patrick, M.D., MPH, MS, assistant professor of Pediatrics and Health Policy in the Division of Neonatology at Monroe Carell Jr. Children’s Hospital at Vanderbilt.

For the study, the researchers analyzed hospital discharge data for 2006-2009 from the New York State Inpatient Database (SID) and also looked at data for live births from the New York Department of Health. During that time, there were 700,613 uncomplicated term births and 51,748 late preterm births (born between 33 and 36 weeks gestation). Of those births, 1,643 infants were diagnosed with NAS.

The most common cause for readmission among infants with NAS was withdrawal, whereas the preterm newborns were most commonly readmitted for jaundice.

Patrick said future research and state-level policies should look for ways to lower the risk of hospital readmission for infants with NAS.

“As state and federal policymakers work towards strategies to improve outcomes for women with substance use disorder and their infants, it will be important to ensure that families are supported during the critical transition from hospital to home to limit the risk of hospital readmission. The findings of our study suggest that some families may benefit from additional post-discharge resources.”

The findings are published in the journal Hospital Pediatrics.

Source: Vanderbilt University Medical Center


Brain Imaging Helps to Map the Way We Think

Fri, 10/02/2015 - 7:45am

The human brain is an extremely complicated organ with many secrets. To improve understanding of how the brain makes and controls thoughts, researchers have converted structural brain imaging scans into “wiring diagrams” of connections between brain regions.

Three researchers from the University of California, Santa Barbara (UCSB) Department of Psychological & Brain Sciences — Michael Miller, Scott Grafton, and Matt Cieslak — used the structure of neural networks to learn the basic rules that govern which parts of the brain are most able to exert cognitive control over thoughts and actions.

This study is the first to provide a mechanistic explanation for how the frontal cortex exerts control over the trillions of individual neurons that allow people to stay focused on one task or switch to a radically different one.

The findings appear in the journal Nature Communications.

“Particular regions of your brain are predisposed to control your thoughts based on where they lie in relation to other regions,” said Miller, a UCSB psychology professor and co-author of the paper.

“The regions on the ‘outskirts’ can perform a very specific kind of control. They can move the system to distant states, like switching from working at your job to playing with your kids.”

This new research blends cutting-edge neuroscience with the emerging field of network science, which is often used to study social systems.

By applying control theory — a field traditionally used to study electrical and mechanical systems — the investigators show that being on the outskirts of the brain is necessary for the frontal cortex to dynamically control the direction of thoughts and goal-directed behavior.

“We need a basic theory of how the brain controls itself, and to get there, we suggest treating the brain as an engineering system,” said senior author Danielle Bassett, an Assistant Professor of Innovation at the University of Pennsylvania School of Engineering and Applied Science.

“Cognitive control is a lot like engineering control: You model the system’s dynamics by identifying key points; if I push on that one piece or pull this lever, I can offer a prediction of how it’s going to affect other parts of the network.”

By applying control theory equations to the wiring diagrams generated from brain scans, the researchers showed that the geographical and functional differences between regions of the brain are linked.
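
To make "applying control theory equations to wiring diagrams" more concrete, the sketch below computes one standard quantity from this literature, average controllability, on a toy connectivity matrix. It is a generic illustration of the idea (linear dynamics driven by the structural network, with input injected at one region at a time), not the authors' actual pipeline, parcellation, or parameter choices:

    # Minimal sketch of "average controllability" on a structural wiring diagram.
    # Model: x(t+1) = A x(t) + b u(t), with input u injected at a single region.
    # A region's average controllability is the trace of its controllability
    # Gramian; larger values mean input at that region moves the system into
    # nearby states more easily. The random matrix stands in for real imaging data.
    import numpy as np

    def average_controllability(A: np.ndarray, horizon: int = 100) -> np.ndarray:
        n = A.shape[0]
        # Normalize so the linear system is stable (spectral radius < 1),
        # as is standard in this literature.
        A = A / (1 + np.max(np.abs(np.linalg.eigvals(A))))
        scores = np.zeros(n)
        for region in range(n):
            b = np.zeros((n, 1))
            b[region] = 1.0                     # control input at this region only
            gramian = np.zeros((n, n))
            Ak = np.eye(n)
            for _ in range(horizon):            # truncated infinite sum
                gramian += Ak @ b @ b.T @ Ak.T
                Ak = Ak @ A
            scores[region] = np.trace(gramian)
        return scores

    rng = np.random.default_rng(0)
    W = rng.random((8, 8))
    W = (W + W.T) / 2                           # toy symmetric connectivity matrix
    print(average_controllability(W).round(3))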

While the analysis cannot say whether the frontal cortex’s location or its role evolved first, it suggests that part of the frontal cortex’s ability to control executive function depends on its distance from other parts of the brain network.

“This study heralds a new wave of network science, grounded in rigorous control theory,” said co-author Grafton, director of UCSB’s Brain Imaging Center.

“When applied to state-of-the-art brain imaging data we begin to see some of the design tradeoffs inherent in the architecture of brain connections.”

Regions that are most interconnected — and therefore more internal to the network — are very good at moving the brain into nearby states — for example, from writing someone an email to speaking with that person on the phone.

“What’s particularly interesting if we look at where those inner nodes are, they’re all in ‘default mode’ regions, which are the regions that are active when you’re resting,” said Bassett. “This makes sense, because if you were engineering an optimal system, you would want to put its baseline where it can get to most of the places it has to go pretty easily.”

According to co-author Cieslak, this type of holistic understanding of the relationship between brain regions’ locations and their roles is necessary for tailoring better treatments for people who have lost executive function due to disease or injury.

Cieslak believes this fundamental understanding of how the brain controls its activity could help lead to better interventions for medical conditions associated with reduced cognitive control, such as autism, schizophrenia or dementia.

Source: University of California, Santa Barbara/EurekAlert

Video Game Features Key to Cognitive Improvements

Fri, 10/02/2015 - 7:00am

Emerging research suggests particular components or features of a video game appear to influence cognitive health.

Investigators explain that the specific content, dynamics, and mechanics of individual games determine their effects on the brain. As such, researchers believe action video games might have particularly positive benefits for improving cognition.

The paper appears in Policy Insights from the Behavioral and Brain Sciences, a Federation of Associations in Behavioral & Brain Sciences (FABBS) journal.

“The term video games refers to thousands of quite disparate types of experiences, anything from simple computerized card games to richly detailed and realistic fantasy worlds, from a purely solitary activity to an activity including hundreds of others, etc.,” say the researchers.

They explain that a useful analogy is comparing the concept of video games to the term food. In this context, “one would never ask, ‘What is the effect of eating food on the body?’ Instead, it is understood that the effects of a given type of food depend on the composition of the food, such as the number of calories; the percentage of protein, fat, and carbohydrates; the vitamin and mineral content; and so on.”

In the study, Drs. C. Shawn Green and Aaron R. Seitz analyzed the science on the cognitive effects of video games. They explain that action video games — games that feature quickly moving targets that come in and out of view, include large amounts of clutter, and require the user to make rapid, accurate decisions — have particularly positive cognitive impacts, even when compared to “brain games,” which are created specifically to improve cognitive function.

“Action video games have been linked to improving attention skills, brain processing, and cognitive functions including low-level vision through high-level cognitive abilities. Many other types of games do not produce an equivalent impact on perception and cognition,” the researchers commented. “Brain games typically embody few of the qualities of the commercial video games linked with cognitive improvement.”

Green and Seitz noted that while action games in particular have not been linked to problems with sustaining attention, research has shown that total amount of video game play predicts poorer attention in the classroom.

Furthermore, video games are known to impact not only cognitive function, but many other aspects of behavior — including social functions. This impact can be either positive or negative depending on the content of the games.

“Modern video games have evolved into sophisticated experiences that instantiate many principles known by psychologists, neuroscientists, and educators to be fundamental to altering behavior, producing learning, and promoting brain plasticity. Video games, by their very nature, involve predominately active forms of learning (i.e., making responses and receiving immediate informative feedback), which is typically more effective than passive learning,” say the researchers.

Source: Sage Publications/EurekAlert

Tweets from Mobile Devices Often Egocentric

Fri, 10/02/2015 - 6:15am

New research explores the way in which the ubiquitous mobile device has changed the way we interact with the world.

While it is now normal behavior to take selfies or live tweet an event, investigators wanted to discover whether a mobile device can really be an extension of one’s self.

In their study, researchers at Goldsmiths, Bowdoin College, and the University of Maine found that tweets from mobile devices are more likely to employ egocentric language than tweets from non-mobile sources.

Their findings have been published in the Journal of Communication.

For the study, researchers analyzed tweets to see whether presentations of self are more likely to be egocentric, negative or positive, gendered, or communal based on whether users were on a mobile device or a web-based platform.

Over the course of six weeks, the researchers collected 235 million tweets. Ninety percent of the top sources used to access Twitter were coded as mobile, non-mobile, or mixed sources.

Researchers used social psychological methods to study the language use in tweets. This meant they analyzed the frequency and ratios of words traditionally associated with social and behavioral characteristics.
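
As a rough illustration of what a word-frequency-and-ratio analysis looks like in practice, the snippet below scores a tweet by the share of its words drawn from a small first-person word list. The word list, example tweets, and scoring rule are invented for illustration; they are not the dictionaries or coding scheme the researchers used:

    # Toy example of scoring tweets for "egocentric" language.
    # EGOCENTRIC_WORDS is an assumed stand-in list, not the study's dictionary.
    import re

    EGOCENTRIC_WORDS = {"i", "me", "my", "mine", "myself"}

    def egocentric_ratio(tweet: str) -> float:
        """Fraction of a tweet's words that appear in the egocentric word list."""
        words = re.findall(r"[a-z']+", tweet.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word in EGOCENTRIC_WORDS)
        return hits / len(words)

    print(egocentric_ratio("I just got my coffee and I am loving it"))              # 0.3
    print(egocentric_ratio("Great turnout at the conference today, thanks everyone"))  # 0.0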

Investigators discovered mobile tweets are not only more egocentric in language than any other group, but that the ratio of egocentric to non-egocentric tweets is consistently greater for mobile tweets than from non-mobile sources.

They also did not find that mobile tweets were particularly gendered; regardless of platform, tweets tended to employ words traditionally coded as masculine.

Previous studies have linked activities performed face-to-face (e.g., eating dinner) to tweets from a particular source. There has also been research that classifies tweets as belonging to a particular sentiment by using word lists. The new study is one of the first to look at how mobile versus non-mobile use plays a part in the language used on social media.

“Very little work has been done comparing how our social media activities vary from mobile to non-mobile. And as we increasingly use social media from mobile devices, the context in which one uses social media is a critical object of study,” said researcher Murthy.

“Our work is transformative in this understudied field as we found that not all tweets are the same and the source of tweets does influence tweeting patterns, like how we are more likely to tweet with negative language from mobile devices than from web-based ones.”

Source: International Communication Association/EurekAlert

Life Satisfaction = Increases in Money/Hours Worked

Fri, 10/02/2015 - 5:30am

A German economist has developed a model that demonstrates a relationship between long-term income increases and personal satisfaction.

Professor Dr. Christian Bayer, from the Hausdorff Center for Mathematics at the University of Bonn, also found that working overtime affects personal levels of happiness — but in a negative way.

His findings appear in the American Economic Journal.

Expanding the question of “Does money bring happiness?” Bayer and colleague Professor Falko Jüssen investigated how increased income and workload influenced overall life satisfaction.

Their findings were clear: more money does make people happier — but only if there is a long-term increase in income. A temporary increase does not have any noticeable effect on an employee’s level of happiness, even if it is a large increase.

By contrast, a permanent increase in income results in a significant rise in well-being, even if the raise is small.

The researchers also identified a second important way in which professional life influences personal happiness: the number of hours that employees work.

“Those who consistently have to work more become less happy,” says Professor Bayer, an instructor and researcher at the Institute for Macroeconomics and Econometrics.

“This finding contradicts many other studies that conclude people are more satisfied when they have any job than none at all.” The new study suggests that the unemployed suffer from the lack of income, not from the lack of employment per se.

For their studies, the mathematical economists developed a new approach to analyze the link of income to personal levels of happiness. While earlier studies on this topic were based purely on static models, Professor Bayer and Professor Jüssen also included the dynamics of changing income levels.

As it turned out, that was a key step toward a better understanding of how income level and working hours affect well-being. Long-term income increases have a completely different effect on an employee’s satisfaction than a temporary raise does. Previous studies had not taken this distinction into account and treated all changes in income equally.

Experts believe the study proves that a functioning financial market is important for balancing out the effects of income fluctuations and extra work on a person’s well-being.

“Our findings show that wages and working hours have more to do with a worker’s happiness and/or unhappiness than was previously assumed,” says Prof. Bayer.

“So the formula for greater satisfaction in life seems to be: persistently more money while working the same number of hours.”

Source: University of Bonn/EurekAlert

Theater Program Improves Social Skills in Kids with Autism

Thu, 10/01/2015 - 8:30am

Children with autism who participated in a 10-week theater program experienced a significant increase in social skills compared to those who did not participate, according to a new study by Vanderbilt University. Children in the program showed improvements in social cognition, interaction, and communication.

Acting is like therapy for children with autism spectrum disorder, said lead researcher Blythe Corbett, Ph.D., an associate professor at Vanderbilt University and researcher at the Vanderbilt Kennedy Center. Acting is an interactive process that incorporates a variety of social skills, including observing, perceiving, interpreting and expressing thoughts, feelings, and ideas.

For the study, the children participated in a 10-week, 40-hour program called SENSE Theatre. The Social Emotional Neuroscience & Endocrinology (SENSE) program evaluates the social functioning of children with autism and related neurodevelopmental disorders.

The findings from Corbett’s new randomized control trial provide some convincing evidence of the benefits of theater for improving the social skills of children with autism.

“We measured many aspects of social ability and found significant treatment effects on social cognition, social interaction and social communication in youth with autism,” Corbett said.

The research involved 30 children ages eight to 14, with 17 randomly chosen for the experimental group and 13 in the control group. The treatment group exhibited notable improvements in the ability to identify and remember faces, which was confirmed by changes in the brain patterns that arose when study participants saw a familiar face.

Children who participated in the theater program also showed more group play with children outside the treatment setting, as well as improvement in social communication at home and in the community. This improvement was evident for at least two months.

In addition to using theater techniques, such as role-playing and improvisation, the children enrolled in SENSE Theatre were paired with typically developing peer actors from the University School of Nashville.

These “expert models,” as Corbett calls them, are trained to provide a supportive, engaging, and dynamic learning environment for the children with autism, allowing them to practice and perform important social skills. In fact, the finale to the 40-hour program was the performance of a play in which participants and peers shared the stage in a unique collaboration between art and science.

“Peers can be transformative in their ability to reach and teach children a variety of fundamental social skills,” Corbett said. “And, combined with acting techniques that enhance our ability and motivation to communicate with others, the data suggests we may be setting the stage for lasting changes in how our children with autism perceive and interact with the social world.”

The research is published in the Journal of Autism and Developmental Disorders.

Source: Vanderbilt University Medical Center


MD & Nurse Training on Teen Risk Behavior Pays Off

Thu, 10/01/2015 - 7:45am

A new program that trains Australian doctors and nurses to better recognize risk-taking behavior in teens and young adults appears to be effective at identifying and reducing risky behaviors.

University of Melbourne researchers led the study that included 901 young people, doctors and nurses at 42 general practices in 15 urban and eight regional divisions in Victoria, Australia.

The health professionals were trained to screen and counsel young people aged 14 to 25 for common risk factors in that age cohort. Behaviors included smoking, binge drinking, mental health problems, drug use, risky driving, and unsafe sex, with responses obtained either with a survey or by verbal inquiry.

Researchers found that the risky behaviors were rarely assessed. For example, while many doctors were alert to mental health issues, hardly any screened for risky driving or partner abuse.

Remarkably, investigators found 90 percent of the 901 young people were engaging in one or more of the risky behaviors. Upon identification of the risky activities, health professionals recommended a course of action to minimize risk or a plan to stop the behavior.

After discussing these issues with the GP, the young people reported less illicit drug use and less risk for sexually transmitted illness after three months and fewer unplanned pregnancies at 12 months. The GPs were also able to detect more cases of partner abuse.

Encouragingly, almost all (97 percent) of young people in the study said they would be willing to discuss their personal lives with their doctor as a trusted source of information. A further 93 percent said they’d tell a friend to do the same.

Associate Professor Dr. Lena Sanci, of the Department of General Practice, was lead author on the study, which appears in the journal PLOS ONE.

Sanci explains that adolescence and young adulthood are peak years for the onset of mental disorders, injuries, and reproductive health risks. Ironically, although risky drinking, smoking, drug use, and low rates of physical exercise in adulthood are usually established during these years, young people are the group most likely to be overlooked by the medical profession.

“Young people will come to the doctor for coughs, colds, and injuries, but not things like stopping smoking or reducing alcohol or talking about abuse in a relationship or learning about safer sex,” Sanci said.

“Perhaps it’s because they don’t view these things as health issues, or they’re embarrassed, or maybe they feel they should be able to cope on their own. Doctors are the perfect confidantes for teens, who may not want to talk about these health risks with their parents.

“We didn’t expect it to solve all the problems, but we did start a conversation that could help the young person manage the risks.

“We know that young people visit the doctor once or twice a year, so there are repeated opportunities to address multiple risks. And in this trial, overwhelmingly, young people welcome these discussions if they are raised sensitively by youth-friendly providers.”

The researchers recommend trainee doctors should be taught how to have these conversations with young people. They are currently working on an online screening tool to streamline the process.

Source: University of Melbourne/EurekAlert

How Those With Schizophrenia Misinterpret Social Cues

Thu, 10/01/2015 - 7:00am

People who suffer from schizophrenia often misinterpret social cues, which can lead to unpleasant and often paranoid or persecutory thoughts. A new study provides insight into this misperception.

Researchers believe their findings, published in the journal Psychological Medicine, could foster psychological interventions to assist people with schizophrenia better interpret social cues and perhaps ease related symptoms.

Investigator Dr. Sukhi Shergill of King’s College London, said, “Humans are social beings, often finding joy in interacting with others. While most attention is on talking with each other, non-verbal behavior such as gestures, body movement, and facial expression also play a very important role in conveying the message.

“However, the message being conveyed is not always clear, or perceived as a positive one, and an extreme example is evident in patients suffering from schizophrenia who show a strong tendency to misinterpret the intentions of other people in a malevolent manner.”

In the study, investigators studied the behavior of 54 participants, including 29 people with schizophrenia, as they viewed the body position and gestures of an actor on a silent video clip. The video included gestures such as putting a finger to the lips to indicate ‘be quiet’ or incidental movements such as scratching an eye.

Researchers found that patients with schizophrenia are able to interpret meaningful gestures and incidental movements as accurately as healthy subjects. However, when the direction of the gestures was ambiguous (i.e. not obviously directed at or away from them), they were much more likely to misinterpret the gestures as being directed towards them.

Investigators believe this could indicate an increased tendency to self-infer these ambiguous social cues or to “hyper-mentalize,” falsely inferring intent in the actions of others.

Both of these misinterpretations could strengthen paranoid thoughts experienced by patients with schizophrenia, said the study authors. Moreover, the patients’ confidence in their interpretation was found to be strongly associated with their tendency to experience hallucinatory symptoms.

“Our study offers a basis for psychological interventions aimed at improving gestural interpretation,” Shergill said. “It could also provide guidance for health professionals and care-givers on how to communicate with patients who have schizophrenia, in order to reduce misinterpretations of non-verbal behavior.”

Emerging technology can help to improve communication as well as enhance quality of life among individuals with schizophrenia.

“The recent advent of adaptable virtual-reality technology provides a means of investigating the psychological effects of gestural communication with greater flexibility, which may prove a boon for our future understanding of social deficits in schizophrenia,” said Shergill.

Source: King’s College London/EurekAlert

Antipsychotics for Parkinson’s Psychosis May Be Dangerous

Thu, 10/01/2015 - 6:15am

A new study from the U.K. finds that antipsychotic drugs may increase the risk of death in people with Parkinson’s disease psychosis (PDP).

Researchers from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King’s College London found that people with PDP who were treated with antipsychotics were four times more likely to have died following three to six months of treatment than those who did not receive any antipsychotic medication.

Investigators also discovered that when people with PDP received antipsychotic medications they were more likely to experience serious health issues including cognitive decline, worsening of Parkinson’s symptoms, stroke, infections, and falls.

Study findings have been published in the Journal of the American Medical Directors Association (JAMDA).

Parkinson’s disease affects approximately seven to 10 million people worldwide and is characterized by progressive loss of motor function, psychiatric symptoms, and cognitive impairment.

Psychosis is a common and distressing group of psychiatric symptoms affecting people with Parkinson’s, usually manifesting as hallucinations and delusions.

PDP affects more than 50 percent of people with Parkinson’s at some point in their condition and antipsychotic drugs are often used to treat this psychosis. Researchers say, however, that there is little evidence to support their use.

In the new study, researchers examined more than 400 people with PDP, who were taking part in a separate trial, to assess the impact of antipsychotic medications on their overall health and wellbeing. Participants were categorized into two groups, those receiving antipsychotics and those who did not take any antipsychotic medications at any time during the study.

Professor Clive Ballard from the Wolfson Centre for Age-Related Diseases at the IoPPN, King’s College London, said, “Our findings clearly indicate serious risks associated with antipsychotics and highlight the need for greater caution in treating psychosis in Parkinson’s disease.

“Antipsychotics are known to be linked to serious harm in people with Alzheimer’s disease, and these findings show that a similar, although not identical, risk is seen in people with Parkinson’s.

“Our findings therefore strongly suggest that doctors, patients and family members should consider these risks very carefully when considering potential treatments for psychosis and any other behavioral symptom in people with Parkinson’s disease, such as agitation or aggression.

“Further research is required to develop new, better treatments for psychosis and other behavioral symptoms.”


Source: King’s College London/EurekAlert

If Placebo Eases Depression, Real Meds Will Too

Thu, 10/01/2015 - 5:30am

New research finds that when it comes to treating depression, how well a person responds to a sham or fake medicine can be a predictor of how they will respond to actual medications.

That is, those who can muster their brain’s own chemical forces against depression appear to have an advantage in overcoming its symptoms with help from a medication.

However, for those whose brain chemistry doesn’t react as much to a fake medicine, or placebo, the active drug may provide less benefit.

University of Michigan Medical School researchers believe the finding may explain the variation in treatment response and resiliency that challenges depression patients and their care teams. The discovery also opens up the door to new research on how to amplify the brain’s natural response in new ways to improve depression treatment.

Investigators believe the new insight could also help those developing and testing new drugs, helping them correct for the placebo effect that gets in the way of measuring a drug’s true effect. The study comes from a team that has studied the placebo effect for more than a decade, using sophisticated brain scanning techniques in healthy people.

They were pioneers in showing that the brain’s natural “painkiller” system — called the mu-opioid system — responded to pain when patients got a placebo. Investigators also studied the genetic variation that makes certain people more likely to respond to sham painkillers.

In the new study, researchers studied the brain chemistry of 35 people with untreated major depression, who agreed to try what they thought was a new depression drug, before receiving actual drugs already approved to treat depression.

The team found that participants who reported improvement of depression symptoms after getting the placebo also had the strongest mu-opioid response in brain regions involved in emotion and depression. And these individuals were also more likely to experience even fewer symptoms once they got a real drug.

In fact, response to placebo predicted nearly half of the variation between individuals in total response to the entire study, including actual drug treatment.

“This is the first objective evidence that the brain’s own opioid system is involved in response to both antidepressants and placebos, and that variation in this response is associated with variation in symptom relief,” said the paper’s first author, Marta Pecina, M.D., Ph.D.

“This finding gives us a biomarker for treatment response in depression — an objective way to measure neurochemical compounds involved in response,” she continues. “We can envision that by enhancing placebo effects, we might be able to develop faster-acting or better antidepressants.”

Research team leader Jon-Kar Zubieta, M.D., Ph.D., believes the placebo effect in the study came not only from participants’ belief that they were receiving a real drug, but also from the sheer impact of being in a treatment environment.

Even as scientists work to understand this effect further, clinicians who treat people with depression may want to take heed of the findings, he notes. Receiving care in a treatment environment supports use of talk therapies and other forms of personalized therapy.

“These results suggest that some people are more responsive to the intention to treat their depression, and may do better if psychotherapies or cognitive therapies that enhance the clinician-patient relationship are incorporated into their care as well as antidepressant medications,” he said.

“We need to find out how to enhance the natural resiliency that some people appear to have.”

Studies testing antidepressants against placebos suggest that 40 percent of response is due to the placebo effect. To drug developers, this is a nuisance. But to placebo researchers, it’s like catnip.

“If 40 percent of people recover from a chronic illness without a medication, I want to know why,” said Zubieta.

“And if you respond to a medication and half your response is due to a placebo effect, we need to know what makes you different from those who don’t respond as well.” This could include genetic effects that are still to be discovered.

The new findings were made using positron emission tomography, or PET, scanning, and a substance that attaches to the receptors on brain cells that mu-opioid molecules bind to.

The novel research design, termed a single-blind randomized crossover approach, meant that the participants went in knowing that they wouldn’t be told full details about the purpose of the study until the end.

Participants initially received two weeks of placebo pill treatment; but during one of those weeks, each was told they were taking a substance that is believed to activate internal mechanisms and may have antidepressant properties.

At the end of this week, they also came for a brain scan and received an injection of harmless salt water that they were told might have fast-acting antidepressant properties. After these two weeks and scan, they were prescribed a real antidepressant.

Source: University of Michigan

Primary Care Screening Needed for Teen Depression, Suicide Risk

Wed, 09/30/2015 - 8:30am

A new paper finds that psychosocial assessment and mental health screening of teens during routine health care visits can literally be a life-saver.

Nursing researchers with the University of Texas at Arlington and Texas Woman’s University contend that depression and suicide risk screening can assist health care providers in preventing suicides in teens.

Sharolyn K. Dihigo, R.N., D.N.P., and Barbara Gray, Ph.D., R.N., recently examined available research to determine which screening tools nurse practitioners and others in primary care settings should use during “well” visits with teenage patients.

Their paper appears in The Nurse Practitioner journal as part of a continuing education series, and it comes as the nation observes Suicide Prevention Awareness month.

According to the World Health Organization, depression is the number one cause of illness and disability among teens and preteens, and suicide is ranked number three, Dihigo said.

“Our article could not have come at a better time. We are trying to get the word out and educate other healthcare providers to recognize the signs of depression and intervene to prevent these suicides.”

It is estimated that 80 percent of all 13- to 18-year-olds are seen in a primary care setting each year, but busy health care providers often fail to correctly identify those teenagers with a mental health condition. That’s because symptoms of depression in teens, such as moodiness, increased sadness, or changes in appetite or school attendance, can be easily overlooked as a “normal” part of puberty.

Gray said completion of psychosocial assessment and mental health screening of adolescents during routine health care visits “is an important component in the detection of risk factors that contribute to suicidal thoughts and behaviors.”

To complete their piece, Dihigo and Gray reviewed numerous articles, fact sheets, national recommendation statements, and 23 studies done by other researchers.

The team concluded that advanced planning and preparation can lead to a systematic, effective way to manage patients at risk for suicide, whether it is immediate referral for hospitalization or referral to a therapist and initiation of a safety plan. Suicide risk screening tools are available and can be administered in a time-efficient manner.

Most tools are free of charge and require little training to administer. Some tools screen for several disorders, while others focus on specific screening questions for one type of mental illness.

For example, the Patient Health Questionnaire for Adolescents (PHQ-A) assesses for potential problems such as anxiety, substance abuse, mood, or eating disorders. The Pediatric Symptom Checklist (PSC), or Pediatric Symptom Checklist — Youth Report (Y-PSC), screens broadly for emotional and behavioral psychosocial concerns.

Source: University of Texas, Arlington

Study Finds Symptoms Outweigh Stigma for Teens at High Risk for Psychosis

Wed, 09/30/2015 - 7:45am

A new philosophy in care for young people labeled at risk for mental illness involves early intervention before the onset of full-blown psychosis.

However, despite the obvious benefit of schizophrenia prevention, the potential harm and risks inherent in identifying and labeling young people at risk has been unknown.

Now, a new study discovers that young people identified as at clinical risk for psychosis reported greater stigma associated with the symptoms that led them to seek help than with the risk label or with coming to a specialized clinic.

The study is the first to address the separate effects of symptoms and labeling on stigma in young people identified as at clinical high risk for schizophrenia and related psychotic disorders.

The findings, by researchers at Columbia University’s Mailman School of Public Health and the New York State Psychiatric Institute, are published online in the journal Schizophrenia Research.

“The clinical high-risk state is an incredibly important advance in psychiatry that enables identification of at-risk youth prior to development of full psychosis,” said Lawrence H. Yang, Ph.D., associate professor of epidemiology at the Mailman School of Public Health and first author.

“We were able to distinguish feelings of stigma due to attending a specialized high-risk clinic from the stigma of having symptoms and experiences. While the stigma of symptoms and experiences appear greater, the results indicate that both forms of stigma provide targets for intervention.”

For many, being identified as at risk of developing psychosis is a false alarm, as fewer than one in three young people so identified go on to develop psychosis. The vast majority, therefore, either have residual symptoms or improve entirely.

“Because there is the risk of having ‘false positives,’ it is especially important to demonstrate that stigma induced by the ‘at risk’ label appears less than that of symptoms,” said Yang.

“But even for the true positives — those one in three who do develop psychosis — it is important to learn that the stigma of symptoms is a relatively stronger contributor to stigma; as such, it is precisely the stigma that would be reduced by early intervention.”

The new paper reports the findings from a prospective cohort study conducted at the Center of Prevention and Evaluation, or COPE, at the New York State Psychiatric Institute at Columbia University — a comprehensive program that offers treatment and resources to participants about early symptoms and risk of schizophrenia.

Upon joining COPE through community identification in clinics and schools, young people were told that while they were at increased risk for psychosis as compared with the general population, it was likely that they would not develop psychosis.

They were also told that if they did develop psychosis, they would receive immediate treatment, which tends to be effective. In this study, young people were asked about their stigma experiences on average about 11 months after they entered the COPE program.

Yang is also the principal investigator of a multi-site five-year project currently funded by the National Institutes of Health that is building upon the current study to understand stigma better in the clinical high risk state for psychosis.

This project, which is being conducted at New York State Psychiatric Institute, Beth Israel Deaconess-Harvard Medical Center, and Maine Medical Center, will enable Yang to corroborate these initial findings, as well as to examine whether vulnerability to stigma is affected by social cognition, like recognizing others’ intents and emotions in their facial expressions and in what they say.

“Regarding labeling-related stigma, our findings indicate that, similar to other psychiatric conditions, awareness of stereotypes was relatively high, and feelings of shame were significant,” noted Yang.

“However, the fact that there were also positive emotions associated with identification — such as increased relief and understanding — and with coming to a specialized high-risk clinic indicates the beneficial effects of being identified as clinical high-risk.”

“This study confirms that the young people we identified as at risk for psychosis were more troubled by the symptoms they were having than by any label given to them,” said Cheryl Corcoran, M.D., senior author and Columbia University assistant professor of Clinical Psychiatry and a research scientist at the New York State Psychiatric Institute.

“We are also encouraged to learn how much these young people resist or disagree with pervasive negative stereotypes of psychosis or schizophrenia and that this relative lack of stigma associated with a risk label might mean that more young people will seek out the treatment and services they need.

Our task then is to develop the best treatments we can to reduce the risk of psychosis, and make them widely available to the very teens and young adults who need them.”

Source: Columbia University’s Mailman School of Public Health/EurekAlert

Birth Weight Affects Social Trust in Adulthood

Wed, 09/30/2015 - 7:00am

A new study has found that our birth weight tends to correlate with our trust levels as adults. Specifically, low birth weight is tied to low levels of social trust in adulthood, while high birth weight is associated with high levels of trust, according to researchers from Aarhus School of Business and Social Sciences (Aarhus BSS) at Aarhus University, Denmark.

“Social trust is extremely important for society. In many ways, it is what keeps society together. When we sort our waste, when we vote, when we pay our taxes, it’s all a function of how much trust we have in one another,” says lead researcher Michael Bang Petersen from the Department of Political Science at Aarhus BSS.

“Therefore, it’s fascinating that we can trace trust all the way back to the embryonic stage. It helps us understand why some people involve themselves more than others in society, and why some are less involved.”

In one survey, a large number of participants were asked if they believed that “a person cannot be too careful when dealing with other individuals” or if “most people can be trusted.” Responses that reflected a low level of trust tended to correlate with low birth weight.

The link remained even after researchers controlled for genetic and environmental factors as measured by the birth weight of siblings. Siblings share family environment and, on average, 50 percent of the genes. If the correlation between low birth weight and low trust remains even after taking genetics and family environment into account, it supports the idea that factors linked to the embryonic stage have an impact on adult life.
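
A common way to implement that kind of sibling control is a sibling-difference (family fixed-effects) design: within each family, the difference in adult trust between siblings is regressed on the difference in their birth weights, so anything siblings share, including family environment and, on average, half their genes, cancels out. The snippet below sketches the design on simulated numbers; it is not the Danish study's actual model or data:

    # Sketch of a sibling-difference design with simulated data.
    # Differencing within sibling pairs removes the shared family effect.
    import numpy as np

    rng = np.random.default_rng(42)
    n_families = 500
    true_slope = 0.0004                                  # assumed trust gain per gram of birth weight

    family_effect = rng.normal(0.0, 1.0, n_families)             # shared environment/genes
    birth_weight = rng.normal(3400, 450, (n_families, 2))        # grams, two siblings per family
    trust = (true_slope * birth_weight
             + family_effect[:, None]
             + rng.normal(0.0, 0.5, (n_families, 2)))            # individual noise

    # Within-family differences: the shared family effect drops out.
    d_weight = birth_weight[:, 0] - birth_weight[:, 1]
    d_trust = trust[:, 0] - trust[:, 1]

    slope, intercept = np.polyfit(d_weight, d_trust, 1)
    print(f"within-family slope = {slope:.5f} (simulated true value {true_slope})")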

The researchers were inspired by the fact that studies often look back at childhood for explanations of adult behavior. They decided to go one step further in showing that even the time before birth can affect the development of the individual.

“A lot of earlier research suggests that experiences in early childhood affect how you react psychologically as an adult. We wanted to investigate if experiences in the embryonic stage also have an impact on psychological patterns in adulthood,” said Petersen.

The findings could serve as an argument for ensuring safe and adequate material conditions for women during pregnancy. More importantly, they represent basic research that adds to our understanding of people and their role in modern society, and helps us appreciate the factors that shape how we interact with one another.

Who feels a sense of belonging, and who feels detached? Who wants to contribute to the community, and who sees less of a reason to do his share? These are quite fundamental questions.

“Our findings match findings in disciplines such as biology and psychology which have shown, on the one hand, that women’s ability to care for their children is highly dependent on the amount of social support they receive from their surroundings, and, on the other hand, have demonstrated that children are very much influenced by signals from the environment about the kind of world we’re living in, and whether it’s a cold and uncaring place, or a safe one,” said researcher Lene Aarøe, also from the Department of Political Science at Aarhus BSS.

“Social trust is at the very core of modern society and shapes how citizens interact. By achieving a better understanding of the factors that lead to social trust we also get closer to understanding the basic elements that ensure social coherence,” said Aarøe.

The article is published in the journal Psychological Science.

Source: Aarhus University


Caregiving Roles as Kids May Impair Parental Attunement

Wed, 09/30/2015 - 6:15am

Emerging research suggests growing up without excessive caregiving burdens is important for future mothers.

Specifically, a new study suggests mothers who took on burdensome caregiving roles as children — and weren’t allowed to just “be kids” — tend to be less sensitive to their own children’s needs.

The findings by a Michigan State University professor suggest these parents do not understand appropriate child development and end up parenting in a manner similar to the harmful way in which they were raised.

The research appears online in advance of print publication in the Journal of Family Psychology.

“If your childhood was defined by parents expecting you to perform too much caregiving without giving you the chance to develop your own self-identity, that might lead to confusion about appropriate expectations for children and less accurate knowledge of their developmental limitations and needs as infants,” said Amy K. Nuttall, Ph.D., lead author on the study.

“If mothers don’t understand their children’s needs,” Nuttall concluded, “they’re not able to respond to them appropriately.”

Burdensome, adult-like caregiving, or “parentification,” can involve routine parenting and disciplining of one’s siblings, excessive chores and responsibilities around the house, and serving as the main emotional support system for parents.

Researchers surveyed 374 pregnant women from low-income households in four U.S. cities, asking them about their upbringing. After birth, the mothers’ parenting techniques were observed several times during an 18-month period.

Mothers who engaged in excessive, adult-like caregiving as children were less likely to respond warmly and positively to their infant’s needs and interests and to put their child’s need for exploration and independence over their own agenda.

A previous study led by Nuttall, which also appeared in the Journal of Family Psychology, found the children of mothers who engaged in excessive caregiving during childhood went on to display behavioral problems.

Researchers believe the studies have important implications for developing parent-education programs for mothers who were overburdened by caregiving roles in childhood.

To improve child-rearing skills, Nuttall believes instruction about infant development might best be delivered in prenatal classes. This setting may be preferable, as women are more likely to attend prenatal classes than parenting classes offered after birth.

“Prenatal parenting classes may be particularly useful for teaching accurate knowledge of child development and appropriate expectations about children’s abilities even before mothers give birth and begin parenting,” Nuttall said.

Source: Michigan State University/EurekAlert

Cellphones Can Damage Romantic Relationships

Wed, 09/30/2015 - 5:30am

A provocative new study suggests our trusted partner and confidant — the cell phone — can harm interpersonal relationships and lead to higher levels of depression.

Baylor University researchers James A. Roberts, Ph.D., and Meredith David, Ph.D., conducted two separate surveys, accounting for a total of 453 adults in the U.S., with the intention of learning the relational effects of “Pphubbing,” or “partner phone snubbing.”

Pphubbing is described in the study as the extent to which people use or are distracted by their cellphones while in the company of their relationship partners.

“What we discovered was that when someone perceived that their partner phubbed them, this created conflict and led to lower levels of reported relationship satisfaction,” Roberts said.

“These lower levels of relationship satisfaction, in turn, led to lower levels of life satisfaction and, ultimately, higher levels of depression.”

The first survey of 308 adults helped Roberts and David develop a “Partner Phubbing Scale,” a nine-item scale of common smartphone behaviors that respondents identified as snubbing behaviors.

The resulting scale includes statements such as:

  • My partner places his or her cellphone where they can see it when we are together;
  • My partner keeps his or her cellphone in their hand when he or she is with me;
  • My partner glances at his/her cellphone when talking to me;
  • If there is a lull in our conversation, my partner will check his or her cellphone.

The development of the scale is significant, the study states, because it demonstrates that “Pphubbing is conceptually and empirically different from attitude toward cellphones, partner’s cellphone involvement, cellphone conflict, and cellphone addiction.”

The second survey of 145 adults measured Pphubbing among romantic couples. This was done, in part, by asking those surveyed to respond to the nine-item scale developed in the first survey.

Other areas of measurement in the second survey included cellphone conflict, relationship satisfaction, life satisfaction, depression, and interpersonal attachment style (e.g., “anxious attachment” describes people who are less secure in their relationship).

Results of the survey showed that:

  • 46.3 percent of the respondents reported being phubbed by their partner;
  • 22.6 percent said this phubbing caused conflict in their relationships;
  • 36.6 percent reported feeling depressed at least some of the time;
  • Overall, only 32 percent of respondents stated that they were very satisfied with their relationship, the study shows.

“In everyday interactions with significant others, people often assume that momentary distractions by their cell phones are not a big deal,” David said. “However, our findings suggest that the more often a couple’s time spent together is interrupted by one individual attending to his/her cellphone, the less likely it is that the other individual is satisfied in the overall relationship.

“Specifically, momentary distractions by one’s cellphone during time spent with a significant other likely lowers the significant other’s satisfaction with their relationship, and could lead to enhanced feelings of depression and lower well-being of that individual.

“Thus, when spending time with one’s significant other, we encourage individuals to be cognizant of the interruptions caused by their cellphones, as these may well be harmful to their relationship.”

Roberts explained that those with anxious attachment styles (people who are less secure in their relationship) were more bothered, reporting higher levels of cellphone conflict, than those with more secure attachment styles. In addition, lower levels of relationship satisfaction — stemming, in part, from being Pphubbed — led to decreased life satisfaction that, in turn, led to higher levels of depression.

Given the ever-increasing use of smartphones for communication between romantic partners, the study helps explain how smartphone use can affect not only satisfaction with romantic relationships, but also personal well-being, Roberts said.

“When you think about the results, they are astounding,” Roberts said. “Something as common as cellphone use can undermine the bedrock of our happiness: our relationships with our romantic partners.”

The study is published in the journal Computers in Human Behavior.

Source: Baylor University/EurekAlert

CBT Can Ease Heart Failure Patients’ Depression

Tue, 09/29/2015 - 8:30am

A new study reveals that a cognitive-behavioral therapy intervention that targeted both depression and heart failure self-care was partly successful.

Researchers discovered CBT was effective for depression but not for heart failure self-care or physical functioning, as compared to enhanced usual care.

Heart failure occurs when the heart does not pump as well as it should. The condition requires aggressive self-care in the form of medication management, diet, and appropriate levels of activity.

Heart failure is one of the most common reasons for hospitalization, and care for the condition is very expensive. Depression and inadequate self-care heighten the risk of hospitalization and death in patients with the illness.

Self-care includes behaviors that maintain physical functioning and prevent acute exacerbations, such as following a low-sodium diet, exercising, and taking prescribed medications, according to background information in the article.

In the study, published online by JAMA Internal Medicine, Kenneth E. Freedland, Ph.D., of the Washington University School of Medicine, St. Louis, and colleagues randomly assigned 158 outpatients with heart failure and major depression to cognitive-behavioral therapy (CBT) delivered by experienced therapists plus usual care (UC; n = 79) or UC alone (n = 79).

Usual care was enhanced in both groups with a structured heart failure education program delivered by a cardiac nurse. The intervention treatment followed standard CBT manuals and a supplemental manual on CBT for cardiac patients.

The intensive phase of the intervention consisted of up to six months of weekly one-hour sessions. Sessions tapered to biweekly and then monthly between the end of the intensive (weekly) phase and six months post-randomization.

One hundred thirty-two (84 percent) of the participants completed the six-month posttreatment assessments; 60 (76 percent) of the UC and 58 (73 percent) of the CBT participants completed every follow-up assessment.

Six-month depression scores were lower in the CBT group than in the UC group. CBT did not improve heart failure self-care or physical functioning, but it did improve anxiety, fatigue, social functioning, and quality of life, and additional analysis suggested that the intervention might help to decrease the hospitalization rate in clinically depressed patients.

The finding that CBT was effective for depression is important: CBT could be used to manage depression when antidepressant therapy is unsuccessful, or on its own as an alternative to pharmacological management.

“The results suggest that CBT is superior to usual care for depression in patients with HF,” the researchers write.

“Further research is needed on interventions to improve depression, self-care, physical functioning, and quality of life in patients with HF and comorbid major depression.”

Source: The JAMA Network Journals/EurekAlert