Tri-County Services

In The News

Psychology, psychiatry and mental health news and research findings, every weekday.

Survey Finds Depression, Burnout Common Among MDs in US

2 hours 15 min ago

A new Medscape report finds that nearly two-thirds of U.S. physicians feel burned out, depressed or both, with higher rates reported among women and mid-career physicians.

Based on the responses of more than 15,000 practicing physicians from 29 specialties, the Medscape National Report on Physician Burnout and Depression found that 42 percent of physicians are burned out, 15 percent are depressed and 14 percent are both burned out and depressed.

The report defined burnout as feelings of physical, emotional or mental exhaustion, frustration or cynicism about work, and doubts about one’s experience and the value of one’s work.

Breaking down the depression numbers, 12 percent of physicians said they “feel down,” while 3 percent said they experience serious depression. Most of the surveyed physicians say their depression is due to their work. In fact, a separate Medscape survey on Physician Lifestyle and Happiness found that most physicians are happy when they aren’t working.

Regarding burnout, the highest rates were found among family physicians, intensivists, internists, neurologists, and ob-gyns. The lowest rates were among plastic surgeons, dermatologists, pathologists, and ophthalmologists. Burnout rates were higher among women (48 percent, vs. 38 percent for men) and physicians ages 45-50 (50 percent, vs. 35 percent for younger physicians and 41 percent for those ages 55-69).

A majority of physicians (56 percent) said that fewer bureaucratic tasks would help alleviate burnout, and 39 percent cited fewer hours spent working. About one-third said more money and a more manageable work schedule would make a difference.

Studies have shown a significant link between higher levels of physician burnout and lower levels of patient safety and quality of care, and the new report seems to confirm this. For example, one in three depressed physicians said they are more easily exasperated by patients; 32 percent said they were less engaged with patients; and 29 percent admitted to being less friendly.

In addition, nearly 15 percent of depressed physicians admitted that their depression might lead them to make mistakes they wouldn’t normally make, and 5 percent linked their depression to errors they had made that could have harmed a patient.

Even more physicians said that depression negatively affects their relationships with colleagues, with 42 percent reporting exasperation, another 42 percent indicating less engagement, and 37 percent saying they express their frustrations in front of staff or peers.

“The Medscape Report on Physician Burnout and Depression shows that there is still much to be done to support physicians around these issues,” said Leslie Kane, M.A., senior director of Medscape’s Business of Medicine.

“Physicians are still struggling with the impact of burnout. Additionally, depression among physicians is a concern. Experts are beginning to view both conditions as interrelated, with burnout perhaps being a type of depression that physicians feel more comfortable acknowledging.”

Most physicians do not seek professional help for either burnout or depression. To cope, about half of all physicians choose healthy strategies, such as exercise and talking with family or friends. On the other hand, about one-third eat junk food, and one in five drink alcohol or binge eat.

Source: Medscape / DKC

 

Cognitive Training for Some Older Adults May Improve Memory

3 hours 1 min ago

A new Canadian study finds that older adults with mild cognitive impairment were able to significantly improve memory with a specific cognitive training program.

The research appears in the Journal of the American Geriatrics Society.

Mild cognitive impairment (MCI) affects people who may be in the earliest stages of dementia or Alzheimer’s disease. People with MCI may have mild memory loss or other difficulties completing tasks that involve cognitive abilities.

MCI may eventually develop into dementia or Alzheimer’s disease. Depression and anxiety also can accompany MCI, and these conditions can increase the risk of mental decline as people age.

In the study, scientists from research centers in Montreal and Quebec City investigated whether cognitive training, a medication-free treatment, could improve MCI.

Studies show that mentally stimulating activities, such as cognitive training, can protect against a decline in mental abilities. Even older adults who have MCI can still learn and use new mental skills.

In the study, researchers recruited 145 older adults, with an average age of about 72, from Canadian memory clinics. The participants had been diagnosed with MCI and were assigned to one of three groups. Participants met in small groups of four or five for eight weekly 120-minute sessions.

The three groups were:

• Cognitive training group. Members of this group participated in the MEMO program (MEMO stands for a French phrase that translates to “training method for optimal memory”). They received special training to improve their memory and attention span.
• Psychosocial group. Participants in this group were encouraged to improve their general well-being. They learned to focus on the positive aspects of their lives and find ways to increase positive situations.
• Control group. Participants had no contact with researchers and didn’t follow a program.

Of the original 145 participants, 128 completed the project during the training period, and 104 completed all assigned sessions through the six-month follow-up.

People in the MEMO group increased their memory scores by 35 to 40 percent, said Sylvie Belleville, Ph.D., a senior author of the study. “Most importantly, they maintained their scores over a six-month period.”

What’s more, the improvement was largest for “delayed recall,” that is, memory for words measured 10 minutes after people have studied them. Because problems with delayed memory are one of the earliest signs of Alzheimer’s disease, this was a key finding.

Importantly, the instruction instilled new skills that helped participants solidify memory.

Specifically, those who participated in the MEMO group said they applied the training in their daily lives. It gave them different ways to remember things.

For example, they learned to use visual images to remember names of new people, and to use associations to remember shopping lists. These lessons allowed them to continue maintaining their memory improvements after the study ended. The people in the psychosocial group and the control group didn’t experience memory benefits or improvement in their mood.

Source: American Geriatrics Society

Collaboration Can Be Enhanced with Simple Measures

3 hours 45 min ago

New research finds the key to getting people to work together effectively could be as simple as giving them the flexibility to choose their collaborators and the comfort of working with people they know.

However, it’s important to recognize that, from an evolutionary perspective, cooperation between humans makes little sense, said Dr. David Melamed, an assistant professor of sociology at The Ohio State University. Melamed is the lead author of the study, which appears in the journal Proceedings of the National Academy of Sciences.

“From an evolutionary perspective, cooperation shouldn’t exist between people – you always do better by not cooperating because then people can’t rip you off or take advantage of you,” Melamed said.

“Especially in a one-time interaction, it’s essentially paying a cost for someone else to benefit, and researchers have been working for a long time to understand why people evolved to work together.”

In the new study, Melamed and his co-authors sought to discover the connections or environment that help people collaborate most willingly.

To answer their questions, they found participants through the Amazon Mechanical Turk website – a service that allows researchers and others to hire or recruit people from around the world for a variety of purposes. For this study, all participants were from the United States.

Those who agreed to participate played online games in which each player started out with 1,000 monetary units that translated to $1 in real money they could pocket. If one player agreed to pay another player 50 monetary units, that second person would actually receive 100 units.

“So, if you essentially agreed to give up five cents, someone else gained 10 cents,” Melamed said.
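
To make the payoff arithmetic concrete, here is a minimal Python sketch of a single exchange as described above. The Player class, names, and defaults are illustrative assumptions for this article, not the researchers’ actual experiment code:

    # A minimal sketch of the game's payoff arithmetic as described above.
    # The Player class and its names are illustrative, not the researchers'
    # actual experiment code.
    class Player:
        def __init__(self, name, units=1000):   # each player starts with 1,000 units ($1.00)
            self.name = name
            self.units = units

        def give(self, other, amount=50):
            # Giving costs the donor `amount` units; the recipient gains double.
            self.units -= amount
            other.units += 2 * amount

    a, b = Player("A"), Player("B")
    a.give(b)                    # A gives up 50 units (5 cents) ...
    print(a.units, b.units)      # ... B gains 100 units (10 cents): prints 950 1100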

Each of the 16-round games examined in the study included about 25 participants, some of whom participated in multiple games with different scenarios. In all, 810 people participated in the research.

Some of the games used randomly generated networks, which determined which players could interact. Others used clustered networks, in which small groups had multiple connections among themselves. This second setting was designed to mimic real life, where humans often run in packs socially and at work.

And the networks were either static or dynamic. In static networks, a player could interact only with the assigned partners for the duration. In the dynamic networks, participants could cut ties with another player and form new connections.

Furthermore, some of the games included reputation information. Participants were labeled based on their history of willingness to share money. The idea was to test whether those known to collaborate were favored by other players based on reputation – a factor shown in previous research to play a significant role in whether a person is likely to partner with another.

Melamed and his research partners were surprised to find that reputation played no role in collaboration in this study. The findings might have departed from prior studies because of the difference in size and study design, he said, explaining that much of the previous work in this area has been conducted in groups of 100 or fewer and mostly involved student subjects. The Turk network used for the new study has been shown to be representative of the U.S. population in terms of age, race and other factors, Melamed said, and introduced players who had no previous connections.

Collaboration rates overall were high – and highest when the participants were operating in clusters and had the ability to drop a partner in favor of another.

“What really seems to matter is the ability to alter the structure of a network,” Melamed said. “And the pattern of relationships also made a difference. Those in a known cluster with multiple connections collaborated more, which seems intuitive if you think about how we interact in the real world.”

The findings from this study could have important implications in a variety of settings, including the workplace and the battlefield, Melamed said.

“Applying what we learned could help encourage cooperation,” he said.

The U.S. Army, which supported the study, could use this type of information to better develop strong, cooperative teams in the field, Melamed said, adding that the armed forces could also use the science to seek ways to undermine enemy forces.

Source: Ohio State University

Study Finds Violent Video Games Do Not Make Adults More Violent

4 hours 30 min ago

U.K. researchers say they have found no evidence to support the theory that video games make players more violent (at least among adults).

This topic has been debated for over two decades as the dominant model of learning in games is built on the idea that exposing players to concepts, such as violence in a game, makes those concepts easier to use in real life.

Although this learning-by-exposure concept, known as “priming,” is thought to lead to changes in behavior, previous studies have provided mixed conclusions.

For the new study, published in the journal Computers in Human Behavior, University of York investigators performed a series of experiments with more than 3,000 participants. Their findings suggest that video game concepts do not prime players to behave in certain ways and that increasing the realism of violent video games does not necessarily increase aggression in game players.

In the new investigation, researchers expanded the number of participants well beyond the sample sizes of previous studies. They then compared different types of gaming realism to explore whether more conclusive evidence could be found.

In one study, participants played a game where they had to either be a car avoiding collisions with trucks or a mouse avoiding being caught by a cat. Following the game, the players were shown various images, such as a bus or a dog, and asked to label them as either a vehicle or an animal.

Dr. David Zendle, from the University’s Department of Computer Science, said, “If players are ‘primed’ through immersing themselves in the concepts of the game, they should be able to categorize the objects associated with this game more quickly in the real world once the game had concluded.

“Across the two games we didn’t find this to be the case. Participants who played a car-themed game were no quicker at categorizing vehicle images, and indeed in some cases their reaction time was significantly slower.”

In a separate but connected study published in the journal Entertainment Computing, the team investigated whether realism influenced the aggression of game players. Research in the past has suggested that the greater the realism of the game the more primed players are by violent concepts, leading to antisocial effects in the real world.

Dr. Zendle said: “There are several experiments looking at graphic realism in video games, but they have returned mixed results. There are, however, other ways that violent games can be realistic, besides looking like the ‘real world’, such as the way characters behave, for example.

“Our experiment looked at the use of ‘ragdoll physics’ in game design, which creates characters that move and react in the same way that they would in real life. Human characters are modeled on the movement of the human skeleton and how that skeleton would fall if it was injured.”

In this case, the experiment compared player reactions to two combat games — one that used ‘ragdoll physics’ to create realistic character behavior and one that did not — in an animated world that nevertheless looked real.

Following the game, the players were asked to complete word puzzles called ‘word fragment completion tasks’, where researchers expected more violent word associations would be chosen for those who played the game that employed more realistic behaviors.

They compared the results of this experiment with another test of game realism, where a single bespoke war game was modified to form two different games. In one of these games, enemy characters used realistic soldier behaviors, whilst in the other game they did not employ realistic soldier behavior.

Said Zendle, “We found that the priming of violent concepts, as measured by how many violent concepts appeared in the word fragment completion task, was not detectable.

“There was no difference in priming between the game that employed ‘ragdoll physics’ and the game that didn’t, as well as no significant difference between the games that used ‘real’ and ‘unreal’ soldier tactics.

“The findings suggest that there is no link between these kinds of realism in games and the kind of effects that video games are commonly thought to have on their players.”

Zendle explains that a follow-up study is now needed into other aspects of realism to see if this has the same result. “What happens when we consider the realism of by-standing characters in the game, for example, and the inclusion of extreme content, such as torture?”

Moreover, the theories were only tested on adults, so more work is needed to understand whether a different effect is evident in children, Zendle said.

Source: University of York/ScienceDirect

Bilingualism May Aid Autistic Children

Wed, 01/17/2018 - 7:45am

New research suggests that being bilingual may help children with autism spectrum disorder (ASD) shift more easily between tasks, a skill that is often difficult for kids with autism.

Canadian researchers said the finding adds to emerging, though still debated, research suggesting that bilingualism may offer cognitive advantages.

“This is a novel and surprising finding,” said Prof. Aparna Nadig, the senior author of the paper, from the School of Communication Sciences and Disorders at McGill University. The study appears in the journal Child Development.

“Over the past 15 years there has been a significant debate in the field about whether there is a ‘bilingual advantage’ in terms of executive functions. Some researchers have argued convincingly that living as a bilingual person and having to switch languages unconsciously to respond to the linguistic context in which the communication is taking place increases cognitive flexibility.

“But no one has yet published research that clearly demonstrates that this advantage may also extend to children on the autism spectrum. And so it’s very exciting to find that it does.”

The researchers arrived at this conclusion after comparing how easily 40 children between the ages of six and nine, either monolingual or bilingual and with or without ASD, were able to shift tasks in a computer-generated test. There were 10 children in each of the four categories.

In the study, the children were initially asked to sort a single object appearing on a computer screen by color (i.e., sort blue rabbits and red boats as being either red or blue) and were then asked to switch and sort the same objects instead by their shape (i.e., sort blue rabbits and red boats by shape regardless of their color).

The researchers found that bilingual children with ASD performed significantly better on the more complex part of the task-shifting test relative to children with ASD who were monolingual.

Investigators believe this finding has potentially far-reaching implications for the families of children with ASD.

“It is critical to have more sound evidence for families to use when making important educational and child-rearing decisions, since they are often advised that exposing a child with ASD to more than one language will just worsen their language difficulties,” said Ana Maria Gonzalez-Barrero, the paper’s first author.

“But there are an increasing number of families with children with ASD for whom using two or more languages is a common and valued practice and, as we know, in bilingual societies such as ours in Montreal, speaking only one language can be a significant obstacle in adulthood for employment, educational, and community opportunities.”

Despite the small sample size, the researchers believe that the “bilingual advantage” they saw in children with ASD has highly significant implications and should be studied further.

They plan to follow the children with ASD that they tested in this study over the next three to five years to see how they develop. The researchers want to see whether the bilingual advantage they observed in the lab may also be observed in daily life as the children age.

Source: McGill University

TV Ads May Induce Kids to Binge on Junk Food

Wed, 01/17/2018 - 7:00am

Researchers have discovered that teenagers who watch more than three hours of commercial TV a day are more likely to eat hundreds of extra junk food snacks.

U.K. investigators found that incessant TV ads for unhealthy, high-calorie food could lead teens to consume more than 500 extra snacks and drinks, such as crisps, biscuits, and fizzy drinks, over the course of a single year compared to those who watch less TV.

Sugary energy drinks and other carbonated drinks, fast food, and chips were among the items more likely to be consumed by teens who watched a lot of TV with adverts.

The report, based on a YouGov survey, questioned 3,348 young people in the U.K. between the ages of 11 and 19 about their TV viewing habits and diet.

When teens watched TV without commercials, researchers found no link between screen time and the likelihood of eating more junk food. This suggests that the advertisements on commercial TV may be driving youngsters to snack on more unhealthy food.

The report is also the biggest U.K. study yet to assess the association between TV streaming and diet.

Researchers found that teens who said they regularly streamed TV shows with ads were more than twice as likely (139 percent more likely) to drink fizzy drinks than those with low advert exposure from streaming TV, and 65 percent more likely to eat ready meals than those who streamed less TV.

Regularly eating high calorie food and drink — which usually has higher levels of fat and sugar — increases the risk of becoming overweight or obese.

Although not commonly recognized, obesity is the second biggest preventable cause of cancer after smoking, and is linked to 13 types of cancer including bowel, breast, and pancreatic.

Dr. Jyotsna Vohra, a lead author on the study from Cancer Research UK, explains, “This is the strongest evidence yet that junk food adverts could increase how much teens choose to eat. We’re not claiming that every teenager who watches commercial TV will gorge on junk food, but this research suggests there is a strong association between advertisements and eating habits.”

Vohra believes government regulation should reduce or prevent junk food commercials from being shown during programs that are popular with young people, such as talent shows and football matches.

“Our report suggests that reducing junk food TV marketing could help to halt the obesity crisis.”

The Obesity Health Alliance recently published a report which found that almost 60 percent of food and drink adverts shown during programs popular with adults and four- to 16-year-olds were for unhealthy foods that would be banned from children’s TV channels.

Professor Linda Bauld, Cancer Research UK’s prevention expert, said, “Obese children are five times more likely to remain obese as adults, which can increase their risk of cancer later in life.”

Source: Cancer Research UK

Wealth May Drive Desire for Short-Term Relationships

Wed, 01/17/2018 - 6:15am

A new U.K. study finds that after being exposed to the prospect of wealth, many people tend to prefer more short-term relationships than they did previously. The researchers suggest that a resource-rich environment may — at least in part — help reduce the fear of raising a child alone.

For the study, researchers at Swansea University analyzed the relationship preferences of 151 heterosexual male and female volunteers (75 men and 76 women) by asking them to look at photos of 50 potential partners and to indicate whether they would prefer a long or short-term relationship with each one.

Next, the participants looked at several images of luxury items commonly related to financial wealth, such as fast cars, jewelry, mansions, and money.

Finally, the participants looked at the same images of potential partners and sorted them by their preferred relationship type again. The findings show that, after seeing the images of wealth, both male and female participants chose more partners for short-term relationships than they had initially, an increase of about 16 percent.

“Not all people prefer long-term committed relationships,” says Dr. Andrew G. Thomas, who led the study. “Evolutionary psychologists believe that whether someone prefers a short-term relationship over a long-term one depends partly on their circumstances, such as how difficult it might be to raise children as a single parent.”

“Importantly, when those circumstances change, we expect people to change their preferences accordingly,” Thomas said. “What we have done with our research is demonstrate this change in behavior, for the first time, within an experimental setting. After participants were given cues that the environment had lots of resources, they became more likely to select individuals for a short-term relationship.”

The researchers hypothesize that this occurs because humans have evolved the ability to read the environment, and in turn, adjust the types of relationships they prefer.

“For example, in environments which have lots of resources, it would have been easier for ancestral mothers to raise children without the father’s help. This made short-term mating a viable option for both sexes during times of resource abundance. We believe modern humans also make these decisions,” said Thomas.

Importantly, the researchers also found that participants changed their relationship preferences after being shown photos of dangerous animals and videos of people interacting with babies.

“We also found that other types of cues had an effect. When the participants were given cues that the environment contained young children, they were more likely to select individuals for a long-term relationship,” said Thomas.

“Dangerous environments seemed to cause both men and women to choose more long-term partners, though some women chose more short-term partners instead.”

The study is published in the journal Evolution and Human Behaviour. It was conducted in collaboration with Dr. Steve Stewart-Williams from the University of Nottingham.

Source: Swansea University

Does the Internet Change Beliefs About Religious Affiliations?

Wed, 01/17/2018 - 5:30am

New research finds that the digital environment, specifically the Internet, may decrease the likelihood of a person affiliating with a religious tradition or believing that only one religion is true.

The Baylor University study suggests Internet use encourages religious “tinkering.”

“Tinkering means that people feel they’re no longer beholden to institutions or religious dogma,” explains Baylor sociologist and researcher Dr. Paul K. McClure.

“Today, perhaps in part because many of us spend so much time online, we’re more likely to understand our religious participation as free agents who can tinker with a plurality of religious ideas — even different, conflicting religions — before we decide how we want to live.”

For example, while many Millennials have been influenced by their Baby Boomer parents when it comes to religion, the Internet exposes them to a broader array of religious traditions and beliefs and may encourage them to adjust their views or experiment with their beliefs, perhaps adopting a less exclusive view of religion, McClure said.

His study — “Tinkering with Technology and Religion in the Digital Age” — appears in the Journal for the Scientific Study of Religion.

The study also found that television viewing was linked to religion, but in a different way: more viewing was associated with lower religious attendance and less participation in other religious activities that take time.

However, McClure noted that lower religious attendance of TV viewers may be because some are ill, injured, immobile, or older and incapable of taking part, and some may simply watch television to pass the time.

In 2010, when this survey was first conducted, people were spending more time on average watching television, but that has changed today as more people are spending time online or on their smartphones instead, McClure said.

“Both TV and the Internet require time, and the more time we spend using these technologies, the less time we have to participate in religious activities or with more traditional communities,” he said.

In his research, McClure analyzed data from Wave III of the Baylor Religion Survey, a survey of 1,714 adults nationwide ages 18 and older. The Gallup Organization administered the survey, which included a variety of questions, in fall 2010.

In the data analyzed by McClure, participants were asked:

  • How often they took part in religious activities, among them religious attendance, church socials, religious education programs, choir practice, Bible study, prayer groups, and witnessing/sharing faith.
  • How much they agreed on a scale of one to four with the statements “All of the religions in the world are equally true” and “All around the world, no matter what religion they call themselves, people worship the same God.”
  • How many hours a day they spent surfing the Internet and how many hours they spent watching TV.
  • What religious group(s) they were affiliated with, including a category of “none.”

The analysis also took into account such variables as age, race, gender, education, place of residence, and political party. Those factors had varying impacts on religious beliefs, but despite the differences, “the more time one spends on the Internet, the greater the odds are that that person will not be affiliated with a religion,” McClure said.

Although the Internet is only about 26 years old, 87 percent of American adults now use it, up from fewer than 15 percent before 1995, according to a 2014 report by the Pew Forum Internet Project.

Sociologists debate how Internet use affects people.

“Some see it as a tool to improve our lives; others see it as a new kind of sociocultural reality,” McClure said.

Scholars point out that the Internet may corral people into like-minded groups, similar to how Google customizes search results and advertisements based on prior search history.

Additionally, many congregations (some 90 percent, according to previous research) use email and websites for outreach, and more than a third also maintain a Facebook presence.

Other scholars have found that when people choose ways to communicate, some often choose a less intimate way — such as texting rather than talking.

McClure noted that sociological research about the impact of the Internet is difficult for scholars because its swift changes make it a moving target.

“In the past decade, social networking sites have mushroomed, chat rooms have waned, and television and web browsing have begun to merge into one another as live streaming services have become more popular,” McClure said.

McClure admits his study has limitations as he only measured the amount of time people spent on the Internet, not what they were doing online. But the research may benefit scholars seeking to understand how technologies shape religious views.

“Whether through social media or the sheer proliferation of competing truth-claims online, the Internet is the perfect breeding ground for new ‘life-worlds’ that chip away at one’s certainty,” McClure said.

Source: Baylor University/EurekAlert

Severe Bullying in Childhood May Hike Teen Mental Health Risk

Tue, 01/16/2018 - 7:00am

New research finds that teens who were severely bullied by peers during childhood are at higher risk of mental health issues, including suicidal thoughts and behaviors.

Investigators discovered that the severity of the victimization is the primary factor in the development of subsequent mental health issues.

Canadian investigators reviewed data from the Quebec Longitudinal Study of Child Development on 1,363 children born in 1997/98 who were followed until age 15. Researchers assessed the children, based on self-reports of peer victimization, at ages six, seven, eight, 10, 12, and 13.

Study participants came from a range of socioeconomic backgrounds and family structures, with slightly more females (53 percent) than males. They were categorized into none/low victimization, moderate victimization, and severe victimization groups.

The study appears in the Canadian Medical Association Journal (CMAJ).

“Our findings showed a general tendency, in about 15 percent of the children, of being exposed to the most severe levels of victimization from the beginning of their education until the transition to high school,” writes Dr. Marie-Claude Geoffroy, McGill University, Montréal, Quebec, with coauthors.

“Those children were at greater risk of debilitating depressive/dysthymic symptoms or anxiety and of suicidality in adolescence than less severely victimized children, even after we accounted for a plethora of confounders assessed throughout childhood.”

Children who experienced severe peer victimization were more than twice as likely to report depression or low moods at age 15 compared with those who experienced low or no victimization, and three times more likely to report anxiety.

Most troubling, the severe victimization group was almost 3.5 times more likely to report serious suicidal thoughts or suicide attempts compared with the none/low group.

Children who experienced moderate victimization were not at increased risk of reporting mental health problems.

About 59 percent of participants had experienced some peer victimization in the first years of elementary school, although it generally declined as the children grew older.

“Although peer victimization starts to decrease by the end of childhood, individuals in the severe trajectory group were still being exposed to the highest level of victimization in early adolescence,” write the authors.

“Our results, along with those of many other studies, suggest that severe peer victimization may contribute to the development of mental health problems in adolescence. Therefore, it is important to prevent severe victimization early in the lifespan,” they say.

They urge starting antibullying initiatives before children enter school.

Source: Canadian Medical Association Journal

Bipolar Subtypes May Have Distinct Origins

Tue, 01/16/2018 - 6:15am

A new Swedish study may help settle the controversy over the relationship between bipolar I and bipolar II disorders. Despite genetic overlap between these two bipolar subtypes, the findings show that each type tends to cluster within families, suggesting that they are distinct disorders with different biological origins.

The study, published in the journal Biological Psychiatry, also found unique differences between the two conditions. For example, although bipolar I tends to show up among males and females somewhat equally, bipolar II is more prominent among females. In addition, bipolar I tends to cluster in families with schizophrenia, which is not the case for bipolar disorder II.

“Hopefully, our findings increase awareness of the need for refined distinctions between subtypes of mood disorder,” said study leader Dr. Jie Song of the department of clinical neuroscience at Karolinska Institutet in Sweden.

According to Song, the findings go against the common notion among many clinicians that bipolar II is merely a milder form of bipolar I. The proposed distinction between the subtypes has significant implications for patient treatment strategies.

“We have tended to view the two forms of bipolar disorder as variants of the same clinical condition. However, this new study highlights important differences in the heritable risk for these two disorders,” said Dr. John Krystal, editor of Biological Psychiatry.

The research is the first nationwide family study to investigate the differences between the two main subtypes of bipolar disorder. The researchers looked at the occurrence of each subtype among families from the Swedish national registers.

Although there is a strong genetic relationship between bipolar I and bipolar II disorders, the new findings suggest the two conditions are not simply variants of a single disorder. Specifically, the family occurrence of each subtype was found to be stronger than co-occurrence between the subtypes, indicating that bipolar I and bipolar II disorders tend to run in families separately, rather than occurring together.

“Within the context of our emerging appreciation of polygenic risk, where gene variations are implicated in several disorders, the new findings point to only partial overlap in the risk mechanisms for these two forms of bipolar disorder,” said Krystal.

The research also offers some additional clues that bipolar I and II disorders have distinct origins. Only bipolar II disorder showed gender differences: the proportion of females to males was higher in bipolar II but not in bipolar I. In addition, bipolar I clustered together in families with schizophrenia, which was not apparent for bipolar II disorder.

Song says that future research is needed to characterize new biomarkers to help improve treatment and prognosis.

Source: Elsevier

Review of Genetic Biomarkers Improves Depression Therapy

Tue, 01/16/2018 - 5:30am

Unfortunately, prescribing an effective antidepressant medication is often a hit-or-miss proposition, as an individual’s genetic make-up often determines whether or not the medication works.

New research suggests checking a person’s genetic profile before prescribing an anti-depressant medication can help with the selection of the most effective medication.

The study finds that the current generation of antidepressant medications, selective serotonin reuptake inhibitors (SSRIs), often fail because of genetic variations within the gene that encodes the CYP2C19 enzyme.

Researchers assessed the effectiveness of the SSRI escitalopram (Lexapro) and discovered that variants in this gene result in extreme differences in the levels of escitalopram in a person’s blood, often limiting the drug’s effectiveness.

Accordingly, prescribing the dose of escitalopram based on a patient’s specific genetic constitution would greatly improve therapeutic outcomes. The study, conducted at Karolinska Institutet in Sweden in association with researchers at Diakonhjemmet Hospital in Oslo, Norway, appears in the American Journal of Psychiatry.

Pharmaceutical treatment of depression commonly makes use of selective serotonin reuptake inhibitors (SSRIs), of which escitalopram is the most frequently administered clinically.

However, escitalopram therapy is currently limited by the fact that some patients do not respond well to the drug, while others develop adverse reactions requiring discontinuation of treatment.

In order to individualize drug therapy, researchers are attempting to establish genetic biomarkers that can predict an individual’s response to drugs.

In the study, the researchers discovered that variation in the gene encoding the enzyme responsible for escitalopram metabolism (CYP2C19) is very important in this respect.

Individuals with a variant of the gene promoting increased enzyme expression had blood levels of escitalopram too low to impact the depression symptoms, whereas patients with a defective CYP2C19 gene reached drug levels which were too high.

Overall, one third of the 2,087 study participants achieved escitalopram blood levels that were either too high or too low.

Interestingly, the researchers found that 30 percent of the patients carrying gene variants causing excessive or inadequate enzyme levels switched to other drugs within one year, in contrast with only 10 to 12 percent of patients carrying the common gene.

“Our study shows that genotyping of CYP2C19 could be of considerable clinical value in individualizing doses of escitalopram so that a better all-round antidepressive effect could be achieved for the patients,” says Professor Magnus Ingelman-Sundberg who led the study together with Professor Espen Molden.

“Because CYP2C19 is involved in the metabolism of many different SSRIs, the finding is also applicable to other types of antidepressants.”

Source: Karolinska Institutet

Disruption of Neural Networks May Impact Fibromyalgia Pain

Mon, 01/15/2018 - 7:45am

New research finds that hyperreactive brain networks could play a part in the hypersensitivity of fibromyalgia.

The study suggests the human nervous system shares characteristics with an electric power grid, in which a small disruption in one area of the network can cause the entire network to go awry.

Investigators from the University of Michigan and Pohang University of Science and Technology in South Korea discovered that patients with fibromyalgia have brain networks primed for rapid, global responses to minor changes.

This abnormal hypersensitivity, called explosive synchronization (ES), has been observed in other networks across nature. The discovery of ES in the brains of people with fibromyalgia helps to explain why widespread, chronic pain is often experienced.

The paper, published in Scientific Reports, is only the second study to detail ES in human brain data.

“For the first time, this research shows that the hypersensitivity experienced by chronic pain patients may result from hypersensitive brain networks,” said co-senior author Richard Harris, Ph.D., associate professor of anesthesiology at Michigan Medicine.

“The subjects had conditions similar to other networks that undergo explosive synchronization.”

In ES, a small stimulus can lead to a dramatic synchronized reaction in the network, as can happen with a power grid failure (that rapidly turns things off) or a seizure (that rapidly turns things on).

This phenomenon was, until recently, studied in physics rather than medicine. Researchers say it’s a promising avenue to explore in the continued quest to determine how a person develops fibromyalgia.

“As opposed to the normal process of gradually linking up different centers in the brain after a stimulus, chronic pain patients have conditions that predispose them to linking up in an abrupt, explosive manner,” says first author UnCheol Lee, Ph.D., a physicist and assistant professor of anesthesiology at Michigan Medicine. These conditions are similar to other networks that undergo ES, including power grids, Lee says.
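
To give a sense of what such a model looks like, here is a minimal, self-contained Python sketch of explosive synchronization in a Kuramoto oscillator network, a standard toy model from the physics literature. Everything in it (the network type, the parameters, and the degree-frequency rule) is an illustrative assumption, not the authors’ clinical model or patient data:

    # A toy Kuramoto network with degree-correlated frequencies, an ingredient
    # known from the physics literature to make synchronization "explosive."
    # All parameter values below are illustrative assumptions, not from the paper.
    import numpy as np
    import networkx as nx

    N = 200
    G = nx.barabasi_albert_graph(N, 3, seed=1)   # hub-rich (scale-free-like) network
    A = nx.to_numpy_array(G)                     # adjacency matrix
    k = A.sum(axis=1)                            # node degrees
    omega = k / k.mean()                         # natural frequency tied to degree

    def order_parameter(theta):
        # Global synchrony r in [0, 1]: 0 = incoherent, 1 = fully locked.
        return np.abs(np.exp(1j * theta).mean())

    def relax(theta, lam, dt=0.02, steps=3000):
        # Euler integration of d(theta_i)/dt = omega_i + lam * sum_j A_ij sin(theta_j - theta_i)
        for _ in range(steps):
            diff = theta[None, :] - theta[:, None]   # diff[i, j] = theta_j - theta_i
            theta = theta + dt * (omega + lam * (A * np.sin(diff)).sum(axis=1))
        return theta

    theta = 2 * np.pi * np.random.default_rng(0).random(N)
    for lam in np.linspace(0.0, 0.5, 26):        # slowly ramp up the coupling
        theta = relax(theta, lam)                # reuse final state (adiabatic sweep)
        print(f"coupling {lam:.2f} -> synchrony r = {order_parameter(theta):.2f}")

Sweeping the coupling strength upward, the synchrony measure r stays low and then jumps abruptly rather than growing smoothly; that all-or-nothing jump is the signature of ES the researchers describe. The exact transition point depends on the network and parameters chosen.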

The researchers recorded electrical activity in the brains of 10 female participants with fibromyalgia. Baseline EEG results showed hypersensitive and unstable brain networks, Harris says.

Importantly, there was a strong correlation between the degree of ES conditions and the intensity of chronic pain the patients reported at the time of EEG testing.

Lee’s research team and collaborators in South Korea then used computer models of brain activity to compare stimulus responses of fibromyalgia patients to the normal condition.

As expected, the fibromyalgia model was more sensitive to electrical stimulation than the model without ES characteristics, Harris said.

“We again see the chronic pain brain is electrically unstable and sensitive,” he said.

Harris noted this type of modeling could help guide future treatments for fibromyalgia. Since ES can be modeled outside of the brain, in a computer, researchers can exhaustively test for influential regions that transform a hypersensitive network into a more stable one. These regions could then be targeted in living humans using noninvasive brain modulation therapies.

George Mashour, M.D., Ph.D., co-senior author and professor of anesthesiology at Michigan Medicine, said, “This study represents an exciting collaboration of physicists, neuroscientists and anesthesiologists. The network-based approach, which can combine individual patient brain data and computer simulation, heralds the possibility of a personalized approach to chronic pain treatment.”

Source: University of Michigan

Early Symptom Patterns May Help ID Young People at Risk of Bipolar Disorder

Mon, 01/15/2018 - 7:00am

New research suggests that two patterns of early symptoms appear to precede and predict bipolar disorder (BD), and may help to identify young persons at increased risk of developing the illness.

One pattern of early BD consists mainly of symptoms and features associated with mood disorders, termed a characteristic “homotypic” pattern. The other, a “heterotypic” pattern, includes different symptoms, such as anxiety and disruptive behavior. Environmental risk factors and exposures can also contribute to BD risk.

The authors reviewed and analyzed data from 39 studies of early symptoms and risk factors for later development of BD. Their analysis focused on high-quality evidence from prospective studies in which data on early symptoms and risk factors were gathered before BD was diagnosed.

BD is commonly preceded by depression or other symptoms of mental illness, sometimes years before BD develops; the disorder’s onset is typically marked by the appearance of mania or hypomania.

Nevertheless, the authors note that “the prodromal (early) phase of BD remains incompletely characterized, limiting early detection of BD and delaying interventions that might limit future morbidity.”

The evidence reviewed suggested two patterns of early symptoms that “precede and predict” later BD. A homotypic pattern consisted of affective or mood-associated symptoms that are related to, but fall short of, standard diagnostic criteria for BD.

These symptoms may include mood swings, relatively mild symptoms of excitement, or major depression, sometimes severe and with psychotic symptoms.

The authors note that homotypic symptoms have “low sensitivity”; that is, many young people who later develop BD never show these mood symptoms beforehand.

However, this symptom pattern also had “moderate to high specificity”; these symptoms were comparatively rare among young people who did not go on to develop BD.

The heterotypic pattern consisted of other types of prodromal or potential early symptoms, such as early anxiety and disorders of attention or behavior.

This pattern had low sensitivity and specificity: relatively few patients with such symptoms develop BD, while many young people without heterotypic symptoms do develop BD.

The study findings also associate several other factors with an increased risk of developing BD, including preterm birth, head injury, drug exposures (especially cocaine), physical or sexual abuse, and other forms of stress. However, for most of these risk factors, both sensitivity and specificity are low.

Although many elements of the reported patterns of prodromal symptoms and risk factors have been identified previously, the study increases confidence that they are related to the later occurrence of BD.

The researchers note that the findings of high-quality data from prospective studies are “encouragingly similar” to those of previous retrospective and family-risk studies.

“There was evidence of a wide range of [psychiatric] symptoms, behavioral changes, and exposures with statistically significant associations with later diagnoses of BD,” the authors concluded.

With further study, the patterns of prodromal symptoms and risk factors may lead to new approaches to identifying young persons who are likely to develop BD, and might benefit from early treatment. The investigators add that predictive value might be even higher with combinations of multiple risk factors, rather than single predictors.

The analysis appears in the Harvard Review of Psychiatry. The research team was led by Ciro Marangoni, M.D., at the Department of Mental Health, Mater Salutis Hospital, Legnago, Italy; Gianni L. Faedda, M.D., Director of the Mood Disorder Center of New York, N.Y.; and Professor Ross J. Baldessarini, M.D., Director of the International Consortium for Bipolar & Psychotic Disorders Research at McLean Hospital in Belmont, Mass.

Source: Wolters Kluwer/EurekAlert

Gene Variants May Predict Response to Antidepressant

Mon, 01/15/2018 - 6:15am

In order to individualize drug therapy, researchers have been investigating potential genetic biomarkers that might help predict an individual’s response to medications. Now a new study finds that certain genetic variations may help determine whether selective serotonin reuptake inhibitors (SSRIs) will be effective in people with depression.

The findings, published in The American Journal of Psychiatry, show that variations within the gene responsible for the metabolism of escitalopram (the SSRI Lexapro) can result in extreme differences in the levels of the drug achieved in patients, often either too low or too high. Therefore, prescribing the dose of escitalopram based on a patient’s specific genetic constitution would greatly improve therapeutic outcomes in these cases.

The study, which involved 2,087 patients, was conducted at Karolinska Institutet in Sweden in association with researchers at Diakonhjemmet Hospital in Oslo, Norway.

SSRIs are among the most commonly prescribed pharmaceutical treatments for depression, with escitalopram being the most frequently administered clinically. However, escitalopram therapy is currently limited by the fact that many patients do not respond well to the drug. In fact, some people develop adverse reactions requiring discontinuation of treatment.

During the study, researchers discovered that variation in the gene encoding the enzyme responsible for escitalopram metabolism (CYP2C19) plays a very important role. Patients with a variant of the gene promoting increased enzyme expression had blood levels of escitalopram too low to impact the depression symptoms, while patients with a defective CYP2C19 gene reached drug levels that were too high.

Overall, one-third of the study participants achieved escitalopram blood levels that were either too high or too low.

Importantly, the researchers found that 30 percent of the patients carrying gene variants causing excessive or inadequate enzyme levels switched to other drugs within one year, in contrast with only 10 to 12 percent of patients carrying the common gene.

“Our study shows that genotyping of CYP2C19 could be of considerable clinical value in individualizing doses of escitalopram so that a better all-round antidepressive effect could be achieved for the patients,” said Professor Magnus Ingelman-Sundberg at Karolinska Institutet’s Department of Physiology and Pharmacology, who led the study together with Professor Espen Molden.

“Because CYP2C19 is involved in the metabolism of many different SSRIs, the finding is also applicable to other types of antidepressants.”

Major depression is among the most common and severe health problems in the world. At least eight to 10 percent of the U.S. population suffers from major depression at any given time. It is characterized by a persistently depressed mood and loss of interest in activities.

Source: Karolinska Institutet

Stigma of Mental Illness Linked to Mix of Beliefs About Causes

Mon, 01/15/2018 - 5:30am

A new study finds that campaigns to treat mental illness as a disease and remove its stigma may fall short because people also tend to believe that other factors, such as bad character, play a role, muddying the picture.

Baylor University researchers focused on stigma toward individuals suffering from depression, schizophrenia, and alcoholism.

“Individuals who endorse biological beliefs that mental illness is ‘a disease like any other’ also tend to endorse other, non-biological beliefs, making the overall effect of biological beliefs quite convoluted and sometimes negative,” said lead author Matthew A. Andersson, Ph.D.

The study is published in the American Sociological Association’s journal Society and Mental Health.

Findings suggest that beliefs about causes of mental illness could be addressed in public campaigns and by policymakers in different and more beneficial ways than they are now, according to Andersson and co-author Sarah K. Harkness, Ph.D., assistant professor of sociology at the University of Iowa.

Although many in the mental health community — including the U.S. Department of Health and Human Services — see the shift in views toward genetic or chemical causes as encouraging, mental illness unfortunately still draws negative social reactions, researchers said.

That reaction often is measured by how much people want to keep their distance from those dealing with mental illness, or how strongly they view them as potentially dangerous.

The study analyzed data from the 2006 General Social Survey, which presented a random sample of 1,147 respondents with hypothetical vignettes involving individuals suffering from symptoms of depression, schizophrenia, or alcoholism.

Respondents then completed six items from the General Social Survey about how likely they thought it was that certain factors had caused the mental health problem. Those factors included:

  • Bad character
  • A chemical imbalance in the brain
  • The way he or she was raised
  • Stressful circumstances in his or her life
  • A genetic or inherited problem
  • God’s will

Researchers then measured stigma by asking respondents how willing they would be to have a person like the one in the vignette (1) move next door; (2) start working closely with them on a job; (3) marry into their family; (4) spend an evening socializing with them; (5) become their friend; or (6) move into a newly established group home in their neighborhoods for people in that condition.

“There’s a debate about whether biological beliefs in genetic causation or chemical causation lower stigma as long as we aren’t blaming bad character, too,” Andersson said. “That’s an unknown and part of the reason for this study. For all three illnesses examined here, how important is it to look at how multiple beliefs about the nature of illness combine to produce stigma? That’s what we were trying to figure out.”

What the study found was that the most common combination of viewpoints about both depression and schizophrenia was that they are caused by chemical imbalance, stressful life circumstances, and genetic abnormality. Not included as root causes were bad character, upbringing, or religious or divine causes, the authors said.

That combination of opinions was held by about 23 percent of respondents who considered the scenario about a depressed individual, and by 25 percent of those who were presented with the scenario about an individual with schizophrenia, the researchers said.

In contrast, among respondents who were presented with the scenario about an alcoholic, the most common combination of beliefs about causes included bad character, chemical imbalance, the way one was raised, stress, and genetic abnormality. That combination — held by 27 percent of respondents — attributes alcoholism to all causes except for religious or divine forces.

“One specific piece of advice is clear for combatting stigma toward depression or alcoholism: Bad character or personal weakness needs to be absolved explicitly for biological explanations to reduce stigma effectively,” Andersson said. “But for schizophrenia, the role of an individual’s character in stigmatization is far less clear, likely because of the relative severity and rarity of the illness.”

The study adds to the knowledge of how subtle but widely held theories about mental health may contribute to stigmatizing the mentally ill, Andersson said.

“Re-working anti-stigma policy initiatives around the belief patterns we linked to lowered stigma may help increase the social acceptance of people who suffer from these illnesses,” he said.

While researchers focused on the six mental illness attributions used in the General Social Survey, future research delving into other, more specific beliefs about causes — such as marital or family troubles, work stressors, various brain dysfunctions, or specific negative life events — could prove valuable, Andersson said.

Source: Baylor University

Gambling Teens Exhibit Low Academic Performance

Sun, 01/14/2018 - 8:45am

Young teens who gamble are at greater risk of struggling in school, according to a new Canadian study published in Springer’s Journal of Gambling Studies.

The study was led by Frank Vitaro of the University of Montreal, Sainte-Justine Hospital Research Center and the Research Unit on Children’s Psychosocial Maladjustment in Canada.

The long-term population-based study involved 766 Canadian teens who were assessed at ages 14 and 17 through self-reports and through responses from their parents, who answered questions about the teens’ gambling habits and academic performance.

The researchers chose to focus on how many different types of gambling activities the teens participated in, rather than how often they gambled. This is because more diverse gambling habits have been found to better predict whether a person will develop gambling problems.

Data on the social status and structure of the families in which the teens were raised were also gathered from their parents. This took into account the level of education the children’s parents had reached and the jobs they held.

A significant, albeit modest, correlation was found between gambling at ages 14 and 17 and subsequent academic performance. Teens who were already gambling regularly by the time they were 14 years old saw the greatest drop in academic performance in the years that followed.

For one thing, teenagers’ gambling activities after school hours often take up much of the time they might otherwise have spent on school-related work, said Vitaro. Many gamblers are also known to skip classes.

In addition, when adolescents are in the gambling scene, they are often exposed to antisocial peer groups, which in turn might diminish school engagement and school performance, either directly or through an increase in behavioral and social problems.

“Our results also confirm the pervasive role of socio-familial risk, which has been related to both elevated levels of gambling involvement and low academic performance among adolescents in previous studies,” said Vitaro, who adds that personal factors such as impulsivity also play a role.

“From a clinical perspective, these findings suggest that children living in an unfavorable environment and manifesting high levels of impulsivity should be targeted for early prevention purposes,” said Vitaro. “Failing early prevention, reducing gambling involvement may also curb to some extent the decline in academic performance.”

Source: Springer

Your Preferred Workout Setting May Reveal Your Personality Type

Sun, 01/14/2018 - 8:00am

Do you prefer to work out at the gym or outdoors? The answer might reveal something about your personality.

A new study has found that your preferred exercise setting — whether at a gym or outdoors — is closely tied to specific personality traits. For example, extraverted types and those who rely on objective logic and structured regimens are more likely to prefer working out at the gym.

On the other hand, creative people — particularly those who enjoy working with new ideas — as well as individuals who focus more on feelings and values rather than logic may be much better suited to outdoor activities such as cycling and running.

The findings are being presented by John Hackston, chartered psychologist and head of Thought Leadership at OPP, at the British Psychological Society’s annual conference of the Division of Occupational Psychology in Stratford-upon-Avon.

“The most important piece of advice to come out of this research is that there is not one type of exercise that is suited to everyone,” said Hackston.

“There can be pressure to follow the crowd to the gym or sign up for the latest exercise fad, but it would be much more effective for people to match their personality type to an exercise plan that is more likely to stand the test of time.”

The study involved more than 800 people from a range of businesses across several countries. The researchers found that people with extraverted personality types were more likely to prefer exercising at the gym.

More creatively minded staff, particularly those who enjoy working with new ideas, were much better suited to outdoor activities such as cycling and running than to a structured gym regime.

In addition, those with a preference for objective logic were also more likely to stick with a regimented exercise plan than those who view feelings and values as being more important.

“We were keen to investigate how organisations could help their staff’s development through exercise, finding that matching an individual’s personality type to a particular type of exercise can increase both the effectiveness and the person’s enjoyment of it,” said Hackston.

“Organisations can help their staff to improve their fitness using this research, with increased fitness potentially leading to lower illness-related absences and increased employee satisfaction.”

Source: British Psychological Society (BPS)

Rain Can Aid Republicans on Election Day

Sun, 01/14/2018 - 7:15am

Bad weather affects U.S. voter turnout and election outcomes, with past research showing that the Republican Party has the advantage.

Now, a new study by researchers at Dartmouth College and the Australian National University finds that the Republican Party’s advantage when it rains may be due, in part, to voters changing their minds on not only whether to vote, but who to vote for.

The study’s findings revealed that at least one percent of U.S. voting-age adults who would have voted for a Democrat had the weather been good instead voted for a Republican on rainy election days.

The change in party preference may be attributed to a psychological mechanism: voters may become more averse to risk during poor weather conditions, according to the researchers.

Earlier studies identified a correlation between risk attitudes and political orientation, in which conservatives or Republicans tend to be more averse to risk than liberals or Democrats.

The new study was based on a statistical analysis of compositional electoral data: the vote share of the Democratic candidate, the vote share of the Republican candidate, and the abstention rate, which together sum to 100 percent, the researchers explained. When this compositional nature of election outcomes was taken into account, the research team discovered a more nuanced effect of rainfall: how voters’ preferences may change with bad weather.
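The paper’s actual model is not reproduced here, but a minimal sketch can show what treating election outcomes as compositional data means in practice. The figures and the additive log-ratio transform below are illustrative assumptions, not the study’s data or method:

```python
import numpy as np

# Hypothetical precinct-level outcomes: (Dem share, Rep share, abstention),
# in percent; each row sums to 100, the compositional constraint the study describes.
shares = np.array([
    [38.0, 35.0, 27.0],
    [33.0, 37.0, 30.0],
    [40.0, 32.0, 28.0],
    [36.0, 36.0, 28.0],
])
rainfall = np.array([0.0, 6.5, 1.2, 3.0])  # hypothetical Election Day rainfall (mm)

# Additive log-ratio (ALR) transform with abstention as the reference part;
# working in log-ratios removes the sum-to-100 constraint before regression.
alr = np.log(shares[:, :2] / shares[:, 2:3])

# Ordinary least squares of each log-ratio on rainfall (intercept + slope).
X = np.column_stack([np.ones_like(rainfall), rainfall])
coef, *_ = np.linalg.lstsq(X, alr, rcond=None)
print("Rainfall slopes (Dem vs. abstain, Rep vs. abstain):", coef[1])
```

In this toy setup, a negative Democratic slope alongside a flat or positive Republican slope would be the kind of pattern consistent with the preference shift the authors describe, rather than a pure turnout effect.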

The researchers point out that past studies of how rain affects people’s decisions to go to the polls or abstain from voting have focused on turnout, which on rainy days tends to be higher among Republicans than among Democrats. However, they argue that this only partially explains the alleged Republican advantage.

The study was published in American Politics Research.

Source: Dartmouth College

Contact with Nature Can Enhance Mental Health of City Dwellers

Sun, 01/14/2018 - 6:30am

A new U.K. study has found that people living in the city have higher levels of mental well-being when they are in contact with nature, including being outdoors, seeing trees and the sky, and hearing birdsong.

These beneficial effects of nature are particularly strong in people with greater levels of impulsivity, who are at higher risk of mental health issues.

For the study, researchers at King’s College London, landscape architects J & L Gibbons and the art foundation Nomad Projects developed a smartphone-based app called Urban Mind, which was designed to determine how exposure to natural features in cities can impact a person’s mental well-being.

The Urban Mind app monitored 108 participants who collectively completed 3,013 assessments over a one-week period. In each assessment, participants answered several questions about their current environment and momentary mental well-being. GPS-based geotagging was used to monitor their exact location throughout the one-week trial.
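The researchers have not published the app’s data model, but the design described above (repeated, geotagged questionnaires about surroundings and momentary well-being) can be sketched roughly as follows; all field names are illustrative assumptions, not Urban Mind’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MomentaryAssessment:
    """One geotagged check-in, loosely modeled on the design described above.

    Field names are illustrative assumptions, not Urban Mind's actual schema.
    """
    participant_id: str
    timestamp: datetime
    latitude: float       # from GPS-based geotagging
    longitude: float
    can_see_trees: bool   # momentary questions about the current environment
    can_see_sky: bool
    can_hear_birds: bool
    wellbeing_score: int  # momentary mental well-being rating, e.g. 1-5

# Example: one of the ~3,013 assessments collected over the one-week trial.
assessment = MomentaryAssessment(
    participant_id="p042",
    timestamp=datetime(2018, 1, 10, 17, 30),
    latitude=51.5115, longitude=-0.1160,
    can_see_trees=True, can_see_sky=True, can_hear_birds=False,
    wellbeing_score=4,
)
```

Pairing each well-being rating with a timestamp and location is what lets the researchers test both immediate and time-lagged associations with natural features.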

The findings show significant immediate and time-lagged associations between better mental well-being and several natural features: trees, the sky, and birdsong. These effects were still evident several hours after the exposure had taken place, suggesting long-lasting benefits.

The researchers wanted to know whether the beneficial effects of nature might vary from one person to another, depending on their risk of developing poor mental health. To assess this, each participant was rated on “trait impulsivity,” a psychological measure of one’s tendency to behave with little forethought or consideration of the consequences. Trait impulsivity can be a predictor of higher risk of developing addictive disorders, attention-deficit hyperactivity disorder, antisocial personality disorder and bipolar disorder.

This revealed that the positive impact of nature on mental well-being was greater in people with higher levels of trait impulsivity and a higher risk of developing mental health issues.

“These findings suggest that short-term exposure to nature has a measurable beneficial impact on mental well-being,” said Dr. Andrea Mechelli from the department of psychosis studies at the Institute of Psychiatry, Psychology & Neuroscience at King’s College London.

“The interaction of this effect with trait impulsivity is intriguing, as it suggests that nature could be especially beneficial to those individuals who are at risk of poor mental health. From a clinical perspective, we hope this line of research will lead to the development of low-cost scalable interventions aimed at promoting mental health in urban populations.”

Source: King’s College London

Writing A To-Do List May Help You Fall Asleep

Sat, 01/13/2018 - 9:30am

Writing a to-do list at bedtime may help you fall asleep, according to a new study.

“We live in a 24/7 culture in which our to-do lists seem to be constantly growing and causing us to worry about unfinished tasks at bedtime,” said lead author Michael K. Scullin, Ph.D., director of Baylor University’s Sleep Neuroscience and Cognition Laboratory and assistant professor of psychology and neuroscience.

“Most people just cycle through their to-do lists in their heads, so we wanted to explore whether the act of writing them down could counteract nighttime difficulties with falling asleep.”

Some 40 percent of American adults report difficulty falling asleep at least a few times each month, according to the National Sleep Foundation.

For the study, researchers recruited 57 university students, then compared sleep patterns of students who took five minutes to write down upcoming duties versus students who chronicled completed activities.

“There are two schools of thought about this,” Scullin said. “One is that writing about the future would lead to increased worry about unfinished tasks and delay sleep, while journaling about completed activities should not trigger worry. The alternative hypothesis is that writing a to-do list will ‘offload’ those thoughts and reduce worry.”

While anecdotal evidence exists that writing a bedtime list can help you fall asleep, the Baylor study used overnight polysomnography, the “gold standard” of sleep measurement, Scullin said. With that method, researchers monitor electrical brain activity using electrodes.

Participants stayed in the lab on a weeknight to avoid weekend effects on bedtime, and because on a weekday night they probably had unfinished tasks to do the next day, Scullin said.

They were randomly assigned to one of two groups and given five-minute writing assignments before retiring. One group was asked to write down everything they needed to remember to do the next day or over the next few days; the other was asked to write about tasks completed during the previous few days.

Students were instructed they could go to bed at 10:30 p.m.

“We had them in a controlled environment,” Scullin said. “We absolutely restricted any technology, homework, etc. It was simply lights out after they got into bed.”

Scullin noted that while the sample size was appropriate for an experimental, laboratory-based polysomnography study, a larger future study would be of value.

“Measures of personality, anxiety, and depression might moderate the effects of writing on falling asleep, and that could be explored in an investigation with a larger sample,” he said. “We recruited healthy young adults, and so we don’t know whether our findings would generalize to patients with insomnia, though some writing activities have previously been suggested to benefit such patients.”

The study was published in the American Psychological Association’s Journal of Experimental Psychology.

Source: Baylor University