Banning cellphones in schools reaps the same benefits as extending the school year by five days, according to a study co-authored by an economist at The University of Texas at Austin.
“New technologies are typically thought of as improving productivity; however, this is not always the case,” said Richard Murphy, an assistant professor of economics. “When technology is multipurpose, such as cellphones, it can be both distracting and disruptive.”
Murphy and Louis-Philippe Beland, an assistant professor of economics at Louisiana State University, measured the impact of mobile phones on student performance by surveying 91 schools in four English cities (Birmingham, London, Leicester and Manchester) before and after strict cellphone policies were implemented.
By comparing student exam records and mobile phone policies from 2001 to 2013, researchers found a significant improvement in student achievement at schools that banned cellphones, with student test scores rising by 6.41 percent of a standard deviation. This made students 2 percentage points more likely to pass the required exams at the end of high school, researchers explained.
“We found the impact of banning phones for these students equivalent to an additional hour a week in school, or to increasing the school year by five days,” Murphy said.
Low-achieving students benefited most from the ban, with test scores increasing by 14.23 percent of a standard deviation, double the gain seen among average students, making them 4 percentage points more likely to pass the exams.
Likewise, the ban greatly benefited students with special education needs and those eligible for free school meals, improving their exam scores by 10 and 12 percent of a standard deviation, respectively.
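Effect sizes reported as a fraction of a standard deviation can be translated into raw exam points once a score distribution is assumed. The sketch below does this for the gains reported above, using a hypothetical exam with mean 60 and standard deviation 15 (the study's actual score scale is not given here):

```python
# Illustration only (not the study's code): converting "percent of a
# standard deviation" effect sizes into raw exam points, assuming a
# hypothetical exam with mean 60 and standard deviation 15.
mean, sd = 60.0, 15.0  # hypothetical exam distribution

effects = {  # effect sizes reported in the study, in standard-deviation units
    "all students": 0.0641,
    "low achievers": 0.1423,
    "special education needs": 0.10,
    "free school meals": 0.12,
}

for group, effect in effects.items():
    points = effect * sd  # gain in raw exam points under the assumed SD
    print(f"{group}: +{effect:.4f} SD = +{points:.2f} points on a mean of {mean:.0f}")
```

Under these assumed parameters, the overall effect corresponds to just under one exam point, and the low-achiever effect to a little over two.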
However, researchers found that strict cellphone policies had little effect on both high-achieving students and 14-year-olds, suggesting that high achievers are less distracted by mobile phones and younger teens own and use phones less often.
“This means allowing phones into schools would be the most damaging to low-achieving and low-income students, exacerbating any existing learning inequalities,” Murphy said. “Whilst we cannot test the reason why directly, it is indicative that these students are distracted by the presence of phones, and high-ability students are able to concentrate.”
Though phone ownership among English teens is high — 90.3 percent owned a mobile phone by 2012 — the results are likely to be relevant to U.S. schools, where 73 percent of teenagers own a mobile phone, Murphy said.
“Banning cell phones in schools would be a low-cost way for schools to reduce educational inequality,” Murphy said. “However, these findings do not discount the possibility that mobile phones could be a useful learning tool if their use is properly structured. Regardless, these results show that the presence of cellphones in schools cannot be ignored.”
Source: UNIVERSITY OF TEXAS AT AUSTIN
A new University of Washington study finds that cell phone use at playgrounds is a significant source of parental guilt, as well as a powerful distraction when children try to get caregivers’ attention or ask them to watch a monkey bar trick for the hundredth time.
The largest group of parents, nannies and adult babysitters — 44 percent — felt they ought to restrict cell phone use while watching children at playgrounds but felt guilty for failing to live up to those ideals, researchers found. They also observed that caregivers absorbed in their phones were much less attentive to children’s requests than when they were chatting with friends or caring for other children.
The most common mobile phone uses on playgrounds were texting with friends and family, taking pictures and emailing. Only 28 percent of caregivers reported using their phones to do work, and there was no appreciable difference in mobile phone use between male and female caregivers.
The study, presented last month at the Association for Computing Machinery’s CHI 2015 conference in Seoul, Korea, documented more than 40 hours of interactions at playgrounds in north Seattle and collected data from 466 adult caregivers. While other studies have anonymously observed parental cell phone use at playgrounds and fast food restaurants, this is the first to also interview parents, nannies and others about their phone use while caring for children in public places.
“Concerns on this topic are very prevalent, and a lot of people report feeling guilty about their own behaviors,” said lead author Alexis Hiniker, a doctoral student in the UW’s human centered design and engineering department. “But there’s also a group who resents the idea that they should have to put their phones away when their child is safe and happily engaged in something else. There were strong opinions and very divergent opinions, for sure.”
The researchers found that boredom often trumped guilt or fear of being judged and was the single biggest driver prompting people to dig cell phones out of their pockets or purses.
The study also found that adults commonly overestimated how responsive they were to children’s requests while using their phones. Many parents recognized that being absorbed in their phone dilutes their attention, but many also believed that a child’s request for a push on the swing or for help settling a dispute readily drew them back into the present moment.
Yet in 32 instances when researchers observed a child trying to interrupt an adult using a cell phone, the caregiver completely failed to respond, neither speaking nor looking up from the phone, 56 percent of the time.
That level of absorption was unusual compared to other activities. In 70 instances when a child tried to get the attention of a caregiver who was chatting with a friend, helping a sibling or simply staring into space, only 11 percent of those adults failed to respond to the child’s request.
On the other hand, the total amount of observed cell phone use in Seattle playgrounds was relatively low. Nearly two-thirds of caregivers spent less than 5 percent of their time at the park using a phone, and many phone interactions lasted less than 10 seconds.
“Phones do distract us and that’s something to be aware of, but I think it’s not nearly as bad as some people have made things out to be,” said co-author Julie Kientz, associate professor of human centered design and engineering and director of the UW Computing for Healthy Living and Learning Lab. “Plenty of people are being really attentive parents and thinking deeply about these issues.”
Indeed, caregivers reported using their phones twice as often for childcare-related activities, such as taking pictures to share with spouses or grandparents (88 percent), arranging to meet people later in the day (79 percent) or checking the time (75 percent). Adults generally felt less guilty about these activities than about working (28 percent) or playing games (5 percent), even though the child-related phone use may be just as distracting.
Caregivers generally fell into three camps: Those who felt it was completely appropriate to engage in adult-focused activities like checking email or reading on their phones while children are engaged in playground activities (28 percent), those who felt it was important to eliminate or minimize their own phone use while watching children and lived up to those ideals (24 percent) and those who felt they should restrict phone use but weren’t able to do so (44 percent).
The fact that the largest group of caregivers had misgivings about smartphone absorption while parenting suggests that phone and app designers might consider making it easier for some users to disengage, researchers said. Tools like a “parenting” mode with limited functionality, scrolling options that end after five or 10 items or even a simple password screen can create natural breaks that prompt users to check in with the outside world.
“The more we bring every type of computing experience with us everywhere, the more sophisticated designers are going to have to be,” Hiniker said. “If users experience technology as so attention-grabby that they can’t maintain the balanced life experience that they want, then they’re going to turn away.”
Source: UNIVERSITY OF WASHINGTON
Fans of homebrewed beer and backyard distilleries already know how to employ yeast to convert sugar into alcohol. But a research team led by bioengineers at the University of California, Berkeley, has gone much further by completing key steps needed to turn sugar-fed yeast into a microbial factory for producing morphine and potentially other drugs, including antibiotics and anti-cancer therapeutics.
Over the past decade, a handful of synthetic-biology labs have been working on replicating in microbes a complex, 15-step chemical pathway in the poppy plant to enable production of therapeutic drugs. Research teams have independently recreated different sections of the poppy’s drug pathway using E. coli or yeast, but what had been missing until now were the final steps that would allow a single organism to perform the task from start to finish.
In a new study to appear in the Monday, May 18, advance online publication of the journal Nature Chemical Biology, UC Berkeley bioengineer John Dueber teamed up with microbiologist Vincent Martin at Concordia University in Québec to overcome that hurdle by replicating the early steps in the pathway in an engineered strain of yeast. They were able to synthesize reticuline, a compound in poppy, from tyrosine, a derivative of glucose.
“What you really want to do from a fermentation perspective is to be able to feed the yeast glucose, which is a cheap sugar source, and have the yeast do all the chemical steps required downstream to make your target therapeutic drug,” said Dueber, the study’s principal investigator and an assistant professor of bioengineering. “With our study, all the steps have been described, and it’s now a matter of linking them together and scaling up the process. It’s not a trivial challenge, but it’s doable.”
Paving the path from plants to microbes
The qualities that make the poppy plant pathway so challenging are the same ones that make it such an attractive target for research. It is complex, but it is the foundation upon which researchers can build new therapeutics. Benzylisoquinoline alkaloids, or BIAs, are the class of highly bioactive compounds found in the poppy, and that family includes some 2,500 molecules isolated from plants.
Perhaps the best-known trail in the BIA pathway is the one that leads to the opiates, such as codeine, morphine and thebaine, a precursor to oxycodone and hydrocodone. All are controlled substances. But different trails will lead to the antispasmodic papaverine or to the antibiotic precursor dihydrosanguinarine.
“Plants have slow growth cycles, so it’s hard to fully explore all the possible chemicals that can be made from the BIA pathway by genetically engineering the poppy,” said study lead author William DeLoache, a UC Berkeley Ph.D. student in bioengineering. “Moving the BIA pathway to microbes dramatically reduces the cost of drug discovery. We can easily manipulate and tune the DNA of the yeast and quickly test the results.”
The researchers found that by repurposing an enzyme from beets that is naturally used in the production of their vibrant pigments, they could coax yeast to convert tyrosine, an amino acid readily derived from glucose, into dopamine.
With help from the lab of Concordia University’s Vincent Martin, the researchers were able to reconstitute the full seven-enzyme pathway from tyrosine to reticuline in yeast.
“Getting to reticuline is critical because from there, the molecular steps that produce codeine and morphine from reticuline have already been described in yeast,” said Martin, a professor of microbial genomics and engineering. “Also, reticuline is a molecular hub in the BIA pathway. From there, we can explore many different paths to other potential drugs, not just opiates.”
Red flag for regulators
The study authors noted that the discovery dramatically speeds up the clock for when homebrewing drugs could become a reality, and they are calling for regulators and law enforcement officials to pay attention.
“We’re likely looking at a timeline of a couple of years, not a decade or more, when sugar-fed yeast could reliably produce a controlled substance,” said Dueber. “The time is now to think about policies to address this area of research. The field is moving surprisingly fast, and we need to be out in front so that we can mitigate the potential for abuse.”
In a commentary to be published in Nature and timed with the publication of this study, policy analysts call for urgent regulation of this new technology. They highlight the many benefits of this work, but they also point out that “individuals with access to the yeast strain and basic skills in fermentation would be able to grow the yeast using the equivalent of a homebrew kit.”
They recommend restricting engineered yeast strains to licensed facilities and to authorized researchers, noting that it would be difficult to detect and control the illicit transport of engineered yeast strains.
While such controls may help, Dueber said, “An additional concern is that once the knowledge of how to create an opiate-producing strain is out there, anyone trained in basic molecular biology could theoretically build it.”
Another target for regulation would be the companies that synthesize and sell DNA sequences. “Restrictions are already in place for sequences tied to pathogenic organisms, like smallpox,” said DeLoache. “But maybe it’s time we also look at sequences for producing controlled substances.”
Source: UNIVERSITY OF CALIFORNIA – BERKELEY
– Researchers determined DNA sequences from the Y chromosomes of 334 men belonging to 17 populations from Europe and the Middle East
– Study shows that almost two out of three (64%) modern European men belong to just three young paternal lineages
– Male-specific population expansion was widespread, and surprisingly recent, focusing interest on the Bronze Age
Geneticists from the University of Leicester have discovered that most European men descend from just a handful of Bronze Age forefathers, due to a ‘population explosion’ several thousand years ago.
The project, which was funded by the Wellcome Trust, was led by Professor Mark Jobling from the University of Leicester’s Department of Genetics and the study is published in the prestigious journal Nature Communications.
The research team determined the DNA sequences of a large part of the Y chromosome, passed exclusively from fathers to sons, in 334 men from 17 European and Middle Eastern populations.
This research used new methods for analysing DNA variation that provide a less biased picture of diversity and a better estimate of the timing of population events.
This allowed the construction of a genealogical tree of European Y chromosomes that could be used to calculate the ages of branches. Three very young branches, whose shapes indicate recent expansions, account for the Y chromosomes of 64% of the men studied.
Professor Jobling said: “The population expansion falls within the Bronze Age, which involved changes in burial practices, the spread of horse-riding and developments in weaponry. Dominant males linked with these cultures could be responsible for the Y chromosome patterns we see today.”
In addition, past population sizes were estimated, and showed that a continuous swathe of populations from the Balkans to the British Isles underwent an explosion in male population size between 2000 and 4000 years ago.
This contrasts with previous results for the Y chromosome, and also with the picture presented by maternally-inherited mitochondrial DNA, which suggests much more ancient population growth.
Previous research has focused on the proportion of modern Europeans descending from Paleolithic — Old Stone Age — hunter-gatherer populations or more recent Neolithic farmers, reflecting a transition that began about 10,000 years ago.
Chiara Batini from the University of Leicester’s Department of Genetics, lead author of the study, added: “Given the cultural complexity of the Bronze Age, it’s difficult to link a particular event to the population growth that we infer. But Y-chromosome DNA sequences from skeletal remains are becoming available, and this will help us to understand what happened, and when.”
Source: UNIVERSITY OF LEICESTER
Bad news for relentless power-seekers the likes of Frank Underwood on House of Cards: Climbing the ladder of social status through aggressive, competitive striving might shorten your life as a result of increased vulnerability to cardiovascular disease. That’s according to new research by psychologist Timothy W. Smith and colleagues at the University of Utah. And good news for successful types who are friendlier: Attaining higher social status as the result of prestige and freely given respect may have protective effects, the researchers found.
Smith presented the findings at the annual meeting of the American Psychosomatic Society this week in Savannah, GA. The Utah researchers conducted four studies to gauge the health effects of the hostile-dominant personality style compared with the warm-dominant style.
In surveys with 500 undergraduate volunteers, hostile-dominant types reported greater hostility and interpersonal stress. Warm-dominant types tended to rank themselves as higher in social status. Both styles were associated with a higher personal sense of power.
The psychologists also monitored the blood pressure of 180 undergraduates as they reacted to stressful conversations with others who were scripted to act deferentially or dominantly. Hostile-dominant types experienced significant increases in blood pressure when interacting with a dominant partner, but not with a deferential one. Previous studies have found that increased blood pressure reactivity to stress puts people at risk for cardiovascular disease.
In a third study with 94 young, married couples, Smith and colleagues found that hostile-dominance in men was linked with higher blood pressure recorded throughout the day with a wearable monitor, but not among women. Warm-dominance in women predicted lower blood pressure, but not in men.
Among 154 older, married couples (average age of 63), a warm-dominant style was associated with less conflict and more support. A hostile-dominant style was associated with more severe atherosclerosis in men and women, as measured by coronary artery calcification. Hostile-dominance was also linked with greater marital conflict and lower marital support.
“It’s not a style that wears well with other people,” Smith says. The good news is that people can take steps to change a hostile personality style. “Something usually has to fall apart first before they are willing to entertain that option,” Smith says. “But there is some evidence that it is possible to teach old dogs new tricks, and if you do, it can reduce coronary risk.”
Source: UNIVERSITY OF UTAH
In the study, published online in the journal Evolution and Human Behavior, 92 women studying in the UK were presented with hypothetical profiles of the opposite sex, representing varying levels of heroism in different contexts such as warfare, sport and business. They were then asked a series of questions designed to determine how attracted they were to the different profiles.
Women were more likely to find a soldier attractive, and were more inclined to date him, if he had been awarded a medal for bravery in combat. Interestingly, whether or not a non-decorated soldier had seen combat in a warzone or remained in the UK did not have a statistically significant effect on his attractiveness.
Displays of heroism in other fields, such as in sports or in business, also had no effect on how likely women were to find them attractive.
In a subsequent experiment by the researchers, 159 women and 181 men studying in Holland took part in a similar exercise to determine their level of sexual attraction to the opposite sex. This time, the soldier profiles displayed various levels of bravery, either in combat or by helping in a natural disaster zone.
Again, heroism in combat increased women’s levels of sexual attraction towards male soldiers, but heroism in a disaster zone had no impact. Female heroes, both in combat and in disaster zones, were deemed less attractive by men than their non-hero counterparts.
“This provides evidence for the hypothesis that gender differences in intergroup conflict can have an evolutionary origin, as only males seem to benefit from displaying heroism,” says Joost Leunissen, a psychologist at the University of Southampton and co-author of the study. “In light of the physical dangers and reproductive risks involved, participating in intergroup aggression might not generally be a viable reproductive strategy for women.
“Heroism also seems to be a context-specific signal, as it only had an effect on attractiveness in a setting of intergroup conflict. Indeed, soldiers who displayed heroism were only considered to be more attractive when this was displayed in a warfare context and not in another situation which is frequently associated with the army – helping during and after natural disasters.”
The experiments supplement a historical analysis undertaken by the research team, which looked at numbers of children fathered by US Medal of Honor recipients in World War II compared to the numbers of children fathered by regular veterans. The analysis shows Medal of Honor recipients had an average of 3.18 children, while regular veterans averaged 2.72 children, suggesting decorated war heroes sired more offspring than other veterans.
Joost comments: “Raids, battles, and ambushes in ancestral environments, and wars in modern environments, may provide an arena for men to signal their physical and psychological strengths. Of course, women may not always witness these heroic acts in person, but such information is likely to be widely communicated within a tribal community, particularly when the actions of male warriors are outstandingly brave.”
Source: UNIVERSITY OF SOUTHAMPTON
A researcher at the Georgia Institute of Technology has used Twitter as a lens to look into the lives of nearly 1,000 people who used the site to announce their wedding engagement. By comparing tweets before and after, the study was able to determine how people changed their online personas following the proposal. Some differences were split along gender lines. Others identified how people alter the words they use on Twitter after they are engaged.
The study followed 923 people who used “#engaged” to announce in 2011. The research team then looked at each person’s tweets in the nine-month period before the engagement and 12 months afterward (2 million total tweets). They were also compared to a random sampling of tweeters during the same time frame (12 million tweets).
After people got engaged, tweets with the word “I” or “me” dropped by 69 percent. They were replaced with “we” and “us.” There was barely any change within the control group.
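The pronoun shift can be illustrated with a minimal counting sketch. The study's actual text-processing pipeline is not described here; the tweets, tokenization, and pronoun lists below are hypothetical:

```python
# Minimal sketch (not the study's pipeline): measuring the shift from
# singular to plural first-person pronouns across two sets of tweets.
SINGULAR = {"i", "me", "my"}
PLURAL = {"we", "us", "our"}

def pronoun_counts(tweets):
    """Count singular vs. plural first-person pronouns, naively
    tokenizing on whitespace and stripping trailing punctuation."""
    singular = plural = 0
    for tweet in tweets:
        for word in tweet.lower().split():
            token = word.strip(".,!?")
            if token in SINGULAR:
                singular += 1
            elif token in PLURAL:
                plural += 1
    return singular, plural

before = ["I am so excited!", "Treating me to coffee today."]
after = ["We picked a venue!", "So happy for us."]

s_before, _ = pronoun_counts(before)
s_after, _ = pronoun_counts(after)
drop = (s_before - s_after) / s_before * 100  # percent drop in singular use
print(f"singular pronoun use dropped {drop:.0f}%")
```

A production analysis would use a proper tweet tokenizer and compare against a control sample, as the researchers did, rather than raw string splitting.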
“People began to paint themselves as a couple, rather than as individuals,” said Munmun de Choudhury, a Georgia Tech associate professor in the School of Interactive Computing who led the study. “They’re going through a major change in life, and it shows on social media as they adapt to society’s expectations of their marital identity.”
Similarly, tweets using familial words such as “future-in-laws” and “children” jumped by 219 percent after the proposal (although men tended to wait until after marriage to tweet family-based words).
The study also noticed that men and women gush about each other differently.
The most frequent terms used by females when tweeting about their significant other were tied to emotion (for example, they “love” their “wonderful” fiancé). Men are more likely to use physical descriptors such as sexy, beautiful or gorgeous when talking about their fiancée.
De Choudhury and co-author Michael Massimi also noticed that engaged people are much more likely to think and tweet about the future: use of future-tense verbs surged by 62 percent after engagement, displacing past-tense verbs.
“People are more likely to post that they ‘are going on a date night tonight’ rather than tweeting that they already did so,” said Massimi, a former postdoctoral fellow at Microsoft Research Cambridge. “They’re looking forward to the future in their real lives and boasting about it on social media too.”
This is the first empirical study of engagement in social media. It centered on the anthropological concept of liminality – a phase people undergo when they transition from one role in society to another.
“Twitter can be a powerful tool that can mirror our thoughts and how we’re actually feeling,” said de Choudhury, who has done similar social media studies on mothers and postpartum depression. “This isn’t based on what they told us they did. It’s a reliable record: it’s what they actually did.”
Source: GEORGIA INSTITUTE OF TECHNOLOGY
An estimated 9 percent of adults in the U.S. have a history of impulsive, angry behavior and have access to guns, according to a study published this month in Behavioral Sciences & the Law. The study also found that an estimated 1.5 percent of adults report impulsive anger and carry firearms outside their homes.
Angry people with ready access to guns are typically young or middle-aged men, who at times lose their temper, smash and break things, or get into physical fights, according to the study co-authored by scientists at Duke, Harvard, and Columbia universities.
Study participants who owned six or more firearms were also far more likely than people with only one or two firearms to carry guns outside the home and to have a history of impulsive, angry behavior.
“As we try to balance constitutional rights and public safety regarding people with mental illness, the traditional legal approach has been to prohibit firearms from involuntarily-committed psychiatric patients,” said Jeffrey Swanson, Ph.D., professor in psychiatry and behavioral sciences at Duke Medicine and lead author of the study.
“But now we have more evidence that current laws don’t necessarily keep firearms out of the hands of a lot of potentially dangerous individuals.”
The researchers analyzed data from 5,563 face-to-face interviews conducted in the National Comorbidity Study Replication (NCS-R), a nationally representative survey of mental disorders in the U.S. led by Harvard in the early 2000s.
The study found little overlap between participants with serious mental illnesses and those with a history of impulsive, angry behavior and access to guns.
“Gun violence and serious mental illness are two very important but distinct public health issues that intersect only at their edges,” Swanson said.
Researchers found that anger-prone people with guns were at elevated risk for a range of fairly common psychiatric conditions such as personality disorders, alcohol abuse, anxiety, and post-traumatic stress, while only a tiny fraction suffered from acute symptoms of major disorders such as schizophrenia and bipolar disorder.
Fewer than one in 10 angry people with access to guns had ever been admitted to a hospital for a psychiatric or substance abuse problem, the study found. As a result, most of these individuals’ medical histories wouldn’t stop them from being able to legally purchase guns under existing mental-health-related restrictions.
“Very few people in this concerning group suffer from the kinds of disorders that often lead to involuntary commitment and which would legally prohibit them from buying a gun,” said Ronald Kessler, Ph.D., professor of health care policy at Harvard and principal investigator of the NCS-R survey.
Kessler, Swanson and co-authors reason that looking at a prospective gun buyer’s history of misdemeanor convictions, including violent offenses and multiple convictions for impaired driving, could be more effective at preventing gun violence in the U.S. than screening based on mental health treatment history.
As for those who already own or have access to firearms, the researchers suggest the data could support “dangerous persons” gun removal laws, like those in Connecticut and Indiana, or a “gun violence restraining order” law like the one California recently enacted. Such laws give family members and law enforcement a legal tool to immediately seize guns and prevent gun or ammunition purchases by people who show warning signs of impending violence.
In 2012, more than 59,000 people were injured by the intentional use of firearms, and another 11,622 were killed in violent gun incidents, according to the Centers for Disease Control and Prevention.
Source: DUKE UNIVERSITY MEDICAL CENTER
A Saint Louis University research review article suggests people are hardwired to fall out of love and move on to new romantic relationships.
“Our review of the literature suggests we have a mechanism in our brains designed by natural selection to pull us through a very tumultuous time in our lives,” said Brian Boutwell, Ph.D., associate professor of criminology and criminal justice and associate professor of epidemiology at Saint Louis University. “It suggests people will recover; the pain will go away with time. There will be a light at the end of the tunnel.”
Boutwell and his colleagues examined the process of falling out of love and breaking up, which they call primary mate ejection, and moving on to develop a new romantic relationship, which they call secondary mate ejection.
Drawing largely upon the field of evolutionary psychology, they say men and women might break up for different reasons. For instance, a man is more likely to end a relationship because a woman has had a sexual relationship with another man. For evolutionary reasons, men should be wired to try and avoid raising children that aren’t genetically their own, the authors say.
“Men are particularly sensitive to sexual infidelity between their partner and someone else,” Boutwell said. “That’s not to say women don’t get jealous, they certainly do, but it’s especially acute for men regarding sexual infidelity.”
On the other hand, a woman may be more likely to break up if her partner has been emotionally unfaithful partly because of evolutionary reasons. Over the deep time of evolution, natural selection has designed mate ejection in females to avoid the loss of resources, such as help in raising a child and physical protection, that their mates provide.
Sometimes both men and women end a relationship for the same reason. “For instance, neither gender tends to tolerate or value cruelty on the part of their partner,” Boutwell said.
In addition, some people might be more likely than others to fall out of love or have problems moving on. The ability to break up and find someone new to love lies along a continuum, influenced by environmental and genetic factors.
Brain imaging studies of men and women who claimed to be deeply in love also provided important clues about dealing with breakups. Functional MRIs showed an increase in neuronal activity in the parts of the brain – the pleasure areas – that also become active with cocaine use.
“Helen Fisher’s work has revealed that this circuitry in the brain, which is deeply associated with addictive behaviors, also is implicated in the feelings associated with romantic attraction and may help explain the attachment that often follows the initial feelings of physical infatuation with a potential mate. Think of it as that initial feeling of falling in love, when you want to constantly be around the other person, almost in an addictive way,” Boutwell said.
Falling out of love, Boutwell contends, might be compared to asking a cocaine addict to break his or her habit.
“To sever that bond and move on is a huge ask of a person,” he said. “Ultimately, trying to move on from a former mate may be similar in some ways to an attempt at breaking a drug habit.”
Building off the drug addiction analogy, Boutwell examined studies about the brains of former cocaine addicts to try to predict how the brains of those who are breaking a relationship habit might look. Images of the brains of those no longer using cocaine showed a larger volume of gray matter in various brain regions, which were markedly different from images of brains of active cocaine users.
“We might argue that different regions of the brain act in a way that once that addiction has been severed, then help to facilitate a person moving on and finding a new partner,” Boutwell extrapolated. “A person might initially pursue their old mate in an attempt to win back their affection. However, if pursuit is indeed fruitless, then the brains of individuals may act to correct certain emotions and behaviors, paving the way for people to become attracted to new mates and form new relationships.”
Conducting functional MRI studies that examine the brains of men and women who have rebounded from a relationship and fallen in love again would provide additional evidence to lend credibility to or dismiss the addiction hypothesis, he added.
In an additional attempt to understand what is going on inside the brain when a relationship ends, Boutwell examined research regarding the impact of a group of antidepressant medications called selective serotonin reuptake inhibitors (SSRIs) on romantic love. The use of SSRIs can potentially lower levels of dopamine, norepinephrine and testosterone, which might stifle romantic feelings and sexual interest.
“This is not to say that people should cease using their anti-depressants without consulting their doctors. That could be potentially tragic and a very bad decision,” Boutwell said. “Rather, like any medication, it is important to fully understand the side effects. In this case, those side effects might impinge on the intimate feelings of one partner towards another.”
Boutwell urged more research into lost love to better understand the difficulties that can creep into a romantic relationship.
“If we better understand mate ejection, it may offer direct and actionable insight into ways in which couples can save a relationship that might otherwise come to a stultifying and abrupt halt,” he said.
Source: SAINT LOUIS UNIVERSITY