Nov 21, 2014
 

A study just released by Columbia University’s Mailman School of Public Health found that children who were exposed to antibiotics in the second or third trimester of pregnancy had a higher risk of childhood obesity at age 7. The research also showed that for mothers who delivered their babies by a Caesarean section, whether elective or non-elective, there was a higher risk for obesity in their offspring.

Study findings are published online in the International Journal of Obesity.

Although previous studies have shown that antibiotics administered early in life may be associated with increased risk of obesity in childhood, this is the first study reporting that maternal antibiotic use in the second or third trimester of pregnancy increases the risk of offspring obesity. Antibiotics affect microbes in the mother and may enter fetal circulation via the placenta. Researchers are beginning to understand that the bacteria that normally inhabit our colon have important roles in maintaining our health, and that imbalances in these bacterial populations can cause a variety of illnesses. Disturbances in the normal transmission of bacteria from the mother to the child are thought to place the child at risk for several health conditions, including obesity.

The study is based on data from healthy, non-smoking, pregnant women who were recruited for the Northern Manhattan Mothers and Children Study from prenatal clinics at New York-Presbyterian Hospital and Harlem Hospital Center between 1998 and 2006. Of 727 mothers enrolled in the study, 436 mothers and their children were followed until 7 years of age. Of these 436 children, 16 percent had mothers who used antibiotics in the second or third trimester. This work is part of the Columbia Center for Children’s Environmental Health’s efforts to understand how to promote healthy growth and development throughout childhood and adolescence.

The children exposed to antibiotics in this timeframe had an 84-percent higher risk of obesity, compared with children who were not exposed.

“Our findings on prenatal antibiotics and risk for offspring obesity are novel, and thus warrant replication in other prospective cohort studies,” said Noel Mueller, PhD, postdoctoral research fellow at Columbia University’s Mailman School of Public Health and Institute of Human Nutrition. “If these findings hold up, they suggest new mechanisms through which childhood growth trajectories are influenced at the earliest stages of development. Our findings should not discourage antibiotic use when they are medically needed, but it is important to recognize that antibiotics are currently overprescribed.”

Independent of prenatal antibiotic usage, delivery by Caesarean section was also associated with a 46-percent higher risk of childhood obesity. The researchers controlled for maternal age, ethnicity, birth weight, sex, breastfeeding in the first year, and, for each exposure, the other factor (gestational antibiotic use or delivery mode).

“Our findings are consistent with a series of papers that looked at data on Caesarean section. While earlier studies suggested that childhood outcomes differ by whether the Caesarean section was elective or non-elective, we did not observe such evidence,” said Andrew Rundle, DrPH, associate professor of Epidemiology at the Mailman School of Public Health. “Thus, our findings provide new evidence in support of the hypothesis that Caesarean section independently contributes to the risk of childhood obesity.”

Similar to antibiotic use during pregnancy, Caesarean section birth is thought to reduce the normal transmission of bacteria from the mother to the child and to disturb the balance of bacteria in the child. “Strategies to reduce medically unnecessary C-sections and to provide the infant with health promoting bacteria after C-section need to be researched,” noted Dr. Mueller.

“Further research is needed on how mode of delivery, antibiotic use during pregnancy and other factors influence the establishment of the ecosystem of bacteria that inhabit each of us,” said Dr. Rundle. “This research will help us understand how to create an early platform to support the healthy growth and development of children.”

Source: Columbia University’s Mailman School of Public Health

Nov 21, 2014
 

Modern hand dryers are much worse than paper towels when it comes to spreading germs, according to new University of Leeds research.

Scientists from the University of Leeds have found that high-powered ‘jet-air’ and warm air hand dryers can spread bacteria in public toilets. Airborne germ counts were 27 times higher around jet air dryers in comparison with the air around paper towel dispensers.

The study shows that both jet and warm air hand dryers spread bacteria into the air and onto users and those nearby.

The research team, led by Professor Mark Wilcox of the School of Medicine, contaminated hands with a harmless type of bacteria called Lactobacillus, which is not normally found in public bathrooms. This was done to mimic hands that have been poorly washed.

Subsequent detection of the Lactobacillus in the air proved that it must have come from the hands during drying. The experts collected air samples around the hand dryers and also at distances of one and two metres away.

Air bacterial counts close to jet air dryers were found to be 4.5 times higher than around warm air dryers and 27 times higher compared with the air when using paper towels. Next to the dryers, bacteria persisted in the air well beyond the 15 second hand-drying time, with approximately half (48%) of the Lactobacilli collected more than five minutes after drying ended. Lactobacilli were still detected in the air 15 minutes after hand drying.

Professor Wilcox said: “Next time you dry your hands in a public toilet using an electric hand dryer, you may be spreading bacteria without knowing it. You may also be splattered with bugs from other people’s hands.

“These findings are important for understanding the ways in which bacteria spread, with the potential to transmit illness and disease.”

Source: University of Leeds

Nov 20, 2014
 

This image was created by independently manipulating high and low signals of health and intelligence.
Credit: Spisak, B. et al., Front. Aging Neuro (2014)

People look for candidates with a healthy complexion when choosing a leader, but don’t favor the most intelligent-looking candidates except for positions that require negotiation between groups or exploration of new markets. These results are published in the open-access journal Frontiers in Human Neuroscience.

Brian Spisak from the VU University Amsterdam and colleagues studied people’s implicit preferences for traits of leaders, such as health, intelligence, and attractiveness, and how they look for information about these qualities in the physical appearance of others.

The researchers focused on facial traits because these provide a wealth of information about individuals. For example, in women as well as men, people with caring and cooperative personalities are statistically more likely to have a more “feminine” face, due to higher estrogen levels, while aggressive risk-takers tend to have higher testosterone levels and a more “masculine” face.

They asked 148 women and men to imagine that they were selecting a new CEO for a company and to repeatedly pick between two photos of male faces. For each choice, the participants were given a job description that specified the CEO’s main challenge. This was either to drive aggressive competition, renegotiate a key partnership with another company, lead the company’s shift into a new market, or oversee the stable, sustained exploitation of non-renewable energy.

In each choice, both photos were of the same man, whose face had been digitally transformed. His face had been made to look more or less intelligent while his complexion was changed to look more or less healthy.

A stronger general preference for health than intelligence was found. The participants chose more healthy-looking faces over less healthy-looking faces in 69% of trials, and this preference was equally strong irrespective of the future CEO’s main challenge. More intelligent-looking faces were only preferred over less intelligent-looking faces for the two challenges that would require the most diplomacy and inventiveness: renegotiating the partnership and exploring the new market.

“Here we show that it always pays for aspiring leaders to look healthy, which explains why politicians and executives often put great effort, time, and money in their appearance. If you want to be chosen for a leadership position, looking intelligent is an optional extra under context-specific situations whereas the appearance of health appears to be important in a more context-general way across a variety of situations,” says Spisak, lead author of the paper and Assistant Professor at the Department of Management and Organization of VU University Amsterdam.

 

Source: Frontiers

Nov 20, 2014
 

People with mild cognitive impairment (MCI) are at increased risk of converting to Alzheimer’s disease within a few years, but a new study warns the risk increases significantly if they suffer from anxiety.

The findings were reported on Oct. 29 online by The American Journal of Geriatric Psychiatry, ahead of print publication, scheduled for May 2015.

Led by researchers at Baycrest Health Sciences’ Rotman Research Institute, the study has shown clearly for the first time that anxiety symptoms in individuals diagnosed with MCI increase the risk of a speedier decline in cognitive functions – independent of depression (another risk marker). For MCI patients with mild, moderate or severe anxiety, Alzheimer’s risk increased by 33%, 78% and 135% respectively.

The research team also found that MCI patients who had reported anxiety symptoms at any time over the follow-up period had greater rates of atrophy in the medial temporal lobe regions of the brain, which are essential for creating memories and which are implicated in Alzheimer’s.

Until now, anxiety had never been isolated as a potentially significant risk marker for Alzheimer’s in people diagnosed with MCI in a longitudinal study designed to gain a clearer picture of just how damaging anxiety symptoms can be to cognition and brain structure over time. There is a growing body of literature identifying late-life depression as a significant risk marker for Alzheimer’s, but anxiety has historically tended to be subsumed under the rubric of depression in psychiatry. Depression is routinely screened for in the assessment and follow-up of memory clinic patients; anxiety is not.

“Our findings suggest that clinicians should routinely screen for anxiety in people who have memory problems because anxiety signals that these people are at greater risk for developing Alzheimer’s,” said Dr. Linda Mah, principal investigator on the study, clinician-scientist with Baycrest’s Rotman Research Institute, and assistant professor in the Department of Psychiatry at the University of Toronto. Dr. Mah is also a co-investigator in a multi-site study led by the Centre for Addiction and Mental Health, and partially funded by federal dollars (Brain Canada), to prevent Alzheimer’s in people with late-life depression or MCI who are at high risk for developing the progressive brain disease.

“While there is no published evidence to demonstrate whether drug treatments used in psychiatry for treating anxiety would be helpful in managing anxiety symptoms in people with mild cognitive impairment or in reducing their risk of conversion to Alzheimer’s, we think that at the very least behavioural stress management programs could be recommended. In particular, there has been research on the use of mindfulness-based stress reduction in treating anxiety and other psychiatric symptoms in Alzheimer’s –and this is showing promise,” said Dr. Mah.

The Baycrest study accessed data from the large population-based Alzheimer’s Disease Neuroimaging Initiative to analyze anxiety, depression, cognitive and brain structural changes in 376 adults, aged 55 – 91, over a three-year period. Those changes were monitored every six months. All of the adults had a clinical diagnosis of amnestic MCI and a low score on the depression rating scale, indicating that anxiety symptoms were not part of clinical depression.

MCI is considered a risk marker for converting to Alzheimer’s disease within a few years. It is estimated that half-a-million Canadians aged 65-and-older have MCI, although many go undiagnosed. Not all MCI sufferers will convert to Alzheimer’s – some will stabilize and others may even improve in their cognitive powers.

The Baycrest study has yielded important evidence that anxiety is a “predictive factor” of whether an individual with MCI will convert to Alzheimer’s or not, said Dr. Mah. Studies have shown that anxiety in MCI is associated with abnormal concentrations of plasma amyloid protein levels and T-tau proteins in cerebrospinal fluid, which are biomarkers of Alzheimer’s. Depression and chronic stress have also been linked to smaller hippocampal volume and increased risk of dementia.

Source: Baycrest Centre for Geriatric Care

Nov 20, 2014
 

People can gauge the accuracy of their decisions, even if their decision making performance itself is no better than chance, according to a new study published in Psychological Science, a journal of the Association for Psychological Science.

In the study, people who showed chance-level decision making still reported greater confidence about decisions that turned out to be accurate and less confidence about decisions that turned out to be inaccurate. The findings suggest that the participants must have had some unconscious insight into their decision making, even though they failed to use the knowledge in making their original decision, a phenomenon the researchers call “blind insight.”

“The existence of blind insight tells us that our knowledge of the likely accuracy of our decisions — our metacognition — does not always derive directly from the same information used to make those decisions, challenging both everyday intuition and dominant theoretical models of metacognition,” says researcher Ryan Scott of the University of Sussex in the UK.

Metacognition, the ability to think about and evaluate our own mental processes, plays a fundamental role in memory, learning, self-regulation, and social interaction, and it signals marked differences in mental states, such as those seen in certain mental illnesses or altered states of consciousness.

“Consciousness research reveals many instances in which people are able to make accurate decisions without knowing it, that is, in the absence of metacognition,” says Scott. The most famous example of this is blindsight, in which people are able to discriminate visual stimuli even though they report that they can’t see the stimuli and that their discrimination judgments are mere guesses.

Scott and colleagues wanted to know whether the opposite scenario — metacognitive insight in the absence of accurate decision making — could also occur:

“We wondered: Can a person lack accuracy in their decisions but still be more confident when their decision is right than when it’s wrong?” Scott explains.

The researchers looked at data from 450 student volunteers, aged 18 to 40. The volunteers were presented with a “short-term memory task” in which they were shown strings of letters and were asked to memorize them. After the memory task, the researchers revealed that the order of the letters in the strings actually obeyed a complex set of rules.

The participants were then shown a new set of letter strings, half of which followed the same rules, and were asked to classify which of the strings were “correct.” For each string, they rated whether or not it followed the rules and how confident they were in that judgment.

To explore the relationship between decision making and metacognition, the researchers examined data from participants whose performance was at or below chance for the first 75% of the test strings (inaccurate decision makers) and data from participants who performed significantly above chance over the same proportion of trials (accurate decision makers).

Looking at the data from the remaining 25% of trials, the researchers found that, despite their overall chance-level performance, inaccurate decision makers made reliable confidence judgments about their decisions. In fact, the reliability of their confidence judgments did not differ from the reliability of confidence judgments made by accurate decision makers.
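
To make the logic of that split concrete, here is a minimal Python sketch, not the authors’ analysis code: it simulates made-up data for hypothetical participants, classifies each one as accurate or inaccurate on the first 75% of trials, and then checks on the remaining 25% whether confidence is higher for correct than for incorrect responses. The trial count, the noise level, the plain 0.5 cut-off and the simple confidence-gap index are all assumptions made for illustration (the study itself used statistical tests).

```python
# Minimal sketch of the split analysis described above, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100
split = int(n_trials * 0.75)

def simulate_participant(p_correct, insight):
    """Simulate per-trial correctness and confidence for one hypothetical participant."""
    correct = rng.random(n_trials) < p_correct
    # Confidence tracks correctness to whatever degree 'insight' allows, plus noise.
    confidence = 0.5 + insight * (correct - 0.5) + rng.normal(0, 0.15, n_trials)
    return correct, np.clip(confidence, 0, 1)

def classify_and_score(correct, confidence):
    """Classify on the first 75% of trials, score metacognition on the remaining 25%."""
    accuracy_first = correct[:split].mean()
    # The real study used significance tests; a plain 0.5 cut-off is a simplification.
    group = "accurate" if accuracy_first > 0.5 else "inaccurate"
    late_correct, late_conf = correct[split:], confidence[split:]
    # Simple metacognitive index: how much higher confidence is when right than when wrong.
    meta_gap = late_conf[late_correct].mean() - late_conf[~late_correct].mean()
    return group, accuracy_first, meta_gap

# One chance-level decision maker who nevertheless has some insight ("blind insight"),
# and one above-chance decision maker for comparison.
for label, (p, ins) in {"chance-level": (0.5, 0.4), "above-chance": (0.7, 0.4)}.items():
    group, acc, gap = classify_and_score(*simulate_participant(p, ins))
    print(f"{label}: first-75% accuracy = {acc:.2f} ({group}), confidence gap = {gap:+.2f}")
```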

In other words, the participants exhibited the opposite dissociation to blindsight: They knew when they were wrong, despite being unable to make accurate judgments. The researchers decided to name the phenomenon “blind insight” to reflect that relationship.

Taken together, these findings do not support the type of bottom-up, hierarchical model of metacognition proposed by many researchers. Using signal detection theory, such models hold that low-level sensory signals drive first-order judgments (e.g., “Is this correct?”) and, ultimately, second-order metacognitive judgments (e.g., “How confident am I about whether this is correct?”).

In this study, however, there was no reliable signal driving decision making for inaccurate decision makers; thus, according to the established models, there would be no signal available to drive second-order confidence judgments. The fact that confidence was found to be greater for correct responses demonstrates that such a hierarchical model is flawed. Based on these findings, the researchers argue that there must be other pathways that lead to metacognitive insight, and a radical revision of models of metacognition is required.
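
For readers unfamiliar with that framework, the following toy simulation (my own illustration, not material from the paper) shows why a single-signal hierarchy makes that prediction: if the decision and the confidence rating are both read off the same evidence, and that evidence carries no signal, confidence ends up just as high for wrong answers as for right ones.

```python
# Toy single-signal (bottom-up) model: decision and confidence share one evidence source.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
d_prime = 0.0                          # strength of the usable signal; 0 = pure noise
truth = rng.integers(0, 2, n)          # 1 = string follows the rules, 0 = it does not
evidence = d_prime * (truth - 0.5) + rng.normal(0, 1, n)

decision = (evidence > 0).astype(int)  # first-order judgment, driven by the evidence
confidence = np.abs(evidence)          # second-order judgment, driven by the same evidence
correct = decision == truth

print("accuracy:", round(correct.mean(), 3))                       # ~0.50
print("confidence when correct:", round(confidence[correct].mean(), 3))
print("confidence when wrong:  ", round(confidence[~correct].mean(), 3))
# The two confidence means come out essentially equal, so a single-signal hierarchy
# predicts no confidence-accuracy relationship when decisions are at chance.
```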

Source: Association for Psychological Science

Nov 20, 2014
 

Seeing the world through rose-colored glasses.

A new study has shown for the first time how people can be trained to “see” letters of the alphabet as colours in a way that simulates how those with synaesthesia experience their world.

The University of Sussex research, published today (18 November 2014) in Scientific Reports, also found that the training might potentially boost IQ.

Synaesthesia is a fascinating though little-understood neurological condition in which some people (estimated at around 1 in 23) experience an overlap in their senses. They “see” letters as specific colours, or can “taste” words, or associate sounds with different colours.

A critical debate concerns whether the condition is embedded in our genes, or whether it emerges because of particular environmental influences, such as coloured-letter toys in infancy.

While the two possibilities are not mutually exclusive, psychologists at the University’s Sackler Centre for Consciousness Science devised a nine-week training programme to see if adults without synaesthesia can develop the key hallmarks of the condition.

They found, in a sample study of 14 adults, that not only were the participants able to develop letter-colour associations strong enough to pass all the standard tests for synaesthesia, but most also experienced sensations such as letters seeming “coloured” or having individual personas (for instance, “x is boring”, “w is calm”).

One of the most surprising outcomes of the study was that those who underwent the training also saw their IQ jump by an average of 12 points, compared to a control group that didn’t undergo training.

Dr Daniel Bor, who co-led the study with Dr Nicolas Rothen, says: “The main implication of our study is that radically new ways of experiencing the world can be brought about simply through extensive perceptual training.

“The cognitive boost, although provisional, may eventually lead to clinical cognitive training tools to support mental function in vulnerable groups, such as children with attention deficit hyperactivity disorder (ADHD), or adults starting to suffer from dementia.”

Dr Rothen adds: “It should be emphasised that we are not claiming to have trained non synaesthetes to become genuine synaesthetes. When we retested our participants three months after training, they had largely lost the experience of ‘seeing’ colours when thinking about the letters. But it does show that synaesthesia is likely to have a major developmental component, starting for many people in childhood.”

 

Source: University of Sussex

Nov 20, 2014
 

When we look at our lives, we tend to break them up into chapters, rather like the seasons of a TV box set. Potential dividers come in many forms, including the dawn of a new year, or the start of a new job. But if those events act as a marker between episodes, it is the decades of our lives that represent the more profound end of one series or season and the start of the next, the British Psychological Society reports.

According to the psychologists Adam Alter and Hal Hershfield, when we’re on the cusp of one of these boundaries – in other words, when our age ends in a “9”, such as 29, 39, 49 or 59 – we are particularly prone to reflect on the meaning of our lives. If we don’t like what we see, their new results suggest we take drastic action, either fleeing life’s emptiness, or setting ourselves new goals.

The pair began by looking at data from the World Values Survey. Based on answers from 42,063 adults across 100 nations, they found that people with an age ending in 9 (the researchers call these people “9-enders”) were more likely than people of other ages to say that they spent time thinking about the meaning and purpose of their lives.

In another study, participants prompted to imagine and write about how they would feel the night before entering a new decade, tended to say they would think about the meaning of their life more than did other participants who’d been prompted to write about the night before their next birthday, or to write about tomorrow.

At the dawn of a new decade, how does this focus on life’s meaning affect our behaviour? Alter and Hershfield say that for some people it can lead to “maladaptive behaviours”. They looked at data from an online dating website that caters for people who are seeking extramarital affairs. Among over 8 million male users of the site, 9-enders were over-represented by 17.88 per cent relative to what you’d expect if participation were randomly distributed by age. The same was true, though to a lesser extent, for female users of the site.
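
For readers wondering how an “over-represented by X per cent” figure is arrived at, here is a minimal arithmetic sketch with invented counts. It assumes a uniform one-in-ten expected share of ages ending in 9, whereas the researchers based their expectation on the site’s actual age distribution, so the numbers below only illustrate the form of the calculation.

```python
# Minimal arithmetic sketch of an over-representation percentage.
# Counts are invented; the expected share is simplified to a uniform 1-in-10.
observed_9_enders = 1_200      # hypothetical number of users whose age ends in 9
total_users = 10_000           # hypothetical total sample

observed_share = observed_9_enders / total_users   # 0.12
expected_share = 1 / 10                             # if every final age digit were equally likely

over_representation = (observed_share - expected_share) / expected_share * 100
print(f"9-enders over-represented by {over_representation:.1f}%")   # 20.0% in this toy example
```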

For some people, the self-reflection triggered by the prospect of entering a new decade is more than they can bear. Alter and Hershfield also examined suicide data collected between 2000 and 2011 by the US Centers for Disease Control and Prevention. They found that 9-enders take their own lives with greater frequency than people whose ages end in any other digit.

It seems the “crisis of meaning” triggered by the prospect of a new decade can also lead people to set themselves new goals. When the researchers looked at data on the Athlinks website, they found that among 500 first-time marathon runners, 9-enders were over-represented by 48 per cent. The same site also contained evidence of 9-enders investing greater effort into their training and performance. Focusing on data from runners in their twenties, thirties and early forties who’d run a marathon at the end of a decade and also in the preceding and following two years, the researchers found that people achieved better times, by an average of 2.3 per cent, when they were aged 29 or 39 than when they were one or two years younger.

The researchers said there’s a growing literature that suggests “although people age continually, the passage of time is more likely to influence their thoughts and actions at some ages than others.” They added: “Here we find that people are significantly more likely to consider whether their lives are meaningful as they approach the start of a new decade.”

Nov 20, 2014
 

A Georgia Tech professor is offering an alternative to the celebrated “Turing Test” to determine whether a machine or computer program exhibits human-level intelligence. The Turing Test – originally called the Imitation Game – was proposed by computing pioneer Alan Turing in 1950. In practice, some applications of the test require a machine to engage in dialogue and convince a human judge that it is an actual person.

Creating certain types of art also requires intelligence, observed Mark Riedl, an associate professor in the School of Interactive Computing at Georgia Tech, prompting him to consider whether that might lead to a better gauge of whether a machine can replicate human thought.

“It’s important to note that Turing never meant for his test to be the official benchmark as to whether a machine or computer program can actually think like a human,” Riedl said. “And yet it has, and it has proven to be a weak measure because it relies on deception. This proposal suggests that a better measure would be a test that asks an artificial agent to create an artifact requiring a wide range of human-level intelligent capabilities.”

To that end, Riedl has created the Lovelace 2.0 Test of Artificial Creativity and Intelligence.

For the test, the artificial agent passes if it develops a creative artifact from a subset of artistic genres deemed to require human-level intelligence and the artifact meets certain creative constraints given by a human evaluator. Further, the human evaluator must determine that the object is a valid representative of the creative subset and that it meets the criteria. The created artifact needs only meet these criteria but does not need to have any aesthetic value. Finally, a human referee must determine that the combination of the subset and criteria is not an impossible standard.
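
Read as a procedure, the description above might be sketched roughly as follows. This is one informal reading, not Riedl’s formal definition, and the names used here (Challenge, is_fair, is_valid_instance, meets_constraints, agent.create) are placeholders invented for illustration.

```python
# Rough sketch of one round of the Lovelace 2.0 evaluation, as described above.
from dataclasses import dataclass, field

@dataclass
class Challenge:
    genre: str                                        # e.g. "short story", "poem" - the creative subset
    constraints: list = field(default_factory=list)   # creative constraints set by the human evaluator

def lovelace_2_round(agent, evaluator, referee, challenge: Challenge) -> bool:
    """Return True if the artificial agent passes this round of the test."""
    # A human referee first confirms the genre/constraint combination is not an
    # impossible standard.
    if not referee.is_fair(challenge.genre, challenge.constraints):
        return False

    artifact = agent.create(challenge.genre, challenge.constraints)

    # The human evaluator judges whether the artifact is a valid representative of
    # the genre and whether it meets the constraints. Aesthetic value is not required.
    return (evaluator.is_valid_instance(artifact, challenge.genre)
            and evaluator.meets_constraints(artifact, challenge.constraints))
```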

The Lovelace 2.0 Test stems from the original Lovelace Test as proposed by Bringsjord, Bello and Ferrucci in 2001. The original test required that an artificial agent produce a creative item in such a way that the agent’s designer cannot explain how it developed the creative item. The item, thus, must be created in such a way that it is valuable, novel and surprising.

Riedl contends that the original Lovelace test does not establish clear or measurable parameters. Lovelace 2.0, however, enables the evaluator to work with defined constraints without making value judgments such as whether the artistic object created surprise.

Riedl’s paper will be presented at Beyond the Turing Test, an Association for the Advancement of Artificial Intelligence (AAAI) workshop to be held January 25 – 29, 2015, in Austin, Texas.

Source: Georgia Institute of Technology

Nov 20, 2014
 

Karen Rudolph, professor of psychology at Illinois, whose new study indicates that boys and girls who mature early are at higher risk of several adverse outcomes, including depression. Photo by L. Brian Stauffer.

Youth who enter puberty ahead of their peers are at heightened risk of depression, although the disease develops differently in girls than in boys, a new study suggests.

Early maturation triggers an array of psychological, social-behavioral and interpersonal difficulties that predict elevated levels of depression in boys and girls several years later, according to research led by psychology professor Karen D. Rudolph at the University of Illinois.

Rudolph and her colleagues measured pubertal timing and tracked levels of depression among more than 160 youth over a four-year period. During their early teenage years, the youth in the study completed annual questionnaires and interviews that assessed their psychological risk factors, interpersonal stressors and coping behaviors. Parents also reported on their children’s social relationships and difficulties.

Published online by the journal Development and Psychopathology, the study is one of the first research projects to confirm that early puberty heightens risk for depression in both sexes over time and to explain the underlying mechanisms.

“It is often believed that going through puberty earlier than peers only contributes to depression in girls,” Rudolph said. “We found that early maturation can also be a risk for boys as they progress through adolescence, but the timing is different than in girls.”

Youth who entered puberty ahead of their peers were vulnerable to a number of risks that were associated with depression. They had poorer self-images; greater anxiety; social problems, including conflict with family members and peers; and tended to befriend peers who were prone to getting into trouble, the researchers found.

Levels of depression among early-maturing girls were elevated at the beginning of the study and remained stable over the next three years. These adverse effects were persistent in early maturing girls, who remained at a distinct disadvantage, even as peers caught up to them in physical development, Rudolph said.

“In girls, early maturation seems to trigger immediate psychological and environmental risks and consequent depression,” Rudolph said. “Pubertal changes cause early maturing girls to feel badly about themselves, cope less effectively with social problems, affiliate with deviant peers, enter riskier and more stressful social contexts and experience disruption and conflict within their relationships.”

Early maturation did not appear to have these immediate adverse effects on boys, who showed significantly lower levels of depression at the outset than their female counterparts. However, these differences dissipated over time, such that by the end of the fourth year, early maturing boys didn’t differ significantly from their female counterparts in their levels of depression.

“While early maturation seemed to protect boys from the challenges of puberty initially, boys experienced an emerging cascade of personal and contextual risks – negative self-image, anxiety, social problems and interpersonal stress – that eventuated in depression as they moved through adolescence,” Rudolph said.

Although the study examined the risk factors as independent measures, it’s possible that these elements mutually reinforce each other over time, the researchers said.

“But it’s important to note, as we find in our work, that only some teens are vulnerable to the effects of early maturation, particularly those with more disruption in their families and less support in their peer relationships,” Rudolph said.

Psychology professors Sharon F. Lambert of George Washington University; Misaki N. Natsuaki of University of California, Riverside; and Wendy Troop-Gordon of North Dakota State University were co-authors on the study.

Source:  University of Illinois
