Jul 31 2014

The shocking news of an Ohio teen who died of a caffeine overdose in May highlighted the potential dangers of the normally well-tolerated and mass-consumed substance. To help prevent serious health problems that can arise from consuming too much caffeine, scientists are reporting progress toward a rapid, at-home test to detect even low levels of the stimulant in most beverages and even breast milk.

Their report appears in ACS’ Journal of Agricultural and Food Chemistry.

Mani Subramanian and colleagues note that caffeine’s popularity as a “pick-me-up” has led to it being added to more than 570 beverages and 150 food products, including gums and jelly beans. It also comes in a pure powder form that consumers can use to spike drinks and food themselves. In small amounts, most people can handle caffeine without a problem. But excessive doses can lead to serious health problems, including insomnia, hallucinations, vitamin deficiency, several types of cancer and, in rare cases, death. Subramanian’s team wanted to develop a quick and easy way for consumers to determine whether the caffeine levels in their foods and drinks fall within a safe range.

They tested an enzyme called caffeine dehydrogenase and found that it could detect caffeine in a variety of drinks — with the exception of teas — within one minute. Also, it was sensitive enough to pick up on caffeine’s presence at concentrations as low as 1 to 5 parts per million, the maximum limit the Food and Drug Administration advises for nursing mothers. They say that their method could be integrated into a dip-stick type of test, like over-the-counter pregnancy tests, that could be used at home.
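To put that 1 to 5 ppm detection floor in perspective, 1 ppm in a water-based drink corresponds to roughly 1 mg of caffeine per litre. A quick back-of-the-envelope sketch (the serving sizes and caffeine amounts below are typical published values, not figures from the study):

```python
# Rough comparison of the test's 1-5 ppm detection floor with typical
# drink concentrations (1 ppm ~= 1 mg/L for water-based beverages).
caffeine_mg = {"brewed coffee (240 mL)": 95, "cola (355 mL)": 34}
serving_ml = {"brewed coffee (240 mL)": 240, "cola (355 mL)": 355}

def concentration_ppm(name):
    """Caffeine concentration in mg/L, i.e. approximately ppm."""
    return caffeine_mg[name] / (serving_ml[name] / 1000.0)

for name in caffeine_mg:
    print(f"{name}: ~{concentration_ppm(name):.0f} ppm")
```

Ordinary caffeinated drinks sit well above the test’s sensitivity limit, which is what makes a simple yes/no dip-stick readout plausible.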

Source: American Chemical Society

Jul 31 2014

Researchers at Inserm, led by Claude Gronfier, have, for the first time, conducted a study under real conditions on the body clocks of members of the international polar research station Concordia. The researchers have shown that a particular kind of artificial light can ensure that biological rhythms remain correctly synchronised despite the absence of sunlight. The significance of this result becomes clear with the knowledge that disturbance to this biological clock causes problems with sleep and alertness, cardiovascular problems and even depression.

These results, published in PLOS ONE, could be rapidly translated into practical applications for working environments that are dimly to moderately lit (polar research stations, thermal and nuclear power stations, space missions, offices with no windows, etc.). They could enable the design of lighting strategies intended to maintain the health, productivity and safety of staff.
The system that allows our body to regulate a number of vital functions over a period of about 24 hours is called the body clock (or circadian clock). Located deep within the brain, it consists of 20,000 neurons whose pulsatile activity controls the sleep/wake cycle, body temperature, heart rate, the release of hormones, etc. The cycle determined by the internal clock spontaneously lasts between 23.5 and 24.5 hours, depending on the individual. In order to function correctly, it relies on signals it receives from the external world, which it interprets as cues to resynchronise itself every 24 hours.
This is why the intake of food, physical exercise and the external temperature, for example, are said to be ‘time setters’. The most important ‘time setter’, however, is light. After inappropriate exposure to light, your entire body clock is thrown out of order with consequences for cognitive functions, sleep, alertness, memory, cardiovascular functions, etc.
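Because the free-running period deviates from 24 hours, an unsynchronised clock drifts a little further out of phase each day; light exposure is what resets that accumulating error. A minimal sketch of the drift, assuming an individual at the 24.5-hour end of the range quoted above:

```python
# Accumulated phase drift of an unsynchronised body clock relative to
# the 24-hour day, for an intrinsic period of 24.5 hours.
intrinsic_period_h = 24.5
daily_drift_h = intrinsic_period_h - 24.0  # runs 0.5 h late each day

def drift_after(days):
    """Total phase drift in hours after `days` without resynchronisation."""
    return daily_drift_h * days

for d in (1, 7, 14):
    print(f"day {d}: clock runs {drift_after(d):.1f} h behind the sun")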

 
For the first time, scientists have been able to study under real conditions how various types of artificial light influence the way the biological clock behaves in situations where the natural light is insufficient. For nine weeks of the polar winter (no sunlight during the day), the staff of the international polar station Concordia were alternately exposed to a standard white light and a white light enriched with blue wavelengths (a particular kind of fluorescent light that is perceived as white by the visual system). For the purposes of the study, the researchers asked the staff not to change their day-to-day habits, particularly the times they got up and went to bed.

Once a week, samples of saliva were taken in order to measure the levels of melatonin (the central circadian hormone) secreted by each individual.
The results show that increased sleep, better reaction times and greater motivation were observed during the ‘blue’ weeks. Moreover, while the circadian rhythm tended to shift during the ‘white’ weeks, no disturbance in rhythm was observed during the ‘blue’ weeks. In addition, the effects did not diminish with the passage of time.

On a general level, the study shows that an optimised light spectrum enriched with short wavelengths (blue) can enable the circadian system to synchronise correctly and non-visual functions to be activated in extreme situations where sunlight is not available for long stretches of time.

The effectiveness of such lighting is due to the activation of melanopsin-containing ganglion cells discovered in 2002 in the retina. These photoreceptor cells are basically essential to the transmission of light information to a large number of so-called ‘non-visual’ centres in the brain.

 

‘Although the benefits of “blue light” for the biological clock have already been demonstrated in the past, all the studies were conducted under conditions that are difficult to reproduce under real conditions’, explained Claude Gronfier, the main author of this work.
These results could quickly lead to practical applications. In working environments where the intensity of the light is not sufficient (polar research stations, thermal and nuclear power stations, space missions, offices with no windows, etc.), they could enable the design of lighting strategies intended to maintain the health, productivity and safety of staff.

‘Beyond a professional context, we envisage this strategy more broadly as a practical approach to the treatment of problems with the circadian rhythms of sleep and non-visual functions in conditions where the lighting is not optimal.’

 

 

What should be remembered from this work is as follows:

White light enriched with blue is more effective than the standard white light that is found in offices and homes for the purpose of synchronising the biological clock and activating the non-visual functions that are essential to the correct functioning of the body. It is thus not necessary to use blue lights or even LEDs to obtain positive effects.

The effectiveness of this light does not require the high levels of illumination used in photic approaches to the treatment of circadian rhythm sleep problems or seasonal affective disorder (5,000 to 10,000 lux are recommended in those approaches).

Due to its effectiveness, this light does not require dedicated exposure sessions (sessions of between 30 minutes and two hours are recommended in the photic approaches previously mentioned). In this study, the light came simply from the ordinary lighting of the rooms being used.

The effects of this lighting approach do not disappear with the passage of time. This study shows that the effects are the same from the first to the ninth week of observation.


Composition of standard white light and light enriched with blue. On the left, the spectrum of the white light consists of roughly equal parts of red and green (about 40% each), then blue (12%) and infra-red (4%). On the right, the proportions are different (42% blue and 14% red). Nevertheless, with the naked eye, a human will perceive a white light in both cases.

Source: Inserm

Jul 31 2014

A study of high school students by University of Adelaide psychology researchers has shed new light on the links between insomnia and related mental health conditions among teens.

School of Psychology PhD student Pasquale Alvaro surveyed more than 300 Australian high school students aged 12-18 to better understand their sleep habits, mental health condition and the time of day they were most active (known as their “chronotype”).

The results, now published in the journal Sleep Medicine, may have implications for the clinical treatment of teens experiencing sleep and mental health issues.

“People with insomnia find it difficult to fall asleep or stay asleep for as long as they need to. This is a widespread sleep disorder among the general public, and in most countries about 11% of teens aged 13-16 years experience insomnia at some stage,” Mr Alvaro says.

“There is a growing awareness among the scientific community that insomnia, depression and anxiety disorders are linked with each other, and these disorders contain overlapping neurobiological, psychological, and social risk factors.

“Having insomnia in addition to anxiety or depression can further intensify the problems being experienced with each individual disorder. It can lead to such problems as alcohol and drug misuse during adolescence,” he says.

Mr Alvaro’s study found that the presence of insomnia was independently linked with depression, generalized anxiety disorder and panic disorder among teens.

Teens who were more active in the evenings were more likely to have depression and/or insomnia. This group was also more likely to have obsessive-compulsive disorder, separation anxiety, and social phobia, although these disorders were often not independently linked with insomnia.

“These findings suggest that the ‘eveningness’ chronotype – being more active in the evenings – is an independent risk factor for insomnia and depression. This is important because adolescents tend to develop a preference for evenings, which sometimes becomes a syndrome whereby they keep delaying going to sleep,” Mr Alvaro says.

“Based on our evidence, we believe that prevention and treatment efforts for insomnia and depression should consider this combination of mental health, sleep, and the eveningness chronotype, in addition to current mainstream behavioral approaches. Prevention and treatment efforts for anxiety subtypes should also consider focusing on insomnia and depression.”

Source: University of Adelaide 

Jul 31 2014

Johns Hopkins researchers say they have discovered a chemical alteration in a single human gene linked to stress reactions that, if confirmed in larger studies, could give doctors a simple blood test to reliably predict a person’s risk of attempting suicide.

The discovery, described online in The American Journal of Psychiatry, suggests that changes in a gene involved in the brain’s response to stress hormones play a significant role in turning what might otherwise be an unremarkable reaction to the strain of everyday life into suicidal thoughts and behaviors.

“Suicide is a major preventable public health problem, but we have been stymied in our prevention efforts because we have no consistent way to predict those who are at increased risk of killing themselves,” says study leader Zachary Kaminsky, Ph.D., an assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. “With a test like ours, we may be able to stem suicide rates by identifying those people and intervening early enough to head off a catastrophe.”

For his series of experiments, Kaminsky and his colleagues focused on a genetic mutation in a gene known as SKA2. By looking at brain samples from mentally ill and healthy people, the researchers found that in samples from people who had died by suicide, levels of SKA2 were significantly reduced.

Within this common mutation, they then found in some subjects an epigenetic modification that altered the way the SKA2 gene functioned without changing the gene’s underlying DNA sequence. The modification added chemicals called methyl groups to the gene. Higher levels of methylation were then found in the same study subjects who had killed themselves. The higher levels of methylation among suicide decedents were then replicated in two independent brain cohorts.

In another part of the study, the researchers tested three different sets of blood samples; the largest, involving 325 participants in the Johns Hopkins Center for Prevention Research Study, showed similar methylation increases at SKA2 in individuals with suicidal thoughts or attempts. They then designed a model analysis that predicted which of the participants were experiencing suicidal thoughts or had attempted suicide with 80 percent certainty. Those at more severe risk of suicide were predicted with 90 percent accuracy. In the youngest data set, the researchers were able to identify with 96 percent accuracy whether or not a participant had attempted suicide, based on blood test results.
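The study’s actual model is more elaborate (it combines SKA2 methylation with stress and anxiety measures), but the core idea of thresholding a biomarker and scoring how well it separates groups can be sketched as follows. All numbers here are synthetic, for illustration only:

```python
# Toy biomarker-threshold classifier, NOT the study's actual model:
# synthetic SKA2 methylation fractions for two groups, classified by
# a simple cutoff, with accuracy scored against the known labels.
attempters = [0.62, 0.71, 0.68, 0.74, 0.59]      # synthetic values
non_attempters = [0.41, 0.38, 0.52, 0.44, 0.47]  # synthetic values

def accuracy(cutoff):
    """Fraction of synthetic subjects correctly labelled by `cutoff`."""
    hits = sum(m >= cutoff for m in attempters)
    hits += sum(m < cutoff for m in non_attempters)
    return hits / (len(attempters) + len(non_attempters))

best_acc, best_cut = max((accuracy(c / 100), c / 100) for c in range(30, 80))
print(f"best cutoff {best_cut:.2f} gives accuracy {best_acc:.0%}")
```

A real predictive model would of course be fit and validated on separate samples; this sketch only illustrates why higher methylation in one group makes a blood-based classifier conceivable.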

The SKA2 gene is expressed in the prefrontal cortex of the brain, which is involved in inhibiting negative thoughts and controlling impulsive behavior. SKA2 is specifically responsible for chaperoning stress hormone receptors into cells’ nuclei so they can do their job. If there isn’t enough SKA2, or it is altered in some way, the stress hormone receptor is unable to suppress the release of cortisol throughout the brain. Previous research has shown that such cortisol release is abnormal in people who attempt or die by suicide.

Kaminsky says a test based on these findings might best be used to predict future suicide attempts in those who are ill, to restrict lethal means or methods among those at risk, or to make decisions regarding the intensity of intervention approaches.

He says that it might make sense for use in the military to test whether members have the gene mutation that makes them more vulnerable. Those at risk could be more closely monitored when they returned home after deployment. A test could also be useful in a psychiatric emergency room, he says, as part of a suicide risk assessment when doctors try to assess level of suicide risk.

The test could be used in all sorts of safety assessment decisions like the need for hospitalization and closeness of monitoring. Kaminsky says another possible use that needs more study could be to inform treatment decisions, such as whether or not to give certain medications that have been linked with suicidal thoughts.

“We have found a gene that we think could be really important for consistently identifying a range of behaviors from suicidal thoughts to attempts to completions,” Kaminsky says. “We need to study this in a larger sample but we believe that we might be able to monitor the blood to identify those at risk of suicide.”

Source: Johns Hopkins Medicine

 

Jul 31 2014

 

Illustration by Barbara Nicholson.


Media and marketing experts have long sought a reliable method of forecasting responses from the general population to future products and messages. According to a study conducted at The City College of New York, it appears that the brain responses of just a few individuals are a remarkably strong predictor.

By analyzing the brainwaves of 16 individuals as they watched mainstream television content, researchers were able to accurately predict the preferences of large TV audiences, with up to 90% accuracy in the case of Super Bowl commercials. The findings appear in a paper entitled “Audience Preferences Are Predicted by Temporal Reliability of Neural Processing,” published July 29, 2014, in Nature Communications.

“Alternative methods, such as self-reports are fraught with problems as people conform their responses to their own values and expectations,” said Dr. Jacek Dmochowski, lead author of the paper and a postdoctoral fellow at City College during the research. However, brain signals measured using electroencephalography (EEG) can, in principle, alleviate this shortcoming by providing immediate physiological responses immune to such self-biasing. “Our findings show that these immediate responses are in fact closely tied to the subsequent behavior of the general population,” he added.

Dr. Lucas Parra, Herbert Kayser Professor of Biomedical Engineering in CCNY’s Grove School of Engineering and the paper’s senior author, explained: “When two people watch a video, their brains respond similarly – but only if the video is engaging. Popular shows and commercials draw our attention and make our brainwaves very reliable; the audience is literally ‘in-sync’.”

In the study, participants watched scenes from “The Walking Dead” TV show and several commercials from the 2012 and 2013 Super Bowls. EEG electrodes were placed on their heads to capture brain activity. The reliability of the recorded neural activity was then compared to audience reactions in the general population using publicly available social media data provided by the Harmony Institute and ratings from the “USA Today’s” Super Bowl Ad Meter.

“Brain activity among our participants watching “The Walking Dead” predicted 40% of the associated Twitter traffic,” said Professor Parra. “When brainwaves were in agreement, the number of tweets tended to increase.” Brainwaves also predicted 60% of the Nielsen Ratings that measure the size of a TV audience.

The study was even more accurate (90 percent) when comparing preferences for Super Bowl ads. For instance, researchers saw very similar brainwaves from their participants as they watched a 2012 Budweiser commercial that featured a beer-fetching dog. The general public voted the ad as their second favorite that year. The study found little agreement in the brain activity among participants when watching a GoDaddy commercial featuring a kissing couple. It was among the worst rated ads in 2012. 
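The “reliability” the researchers describe is, in essence, inter-subject correlation: how similarly different viewers’ EEG traces rise and fall over time. A minimal sketch of that computation on toy time series (synthetic data; the real study used multi-channel EEG and more sophisticated component analysis):

```python
# Inter-subject correlation: average pairwise Pearson correlation of
# viewers' neural time series. Engaging content -> correlated traces.
from itertools import combinations

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def inter_subject_correlation(traces):
    """Mean pairwise correlation across all viewer pairs."""
    pairs = list(combinations(traces, 2))
    return sum(pearson(a, b) for a, b in pairs) / len(pairs)

# Three viewers with near-identical responses vs. three uncorrelated ones:
engaging = [[1, 3, 2, 5, 4], [1.1, 2.9, 2.2, 4.8, 4.1], [0.9, 3.1, 1.8, 5.2, 3.9]]
boring = [[1, 3, 2, 5, 4], [4, 1, 5, 2, 3], [2, 2, 1, 4, 5]]
print(inter_subject_correlation(engaging))  # near 1: viewers "in sync"
print(inter_subject_correlation(boring))    # much lower
```

The study’s key step was then correlating this per-clip reliability score with population-level measures such as tweet volume, Nielsen ratings and ad rankings.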

The CCNY researchers collaborated with Dr. Matthew Bezdek and Dr. Eric Schumacher from the Georgia Institute of Technology to identify which brain regions are involved and explain the underlying mechanisms. Using functional magnetic resonance imaging (fMRI), they found evidence that brainwaves for engaging ads could be driven by activity in visual, auditory and attention brain areas. 

“Interesting ads may draw our attention and cause deeper sensory processing of the content,” said Dr. Bezdek, a postdoctoral researcher at Georgia Tech’s School of Psychology. 

Apart from applications to marketing and film, Professor Parra is investigating whether this measure of attentional draw can be used to diagnose neurological disorders such as attention deficit disorder or mild cognitive decline. Another potential application is to predict the effectiveness of online educational videos by measuring how engaging they are.

Also see our other story on this research 

Source: The City College of New York

Jul 30 2014

A team of researchers from Warwick Business School and Boston University has developed a method to automatically identify topics that people search for on Google before subsequent stock market falls.

Applied to data between 2004 and 2012, the method shows that increases in searches for business and politics preceded falls in the stock market. The study, ‘Quantifying the semantics of search behavior before stock market moves,’ was published in the Proceedings of the National Academy of Sciences.

The researchers suggest that this method could be applied to help identify warning signs in search data before a range of real world events.

“Search engines, such as Google, record almost everything we search for,” said Chester Curme, Research Fellow at Warwick Business School and lead author of the study. “Records of these search queries allow us to learn about how people gather information online before making decisions in the real world. So there’s potential to use these search data to anticipate what large groups of people may do.

“However, the number of possible things people could search for is huge. So an important challenge is to identify what types of words may be relevant to behaviours of interest.”

In previous studies, Curme and his colleagues, Tobias Preis and Suzy Moat of Warwick Business School, and H. Eugene Stanley of Boston University, have demonstrated that usage data from Google and Wikipedia may contain early warning signs of stock market moves. However, these findings relied on the researchers choosing an appropriate set of keywords, in particular those related to finance.

In order to enable algorithms to automatically identify patterns in search activity that might be related to subsequent real world behaviour, the team quantified the meaning of every single word on Wikipedia. This allowed the researchers to categorize words into topics, so that a “business” topic may contain words such as “business”, “management”, and “bank”. The algorithm identified a broad selection of topics, ranging from food to architecture to cricket.

The team then used Google Trends to see how often each week thousands of these words were searched for by Internet users in the United States between 2004 and 2012. By using these search activity datasets in a simple trading strategy for the S&P 500, they found that changes in how often users searched for terms relating to business and politics could be connected to subsequent stock market moves.
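The “simple trading strategy” referenced here follows the group’s earlier Google Trends work: compare this week’s search volume for a topic with its recent trailing average, short the index for a week if interest rose (anticipating a fall), and go long if it fell. A sketch of that rule on synthetic data (the window length and series are illustrative assumptions, not the paper’s parameters):

```python
# Sketch of a search-volume trading rule: if this week's search
# interest exceeds its trailing 3-week average, short the index for a
# week; otherwise go long. Synthetic weekly data only.
def strategy_return(search_volume, prices, window=3):
    """Cumulative return of the long/short rule over the series."""
    total = 0.0
    for t in range(window, len(prices) - 1):
        trailing_avg = sum(search_volume[t - window:t]) / window
        weekly_move = (prices[t + 1] - prices[t]) / prices[t]
        if search_volume[t] > trailing_avg:
            total -= weekly_move  # short: profit if the market falls
        else:
            total += weekly_move  # long: profit if the market rises
    return total

# Synthetic example where spikes in searches precede price falls:
searches = [10, 10, 10, 20, 10, 10, 25, 10]
prices = [100, 100, 100, 100, 95, 95, 95, 90]
print(f"cumulative return: {strategy_return(searches, prices):+.3f}")
```

In the synthetic series the two search spikes each precede a price drop, so the rule profits; the paper’s contribution was showing which real search topics (business, politics) made such a rule profitable historically.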

“By mining these datasets, we were able to identify a historic link between rises in searches for terms for both business and politics, and a subsequent fall in stock market prices,” said Suzy Moat, Assistant Professor of Behavioural Science at Warwick Business School.

“No other topic was linked to returns that were significantly higher than those generated by randomly buying and selling. The finding that political terms were of use in our trading strategies, as well as more obvious financial terms, provides evidence that valuable information may be contained in search engine data for keywords with less obvious semantic connections to events of interest. Our method provides a new approach for identifying such keywords.”

Moat continued, “Our results are in line with the hypothesis that increases in searches relating to both politics and business could be a sign of concern about the state of the economy, which may lead to decreased confidence in the value of stocks, resulting in transactions at lower prices.”

“Our results provide evidence of a relationship between the search behaviour of Google users and stock market movements,” said Tobias Preis, Associate Professor of Behavioural Science and Finance at Warwick Business School. “However, our analysis found that the strength of this relationship, using this very simple weekly trading strategy, has diminished in recent years. This potentially reflects the increasing incorporation of Internet data into automated trading strategies, and highlights that more advanced strategies are now needed to fully exploit online data in financial trading.”

“We believe that follow-up analyses incorporating data at a finer time granularity, or using other types of online data, could shed light on how the relationships we uncover have evolved in time,” said Curme.

Curme added, “While our investigation used stock market movements as a case study, these methods could in principle be applied to create predictive models for a wide range of other events.”

Source: University of Warwick

Jul 30 2014

An evolutionarily ancient and tiny part of the brain tracks expectations about nasty events, finds new UCL research.

The study, published in Proceedings of the National Academy of Sciences, demonstrates for the first time that the human habenula, half the size of a pea, tracks predictions about negative events, like painful electric shocks, suggesting a role in learning from bad experiences.

Brain scans from 23 healthy volunteers showed that the habenula activates in response to pictures associated with painful electric shocks, with the opposite occurring for pictures that predicted winning money.

Previous studies in animals have found that habenula activity leads to avoidance as it suppresses dopamine, a brain chemical that drives motivation. In animals, habenula cells have been found to fire when bad things happen or are anticipated.

“The habenula tracks our experiences, responding more the worse something is expected to be,” says senior author Dr Jonathan Roiser of the UCL Institute of Cognitive Neuroscience. “For example, the habenula responds much more strongly when an electric shock is almost certain than when it is unlikely. In this study we showed that the habenula doesn’t just express whether something leads to negative events or not; it signals quite how much bad outcomes are expected.”

During the experiment, healthy volunteers were placed inside a functional magnetic resonance imaging (fMRI) scanner, and brain images were collected at high resolution because the habenula is so small. Volunteers were shown a random sequence of pictures each followed by a set chance of a good or bad outcome, occasionally pressing a button simply to show they were paying attention. Habenula activation tracked the changing expectation of bad and good events.

“Fascinatingly, people were slower to press the button when the picture was associated with getting shocked, even though their response had no bearing on the outcome.” says lead author Dr Rebecca Lawson, also at the UCL Institute of Cognitive Neuroscience. “Furthermore, the slower people responded, the more reliably their habenula tracked associations with shocks. This demonstrates a crucial link between the habenula and motivated behaviour, which may be the result of dopamine suppression.”

The habenula has previously been linked to depression, and this study shows how it could be involved in causing symptoms such as low motivation, pessimism and a focus on negative experiences. A hyperactive habenula could cause people to make disproportionately negative predictions.

“Other work shows that ketamine, which has profound and immediate benefits in patients who failed to respond to standard antidepressant medication, specifically dampens down habenula activity,” says Dr Roiser. “Therefore, understanding the habenula could help us to develop better treatments for treatment-resistant depression.”

Source: University College London

Jul 30 2014

Researchers from the University of Bradford have devised a simple blood test that can be used to diagnose whether people have cancer or not.

The test will enable doctors to rule out cancer in patients presenting with certain symptoms, saving time and preventing costly and unnecessary invasive procedures such as colonoscopies and biopsies being carried out. Alternatively, it could be a useful aid for investigating patients who are suspected of having a cancer that is currently hard to diagnose.

Early results have shown the method gives a high degree of accuracy diagnosing cancer and pre-cancerous conditions from the blood of patients with melanoma, colon cancer and lung cancer.  The research is published online in FASEB Journal, the US journal of the Federation of American Societies for Experimental Biology.

The Lymphocyte Genome Sensitivity (LGS) test looks at white blood cells and measures the damage caused to their DNA when subjected to different intensities of ultraviolet light (UVA), which is known to damage DNA. The results of the empirical study show a clear distinction between the damage to the white blood cells from patients with cancer, with pre-cancerous conditions and from healthy patients.

Professor Diana Anderson, from the University’s School of Life Sciences, led the research. She said: “White blood cells are part of the body’s natural defence system. We know that they are under stress when they are fighting cancer or other diseases, so I wondered whether anything measurable could be seen if we put them under further stress with UVA light. We found that people with cancer have DNA which is more easily damaged by ultraviolet light than other people, so the test shows the sensitivity to damage of all the DNA – the genome – in a cell.”

The study looked at blood samples taken from 208 individuals. Ninety-four healthy individuals were recruited from staff and students at the University of Bradford and 114 blood samples were collected from patients referred to specialist clinics within Bradford Royal Infirmary prior to diagnosis and treatment. The samples were coded, anonymised, randomised and then exposed to UVA light through five different depths of agar.

The UVA damage was observed in the form of pieces of DNA being pulled in an electric field towards the positive end of the field, causing a comet-like tail. In the LGS test, the longer the tail the more DNA damage, and the measurements correlated to those patients who were ultimately diagnosed with cancer (58), those with pre-cancerous conditions (56) and those who were healthy (94).
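The comet-assay readout lends itself to a simple summary: average tail length per group, with longer tails indicating more UVA-induced DNA damage. A toy sketch of that summary (the tail lengths are invented for illustration; the study’s real measurements are not reproduced here):

```python
# Toy comet-assay summary with synthetic tail lengths (arbitrary
# units). The study's qualitative finding was an ordering of damage:
# cancer > pre-cancerous > healthy.
groups = {
    "cancer": [48, 52, 50, 55, 47],
    "pre-cancerous": [35, 38, 33, 36, 40],
    "healthy": [20, 22, 18, 25, 21],
}

def mean_tail(group):
    """Mean comet-tail length for one diagnostic group."""
    lengths = groups[group]
    return sum(lengths) / len(lengths)

for name in groups:
    print(f"{name}: mean tail length {mean_tail(name):.1f}")
```

The actual study tested whether such group differences were statistically significant across 208 coded, randomised samples, rather than relying on raw group means.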

“These are early results completed on three different types of cancer and we accept that more research needs to be done; but these results so far are remarkable,” said Professor Anderson. “Whilst the numbers of people we tested are, in epidemiological terms, quite small, in molecular epidemiological terms, the results are powerful. We’ve identified significant differences between the healthy volunteers, suspected cancer patients and confirmed cancer patients of mixed ages at a statistically significant level of P<0.001. This means that the possibility of these results happening by chance is 1 in 1000. We believe that this confirms the test’s potential as a diagnostic tool.”

Professor Anderson believes that if the LGS proves to be a useful cancer diagnostic test, it would be a highly valuable addition to the more traditional investigative procedures for detecting cancer.

A clinical trial is currently underway at Bradford Royal Infirmary. This will investigate the effectiveness of the LGS test in correctly predicting which patients referred by their GPs with suspected colorectal cancer would, or would not, benefit from a colonoscopy – currently the preferred investigation method.

Source: University of Bradford

Jul 30 2014

A new study by researchers in the Department of Psychology at the University of York shows that it is possible to accurately predict first impressions using measurements of physical features in everyday images of faces, such as those found on social media.

When we look at a picture of a face we rapidly form judgements about a person’s character, for example whether they are friendly, trustworthy or competent. Even though it is not clear how accurate they are, these first impressions can influence our subsequent behaviour (for example, judgements of competence based on facial images can predict election results). The impressions we create through images of our faces (“avatars” or “selfies”) are becoming more and more important in a world where we increasingly get to know one another online rather than in the flesh.

Previous research has shown that many different judgements can be boiled down to three distinct “dimensions”: approachability (do they want to help or harm me?), dominance (can they help or harm me?) and youthful-attractiveness (perhaps representing whether they’d be a good romantic partner – or a rival!).

To investigate the basis for these judgements the research team took ordinary photographs from the web and analyzed physical features of the faces to develop a model that could accurately predict first impressions. Each of 1,000 faces was described in terms of 65 different features such as “eye height”, “eyebrow width” and so on. By combining these measures the model could explain more than half of the variation in human raters’ social judgements of the same faces.
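A linear model of this kind maps each face’s feature measurements to a predicted trait rating via least squares. A minimal one-feature sketch with made-up data (the feature name and ratings are hypothetical; the study combined 65 measures per face):

```python
# Least-squares sketch: predict an "approachability" rating from one
# hypothetical face measurement. Synthetic data; the one-feature case
# keeps the normal equations trivial to solve directly.
faces = [  # (mouth_curvature, approachability rating)
    (0.1, 2.0), (0.4, 3.1), (0.6, 4.0), (0.9, 5.2),
]

def fit_line(points):
    """Ordinary least squares for y = w0 + w1 * x."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    w1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    w0 = (sy - w1 * sx) / n
    return w0, w1

w0, w1 = fit_line(faces)
print(f"rating ~= {w0:.2f} + {w1:.2f} * mouth_curvature")
```

With 65 features the same idea becomes a multiple regression, and the “more than half of the variation” figure corresponds to the model’s explained variance across the 1,000 faces.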

Reversing the process it was also possible to create new cartoon-like faces that produced predictable first impressions in a new set of judges. These images also illustrate the features that are associated with particular social judgements.

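"Reversing the process" can be illustrated with the same kind of linear model: given fitted weights, solve for a feature vector that the model maps onto a desired rating. The weights below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fitted weights: one per facial measurement.
w = rng.normal(size=65)
b = 0.0                                   # intercept

def features_for_rating(target, w, b):
    """Minimum-norm feature vector whose predicted rating equals target.

    Instead of predicting a judgement from measurements, this picks
    measurements that the linear model maps onto a desired judgement,
    analogous to generating a face for a target first impression.
    """
    return w * (target - b) / (w @ w)

x = features_for_rating(2.5, w, b)
print(round(float(x @ w + b), 6))         # the synthetic face scores ~2.5
```

Any feature vector of the form `x + v` with `v` orthogonal to `w` predicts the same rating, which is one way to see why many visually different faces can produce the same first impression.
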
The study, published today in Proceedings of the National Academy of Sciences (PNAS), shows how important faces and specific images of faces can be in creating a favourable or unfavourable first impression. It provides a scientific insight into the processes that underlie these judgements and perhaps into the instinctive expertise of those (such as casting directors, portrait photographers, picture editors and animators) who create and manipulate these impressions professionally.

Richard Vernon, a PhD student who was part of the research team, said: “Showing that even supposedly arbitrary features in a face can influence people’s perceptions suggests that careful choice of a photo could make (or break) others’ first impressions of you.”

Fellow PhD student, Clare Sutherland, said: “We make first impressions of others so intuitively that it seems effortless – I think it’s fascinating that we can pin this down with scientific models. I’m now looking at how these first impressions might change depending on different cultural or gender groups of perceivers or faces.”

Professor Andy Young, of the Department of Psychology at York, said: “Showing how these first impressions can be captured from very variable images of faces offers insight into how our brains achieve this seemingly remarkable perceptual feat.”

Dr Tom Hartley, who led the research with Professor Young, added: “In everyday life I am not conscious of the way faces and pictures of faces are influencing the way I interact with people. Whether in ‘real life’ or online, it feels as if a person’s character is something I can just sense. These results show how heavily these impressions are influenced by visual features of the face – it’s quite an eye-opener!”

Source: University of York 

Jul 302014
 

Our brain’s response to the sight of food appears to be driven more by how low our blood sugar level is at the moment than by our upbringing or genetics, researchers said at the annual meeting of the Society for the Study of Ingestive Behavior. “The findings suggest our brains have a way to override our genetic inheritance, upbringing and habits to respond to our immediate nutritional needs,” said Dr. Ellen Schur, associate professor of medicine at the University of Washington.

In the study, Schur and UW Medicine colleagues at Harborview Medical Center used brain scans to compare how appetite centers in the brains of identical twins responded to images of high- and low-calorie foods. The scans, called functional magnetic resonance imaging (fMRI), detect differences in activity across brain centers by measuring changes in blood flow.

Studies of identical twins have shown that genetics and upbringing play a major role in a person’s body weight regulation. In this study, the researchers wanted to determine what role inherited similarities in brain function play in appetite. They hypothesized that, because identical twins have nearly identical genetic inheritance, their brain appetite centers would react similarly when the twins were shown images of high- and low-calorie foods. To test this hypothesis, the researchers enrolled 21 pairs of identical (monozygotic) twins who had been raised together. The twins were first fed a standardized breakfast, and 3.5 hours later they underwent a baseline fMRI brain scan.

After the first scan, they were fed a filling meal of macaroni and cheese, calibrated to satiate their appetites. The twins then had a second fMRI during which they were shown photographs of non-fattening foods, such as fruits and vegetables, and fattening foods, such as pizza and french fries.

Changes in blood flow measured by the fMRI were used to assess how the activity of key appetite-regulating centers in their brains changed in response to the images. Afterwards, the twins were invited to eat whatever they liked from a buffet, and how much they ate was recorded.

At regular intervals during the experiment the twins were asked to rate their feelings of hunger, fullness and satisfaction, using a standardized scale, and blood samples were taken to measure blood glucose levels and levels of regulatory hormones, such as insulin, leptin and ghrelin.

The researchers found the twin pairs gave similar responses when asked to rate their appetite before and after meals, had similar hormonal responses, and even ate similar amounts from the buffet — findings that suggest these responses were influenced by their shared upbringing and genetics. That did not seem to be the case with brain activation in response to the food images, however: activation tended to be greater in the twin with the lower blood glucose level.

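The similarity of co-twins' responses can be quantified with a within-pair correlation. The sketch below uses simulated numbers (21 pairs, matching the study's sample size, but entirely invented values) purely to illustrate the computation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data for 21 identical twin pairs: each co-twin's post-meal
# hunger rating is modelled as a shared pair-level component (genes and
# upbringing) plus a smaller individual component.
n_pairs = 21
shared = rng.normal(size=n_pairs)                  # common to both twins
hunger_a = shared + 0.3 * rng.normal(size=n_pairs) # twin A's ratings
hunger_b = shared + 0.3 * rng.normal(size=n_pairs) # twin B's ratings

# Within-pair similarity: correlation between co-twins' ratings.
r = np.corrcoef(hunger_a, hunger_b)[0, 1]
print(f"within-pair correlation of hunger ratings: {r:.2f}")
```

When the shared component dominates, as above, the within-pair correlation is high, which is the pattern the researchers report for appetite ratings, hormones and buffet intake; for brain activation, by contrast, the within-pair difference tracked blood glucose rather than the shared component.
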
The findings suggest that while genetics and upbringing play a big role in how much we weigh and how much we normally eat, our immediate response to food in the environment is driven by our body’s need for nutrition at the time, Schur said. “Just looking at pictures of high-calorie foods when we are hungry strongly engages parts of the brain that motivate us to eat.” The study’s findings might help explain why eating regular meals helps people keep their weight under control, she said.

Source: Society for the Study of Ingestive Behavior
