
Tiny changes in Parkinson’s protein can have “dramatic” impact on processes that lead to the disease


In a new study, a team of academics at the Centre for Misfolding Diseases, in the Department of Chemistry at the University of Cambridge, shows that tiny changes in the amino acid sequence of the protein alpha-synuclein can have a dramatic effect on the microscopic processes that lead to its aggregation in the brain, which can eventually result in a diagnosis of Parkinson’s Disease.

Alpha-synuclein is a protein made up of 140 amino acids, and under normal circumstances plays an important part in helping with the smooth flow of chemical signals in the brain.

Parkinson’s Disease is thought to arise because, for reasons researchers still do not fully understand, the same protein sometimes malfunctions. Instead of adopting the specific structural form needed to do its job, it misfolds and begins to cluster, creating toxic, thread-like structures known as amyloid fibrils. In the case of Parkinson’s Disease, these protein deposits are referred to as Lewy bodies.

The new study examined mutated forms of alpha-synuclein which have been found in people from families with a history of Parkinson’s Disease. In all cases, these mutations involved just one change to the protein’s amino acid sequence.

Although the differences in the sequence are small, the researchers found that they can have a profound effect on how quickly or slowly fibrils start to form. They also found that the mutations strongly influence a process called “secondary nucleation”, in which new fibrils are formed, in an auto-catalytic manner, at the surface of existing ones and thus enable the disease to spread.

The study stresses that these findings do not explain why humans get the disease. Parkinson’s Disease does not always emerge as a result of the mutations and has multiple, complex causes, which are not fully understood.

Patrick Flagmeier, a PhD student at St John’s College, University of Cambridge, and the study’s lead author, said: “As a finding, it helps us to understand fundamental things about the system by which this disease emerges. In the end, if we can understand all of this better, that can help us to develop therapeutic strategies to confront it. Our hope is that this study will contribute to the global effort towards comprehending why people with these mutations get the disease more frequently, or at a younger age.”

Although people who do not have mutated forms of alpha-synuclein can still develop Parkinson’s Disease, the five mutations studied by the research team were already known as “familial” variants – meaning that they recur in families where the disease has emerged, and seem to increase the likelihood of its onset.

What was not clear, until now, is why they have this effect. “We wanted to know how these specific changes in the protein’s sequence influence its behaviour as it aggregates into fibrils,” Flagmeier said.  

To understand this, the researchers conducted lab tests in which they added each of the five mutated forms of alpha-synuclein, as well as a standard version of the protein, to samples simulating the initiation of fibril formation, their growth and their proliferation.

The first round of tests examined the initiation of aggregation, using artificial samples recreating conditions in which misfolded alpha-synuclein attaches itself to small structures that are present inside brain cells called lipid vesicles, and then begins to cluster.

The researchers then tested how the different versions of the protein influence the ability of pre-formed fibrils to extend and grow. Finally, they tested the impact of mutated proteins on secondary nucleation, in which, under specific conditions, the fibrils can multiply and start to spread.

Overall, the tests revealed that while the mutated forms of alpha-synuclein do not notably influence fibril growth, they have a dramatic effect on both the initial formation of the fibrils and their secondary nucleation. Some of the mutated forms of the protein made these processes considerably faster, while others made them almost “undetectably slow”, according to the researchers’ report.
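The microscopic processes measured here are commonly summarised with simple rate equations that track the number and mass of fibrils as free monomer is consumed by primary nucleation, elongation and secondary nucleation. The sketch below is a minimal, illustrative version of such a model in Python; the rate constants, reaction orders and concentrations are hypothetical placeholders, not values from the study. It simply shows how varying the secondary nucleation rate constant alone can shift the time to half-completion by orders of magnitude, the scale of effect the researchers attribute to single amino acid changes.

```python
# Minimal sketch of nucleation-elongation-secondary-nucleation rate equations
# often used to describe amyloid aggregation kinetics. All parameters are
# illustrative placeholders, NOT values from the study.
import numpy as np
from scipy.integrate import solve_ivp

m_tot  = 50e-6   # total monomer concentration (M), hypothetical
k_n    = 1e-4    # primary nucleation rate constant, hypothetical
k_plus = 1e4     # elongation rate constant (M^-1 s^-1), hypothetical
n_c    = 2       # primary nucleation reaction order, hypothetical
n_2    = 2       # secondary nucleation reaction order, hypothetical

def rates(t, y, k_2):
    P, M = y                                  # fibril number and fibril mass
    m = max(m_tot - M, 0.0)                   # free monomer remaining
    dP = k_n * m**n_c + k_2 * m**n_2 * M      # primary + secondary nucleation
    dM = 2 * k_plus * m * P                   # elongation at both fibril ends
    return [dP, dM]

t_end = 5e4  # seconds
for k_2 in (1e2, 1e4, 1e6):                   # vary secondary nucleation only
    sol = solve_ivp(rates, (0, t_end), [0.0, 0.0], args=(k_2,), max_step=t_end / 2000)
    idx = np.searchsorted(sol.y[1], m_tot / 2)
    if idx < len(sol.t):
        print(f"k_2 = {k_2:.0e}: half of the monomer aggregated by ~{sol.t[idx]:.0f} s")
    else:
        print(f"k_2 = {k_2:.0e}: still below half-completion at {t_end:.0f} s")
```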

“We have only recently discovered the autocatalytic amplification process of alpha-synuclein fibrils, and the results of the present study will help us to understand in much more detail the mechanism behind this process, and in what ways it differs from the initial formation of aggregates,” said Dr Alexander Buell, one of the senior authors on the study.

Why the mutations have this impact remains unclear, but the study opens the door to understanding this in detail by identifying, for the first time, that they have such a dramatic impact on very particular stages of the process.

Dr. Céline Galvagnion, another of the senior authors on the study, said: “This study quantitatively correlates individual changes in the amino acid sequence of alpha-synuclein with its tendency to aggregate. However, the effect of these mutations on other parameters such as the loss of the protein’s function and the efficiency of clearance of alpha-synuclein needs to be taken into account to fully understand the link between the familial mutations of alpha-synuclein and the onset of Parkinson’s Disease.”

“The effects we observed were changes of several orders of magnitude and it was unexpected to observe such dramatic effects from single-point mutations,” Flagmeier said. “It seems that these single-point mutations in the sequence of alpha-synuclein play an important role in influencing particular microscopic steps in the aggregation process that may lead to Parkinson’s Disease.”

The full study, which also involves Professors Chris Dobson and Tuomas Knowles, is published in the journal Proceedings of the National Academy of Sciences.

Reference:

Flagmeier, P. et al. Mutations associated with familial Parkinson’s disease alter the initiation and amplification steps of α-synuclein aggregation. PNAS (2016). DOI: 10.1073/pnas.1604645113

Specific mutations in the protein associated with Parkinson’s Disease, in which just one of its 140 building blocks is altered, can make a dramatic difference to processes which may lead to the condition’s onset, researchers have found.

Image of “amyloid fibrils”: thread-like structures which form when the protein alpha-synuclein aggregates. Plaques (protein deposits) consisting of this protein have been found in the brains of Parkinson’s Disease patients and linked to the disease.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Why mole rats are more flexible than we previously thought


Mole rats, including the naked mole rat, live in underground colonies. The majority of rodents in the colonies are ‘workers’, with only one female (the ‘queen’) and one male responsible for breeding. All individuals cooperate by digging large underground tunnel systems to forage for food, and if a large food source is found, it is shared with the entire colony. ‘Queens’ and reproductive males remain in this role for their entire life after they have achieved this position. When a ‘queen’ dies, the strongest and largest helper is probably the prime candidate for inheriting the breeding position.

Early studies suggested that non-reproducing mole rats can be divided into non-workers, infrequent workers and frequent workers, and that most individuals stay members of distinct castes for their entire lives. Individual mole rats would focus on a particular task, such as digging, nest building or colony defence, throughout their lives.

Now, however, in a study published in Proceedings of the National Academy of Sciences, researchers from the Department of Zoology at the University of Cambridge have shown that in Damaraland mole rats, the contributions of individuals to cooperative activities change with age and that individual differences in behaviour that appeared to be a consequence of differences in caste are, in fact, age-related changes in behaviour. Whether variation in behaviour between naked mole rats is also a consequence of similar age-related changes is not known – but this seems likely.

Dr Markus Zöttl, first author of the study, explains: “In some ants, aphids and termites, individuals are born into castes that fulfil certain roles, such as soldiers or workers. Initially, everyone thought that this was only found in social invertebrates, like ants and bees, but in the eighties, the discovery of the social behaviour of mole rats challenged this view. Social mole rats were thought to be unique among vertebrates, in that they also had castes. To understand this fully, what we needed was long-term data on many mole rats over extended periods of their lives.”

To study mole rat development in more detail, a team at Cambridge led by Professor Tim Clutton-Brock from the Department of Zoology built a laboratory in the Kalahari Desert, where Damaraland mole rats are native, and established multiple colonies of mole rats in artificial tunnel systems. Over three years, they followed the lives of several hundred individuals to document how the behaviour of individuals changes as they age. All individuals were weighed and observed regularly to document their behavioural changes.

The researchers found that individual mole rats play different roles as they grow older. Rather than being specialists, mole rats are generalists that participate in more or fewer community duties at different stages of their lives. Instead of behaving like ants or termites, where individuals are members of a caste and specialise in certain activities, all mole rats are involved in a range of different activities, and their contributions to cooperative activities increase with age.

“As Damaraland mole rats do not have castes, this may mean that castes are only found in social invertebrates and have not evolved in any vertebrates,” adds Dr Zöttl. “Mole rat social organisation probably has more in common with the societies of other cooperative mammals, such as meerkats and wild dogs, than with those of social insects.”

The research was funded by the European Research Council.

One of the most interesting facts about mole rats – that, as with ants and termites, individuals specialise in particular tasks throughout their lives – turns out to be wrong. Instead, a new study led by the University of Cambridge shows that individuals perform different roles at different ages and that age rather than caste membership accounts for contrasts in their behaviour.


New model could help improve prediction of outbreaks of Ebola and Lassa fever


Many of the major new outbreaks of disease, particularly in Africa, are so-called zoonotic infections, diseases that are transmitted to humans from animals. The Ebola virus, for example, which recently killed over 11,000 people across Africa, was most likely transmitted to humans from fruit bats.

Modelling how outbreaks arise and whether they will take hold or quickly die out has proved challenging, with two factors in particular being difficult to quantify. The first is ‘spillover’, where the pathogen – a virus or parasite, for example – passes from an animal to a person. This can be through direct transmission, for example by being bitten or by eating ‘bush meat’ (wild animals such as fruit bats or monkeys that are caught and consumed), or indirectly, such as through contact with faeces or disease-carrying mosquitoes.

In many cases, a spillover will go no further. When a human is bitten by a rabid dog, they may become infected, but as rabies cannot be transmitted from human to human, the disease hits a dead end.

However, in some cases the infected person goes on to infect other humans. This is the case for diseases such as Ebola, Lassa fever (spread from rodents) and Crimean-Congo haemorrhagic fever (spread from ticks). But in many cases, unless there are additional spillover events, the disease eventually fades out. This is referred to as a ‘stuttering chain’, and even though such diseases are transmitted from human to human, they are still considered zoonotic infections.

Diseases such as HIV, however, which almost certainly began as a spillover from chimpanzees, are no longer considered to be zoonotic as the chain of transmission from humans to other humans is continuous and no longer relies on spillover to sustain transmission.

“Modelling spillovers is a real challenge,” says Dr Gianni Lo Iacono from the Department of Veterinary Medicine at the University of Cambridge. “We don’t have particularly good data on wildlife numbers, such as fruit bats in Sierra Leone, and only a crude idea of their geographic distribution and how many are infected. Even in the UK, we don't really know how many deer we have, which would be really useful to estimate the risk of Lyme disease.”

In addition, measuring the likelihood of contact with the infected animals is also extremely difficult as it involves understanding human and animal behaviour.

Stuttering transmission, too, can be difficult to model, says Dr Lo Iacono. “In the case of Lassa fever, people who catch the disease from animals show the same symptoms as those who get it from humans. So is this case a spillover or part of a human-to-human chain of transmission? And if members of the same family get the disease, have they caught it from a family member or from the same pot of contaminated rice?

“Sometimes you can be lucky and work this out, as we did in a previous study, but this was possible because information of outbreaks that were known to be pure human-to-human chains was, unusually, available. But we need more general methods.”

Dr Lo Iacono and colleagues have developed the most coherent and potentially most accurate mathematical model to date for zoonotic diseases, which incorporates spillover and stuttering transmission.

“The pathogen does not care if it jumped from an animal or from another human; the only difference is that in a stuttering transmission an infected person can trigger other chains of human infections. A general, realistic model should capture this mechanism,” adds Dr Lo Iacono.
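One simple way to picture how spillover and stuttering human-to-human chains combine – purely as an illustration, not the framework published in the paper – is a branching process: spillover events from animals seed chains of human cases, and each case infects a Poisson-distributed number of further people with a reproduction number R below one, so every chain eventually stutters out unless it is re-seeded by another spillover.

```python
# Toy branching-process illustration of zoonotic "stuttering chains".
# Spillovers from animals seed chains; each human case infects a Poisson(R)
# number of further humans with R < 1, so chains fade out unless re-seeded.
# This is a sketch, not the model published in the paper.
import numpy as np

rng = np.random.default_rng(1)

def chain_size(R):
    """Total human cases in one stuttering chain seeded by a single spillover."""
    cases = active = 1
    while active:
        new = int(rng.poisson(R, size=active).sum())
        cases += new
        active = new
    return cases

spillovers_per_year = 20                 # hypothetical spillover rate
for R in (0.2, 0.6, 0.9):                # human-to-human reproduction number
    total = sum(chain_size(R) for _ in range(spillovers_per_year))
    print(f"R = {R}: {spillovers_per_year} spillovers led to {total} human cases")
```

Once R rises above one, a single chain can sustain itself without further spillover, which is the point at which a disease such as HIV stops being described as zoonotic.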

Details of the model, including a demonstration applying the framework to Lassa fever, are published today in the open access journal PLOS Neglected Tropical Diseases.

“By modelling potential outbreaks more accurately, we can help inform public health messages,” explains Professor James Wood, Head of the Department of Veterinary Medicine, and senior author. “If you know that most cases of an outbreak of Lassa fever come from spillovers, then the message might be ‘kill the rats’, but if it is now mainly spreading between humans, the messages will be around washing your hands or avoiding contact with bodily fluids.”

The beauty of the model, say the researchers, is that it is simple to implement, so public health officials and non-mathematicians could easily use it. It also allows for the incorporation of data from different disciplines, factoring in socioeconomic, ecological and environmental factors, for example.

“It’s important to understand if and how these other important factors can increase the impact of stuttering chains,” says Professor Wood. “Ebola has always been a very severe disease but previously confined to small, remote regions. Then suddenly, in the last two years it exploded in West Africa. Why? Was it because social patterns changed? Our model could be used to address such questions better.”

The research informing the paper was carried out as part of the Dynamic Drivers of Disease in Africa Consortium, which was funded by Ecosystem Services for Poverty Alleviation (ESPA).

Reference
Lo Iacono, G et al. A unified framework for the infection dynamics of zoonotic spillover and spread. PLOS Neglected Tropical Diseases; 2 Sept 2016; DOI: 10.1371/journal.pntd.0004957

Potential outbreaks of diseases such as Ebola and Lassa fever may be more accurately predicted thanks to a new mathematical model developed by researchers at the University of Cambridge. This could in turn help inform public health messages to prevent outbreaks spreading more widely.

Ebola in West Africa


New exoplanet think tank will ask the big questions about extra-terrestrial worlds


With funding from The Kavli Foundation, the think tank will bring together some of the major researchers in exoplanetary science – arguably the most exciting field in modern astronomy – for a series of annual meetings to address the biggest questions in this field which humanity could conceivably answer in the next decade.

“We’re really at the frontier in exoplanet research,” said Dr Nikku Madhusudhan of Cambridge’s Institute of Astronomy, who is leading the think tank. “The pace of new discoveries is incredible – it really feels like anything can be discovered any moment in our exploration of extra-terrestrial worlds. By bringing together some of the best minds in this field we aim to consolidate our collective wisdom and address the biggest questions in this field that humanity can ask and answer at this time.”

Tremendous advances have been made in the study of exoplanets since the first such planet was discovered around a sun-like star in 1995 by the Cavendish Laboratory’s Professor Didier Queloz. Just last month, a potentially habitable world was discovered in our own neighbourhood, orbiting Proxima Centauri, the nearest star to the sun.

However, there are still plenty of questions to be answered, such as whether we are capable of detecting signatures of life on other planets within the next ten years, what the best strategies are for finding habitable planets, how diverse planets and their atmospheres are, and how planets form in the first place.

With at least four space missions and numerous large ground-based facilities scheduled to become operational in the next decade, exoplanetary scientists will be able to detect more and more exoplanets, and will also have the ability to conduct detailed studies of their atmospheres, interiors, and formation conditions. At the same time, major developments are expected in all aspects of exoplanetary theory and data interpretation.

In order to make these major advances in the field, new interdisciplinary approaches are required. Additionally, as new scientific questions and areas emerge at an increasingly fast pace, there is a need for a focused forum where emerging questions in frontier areas of the field can be discussed. “Given the exciting advances in exoplanetary science, now is the right time to assess the state of the field and the scientific challenges and opportunities on the horizon,” said Professor Andy Fabian, Director of the Institute of Astronomy at Cambridge.

The think tank will operate in the form of a yearly Exoplanet Symposium series which will be focused on addressing pressing questions in exoplanetary science. One emerging area or theme in exoplanetary science will be chosen each year based on its critical importance to the advancement of the field, relevance to existing or imminent observational facilities, need for an interdisciplinary approach, and/or scope for fundamental breakthroughs.

About 30 experts in the field from around the world will discuss outstanding questions, new pathways, interdisciplinary synergies, and strategic actions that could benefit the exoplanet research community.

The inaugural symposium, “Kavli ExoFrontiers 2016”, is being held this week in Cambridge. The goal of this first symposium is to bring together experts from different areas of exoplanetary science to share their visions about the most pressing questions and future outlook of their respective areas. These visions will help provide both a broad outlook of the field and identify the ten most important questions in the field that could be addressed within the next decade. “We hope the think tank will provide a platform for new breakthroughs in the field through interdisciplinary and international efforts while bringing the most important scientific questions of our time to the fore,” said Madhusudhan. “We are in the golden age of exoplanetary science.”

More information about the Kavli ExoFrontiers 2016 Symposium is available at: www.ast.cam.ac.uk/meetings/2016/kavli.exofrontiers.2016.symposium

An international exoplanet ‘think tank’ is meeting this week in Cambridge to deliberate on the ten most important questions that humanity could answer in the next decade about planets outside our solar system.

Artist’s impression of the ultracool dwarf star TRAPPIST-1 from the surface of one of its planets


Artificial pancreas trial in young children with diabetes receives €4.6 million grant from European Commission


Type 1 diabetes is one of the most common chronic diseases in children; around one in 4,000 children under 14 years of age is diagnosed with the disease each year in the UK. The disease causes the pancreas to stop producing sufficient insulin to regulate blood sugar (glucose) levels, and poor glucose control can lead to complications including eye, heart and kidney disease. Episodes of very low glucose levels can cause serious complications and may be life threatening.

People affected by the condition have to manage it through long-term treatment. This usually involves regular insulin injections – in some cases, several times a day. However, a team at the University of Cambridge and Cambridge University Hospitals hopes to replace these treatments with an artificial pancreas: a small, portable medical device designed to carry out the function of a healthy pancreas in controlling blood glucose levels, using digital technology to automate insulin delivery. The system is worn externally on the body and is made up of three functional components: continuous glucose monitoring, a computer algorithm to calculate the insulin dose, and an insulin pump.
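To make the closed loop concrete, the sketch below shows the bare shape of such a control cycle in Python: read the sensor, compute a dose, command the pump, repeat. It is a deliberately naive proportional controller with made-up gains and safety limits; the algorithm used in the trial is far more sophisticated and clinically validated, and nothing here reflects its actual logic or parameters.

```python
# Toy sketch of one artificial-pancreas control cycle: read glucose, compute
# an insulin dose, command the pump. All numbers and interfaces are
# hypothetical placeholders; the trial's algorithm is far more sophisticated.
from dataclasses import dataclass

TARGET_MMOL_L   = 6.0    # hypothetical target glucose (mmol/L)
GAIN_U_PER_MMOL = 0.05   # hypothetical proportional gain (insulin units per mmol/L)
MAX_BOLUS_U     = 0.5    # hypothetical per-cycle safety cap (insulin units)

@dataclass
class Reading:
    glucose_mmol_l: float        # current sensor glucose
    trend_mmol_l_per_min: float  # rate of change from the sensor

def compute_dose(reading: Reading) -> float:
    """Naive proportional dose: correct only when above target and not falling."""
    error = reading.glucose_mmol_l - TARGET_MMOL_L
    if error <= 0 or reading.trend_mmol_l_per_min < -0.1:
        return 0.0               # never dose at/below target or when glucose is dropping
    return min(GAIN_U_PER_MMOL * error, MAX_BOLUS_U)

# One control cycle, repeated every few minutes by the device:
for r in (Reading(9.5, 0.05), Reading(6.1, -0.2), Reading(4.8, 0.0)):
    print(f"glucose {r.glucose_mmol_l} mmol/L -> bolus {compute_dose(r):.2f} U")
```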

The artificial pancreas promises to transform management of type 1 diabetes. Several trials have already shown that it is effective for use by both school children and adults in the home environment, and last year saw the first natural birth to a mother fitted with an artificial pancreas. However, there has as yet been no research into its use by young children at home.

Now, KidsAP, a collaboration led by the University of Cambridge and involving institutes across Europe and in the US, has received €4.6 million under the European Commission’s Horizon 2020 programme to carry out a trial of the artificial pancreas among children aged one to seven years with type 1 diabetes. Cambridge has received a €1.6 million share of the grant to act as coordinator of the project.

“We’ve already seen that the artificial pancreas can have a very positive effect on people’s lives and now, thanks to funding from the European Commission, we can see whether young children will also see these same benefits,” said Dr Roman Hovorka from the Department of Paediatrics at the University of Cambridge and Addenbrooke’s Hospital, who is leading the project. “At the moment, children have to have frequent insulin injections that are at best inconvenient, but at worst painful. We hope this new technology will eliminate this need.”

Following an initial pilot involving 24 children, the main study will split 94 children into two groups: one will be treated over a year with the artificial pancreas, the other with state-of-the-art insulin pump therapy, already used by some adults and teenagers. The researchers will measure quality of life and investigate the impact of the two approaches on the children’s daily life, as well as looking at which is the more effective, and cost-effective, approach.

“If the artificial pancreas is shown to be more beneficial than insulin pump therapy, then we expect that it will change how type 1 diabetes is managed both nationally and internationally, with a much improved quality of life for young children,” added Professor David Dunger, collaborator on the project.

An international trial to test whether an artificial pancreas can help young children manage their type 1 diabetes will begin next year, thanks to a major grant awarded by the European Commission.


Young people exposed to vaping ads less likely to think occasional smoking is bad for health


Estimates suggest that among children who try smoking, between one third and one half are likely to become regular smokers within two to three years. However, young people are now more likely to experiment with e-cigarettes than they are with tobacco cigarettes. For example, a 2014 study found that 22% of children aged 11-15 in England had experimented with e-cigarettes, compared to 18% for tobacco cigarettes.

There is concern that the increasing exposure of children to e-cigarette adverts could be contributing to high rates of experimentation; in the US, adolescents’ exposure to e-cigarette adverts on TV more than trebled between 2011 and 2013. E-cigarette brands often market themselves as helping people quit smoking and as healthier and cheaper alternatives to tobacco cigarettes.

In this study from researchers at the Behaviour and Health Research Unit, University of Cambridge, and the University of North Carolina Gillings School of Global Public Health, and published today in the journal Tobacco Control, more than 400 English children aged 11-16 who had never smoked or ‘vaped’ previously were recruited and randomly allocated to one of three groups. One group was shown ten adverts that depicted e-cigarettes as glamorous, a second group was shown ten adverts that portrayed them as healthy, and a third control group was shown no adverts.

The children were then asked a series of questions aimed at determining their attitudes towards smoking and vaping. Children shown the adverts were no more or less likely than the control group to perceive tobacco smoking as appealing, and all three groups understood that smoking more than ten cigarettes a day was harmful. However, both groups of children exposed to the e-cigarette adverts, whether glamorous or healthy, were less likely to believe that smoking one or two tobacco cigarettes occasionally was harmful.

Dr Milica Vasiljevic from the Department of Public Health and Primary Care at the University of Cambridge says: “While we can be optimistic that the adverts don’t seem to make tobacco smoking more appealing to young people, they do appear to make occasional smoking seem less harmful. This is worrying, as we know that even occasional tobacco smoking is bad for your health, and young people who smoke occasionally believe they are somehow immune to its effects and do not feel the need to quit.”

The group of children that were shown adverts depicting e-cigarettes as glamorous also believed e-cigarette vaping to be more prevalent than did the other two groups.

Professor Theresa Marteau, Director of the Behaviour and Health Research Unit and a Fellow of Christ’s College, University of Cambridge, adds: “E-cigarette marketing across Europe is regulated under the new EU Tobacco Products Directive, which came into effect on the 20th May this year. The Directive limits the exposure of children to TV and newspaper e-cigarette adverts. However, it does not cover advertising in the form of posters, leaflets, and adverts at point of sale, nor does it cover the content of marketing materials depicting e-cigarettes as glamorous or healthy. The findings from our study suggest these omissions could present a threat to the health of children.”

The study was funded by the Department of Health.

Reference
Petrescu, D, Vasiljevic, M, Pepper, JK, Ribisl, KM, Marteau, TM. What is the impact of e-cigarette adverts on children’s perceptions of tobacco smoking? An experimental study. Tobacco Control; 6 Sept 2016; DOI: 10.1136/tobaccocontrol-2016-052940

Exposure to advertisements for e-cigarettes may decrease the perceived health risks of occasional tobacco smoking, suggests new research from the University of Cambridge, prompting concern that this may lead more young people to experiment with smoking.

E-cigarette


Oesophageal cancer treatments could be tailor-made for individual patients, study finds


The findings, published in Nature Genetics on Monday, could help find drugs that target specific weaknesses in each subtype of the disease, potentially making treatment more effective and boosting survival.

Researchers looked at the complete genetic make-up of 129 oesophageal cancers and were able to subdivide the disease into three distinct types based on patterns, called signatures, detected in the DNA of the cancer cells.

The first subtype they found had faults in its DNA repair pathways. Damage to these pathways is known to increase the risk of breast, ovarian and prostate cancers. Patients with this subtype may benefit from a new family of drugs called PARP inhibitors, which kill cancer cells by exploiting this weakness in their ability to repair DNA.

The second subtype had a higher number of DNA mistakes and more immune cells in the tumours, which suggests these patients could benefit from immunotherapy drugs already showing great promise in a number of cancer types such as skin cancer.

The final subtype had a DNA signature that is mainly associated with the cell ageing process and means this group might benefit from drugs targeting proteins on the surface of the cancer cells which make cells divide.

Professor Rebecca Fitzgerald, lead researcher based at the MRC Cancer Unit at the University of Cambridge, said: “Our study suggests we could make changes to the way we treat oesophageal cancer.

"Targeted treatments for the disease have so far not been successful, and this is mostly down to the lack of ways to determine which patients might benefit from different treatments. These new findings give us a greater understanding of the DNA signatures that underpin different subtypes of the disease and means we could better tailor treatment.

“The next step is to test this approach in a clinical trial. The trial would use a DNA test to categorise patients into one of the three groups to determine the best treatments for each group and move away from a one-size-fits-all approach.”

Each year around 8,800 people are diagnosed with oesophageal cancer in the UK, with just 12 per cent surviving for at least ten years. Cancer Research UK, who along with the Medical Research Council funded the study, has prioritised research into oesophageal cancer to help more people survive the disease by bringing people together, building infrastructure and developing the next generation of research leaders.

Professor Peter Johnson, Cancer Research UK’s chief clinician, said: “Being able to distinguish distinct types of oesophageal cancer is a genuinely new discovery from this work.  For the first time we may be able to identify and test targeted treatments designed to exploit the cancer’s specific weaknesses. Although survival rates from oesophageal cancer have been slowly rising in the last few years they are still far too low, and this research points the way to a completely new way of understanding and tackling the disease.”

The study, funded by Cancer Research UK and the Medical Research Council, is part of the Cancer Research UK-funded International Cancer Genome Consortium.

Reference

Secrier, M. et al. Mutational signatures in esophageal adenocarcinoma reveal etiologically distinct subgroups with therapeutic relevance. Nature Genetics, 2016.

Tailored, targeted treatment for patients with oesophageal cancer could be developed after scientists discovered that the disease can be classified into three different subtypes.

Image of an oesophageal carcinoma


Opinion: Why danger is exciting – but only to some people


It has been the most deadly summer for wingsuit flying to date. But what makes some people want to base jump off a cliff, binge drink to oblivion or hitchhike with strangers while others don’t even enjoy a rollercoaster ride? Is there such a thing as a scaredy-cat gene or a daredevil brain structure? Or is our level of attraction to danger down to how protective our parents were?

Whether our weakness is extreme sports, speeding, drugs or other dangerous behaviours, it is typically a mix of risk and novelty that draws us in. What psychologists call “novelty seeking” is the preference for the unexpected or new. People with this trait are often impulsive and easily bored – but new experiences release a surge of pleasure chemicals in their brains. A rat or human with a preference for novelty will be more likely to do drugs and binge drink.

The concepts of risk and novelty are to some extent linked: a new stimulus is inherently more risky in that any associated consequence is unknown. However, we can dissociate these two in the laboratory.

It’s (always) about dopamine

Dopamine, used by neurons to transmit messages to other neurons, is often described as the brain’s “pleasure chemical”. Dopamine cells lie in the mid-brain, deep in the base of the brain, and send “projections” to brain regions where the dopamine molecule is released – such as those involved in the control of action, cognition and reward. Studies have shown that the dopamine system can be activated by rewarding experiences, such as eating, having sex or taking drugs.

In a study of patients with Parkinson’s disease, who were on drugs that stimulated dopamine receptors used to treat their movement symptoms, 17% developed highly unexpected behavioural addictions to gambling or compulsive sexual, shopping or eating behaviours. These patients also sought out risks more, and showed a preference for novelty on lab tests. So it seems that an active dopamine system can make us take more risks.

A study on anticipating risk showed that expecting a win is associated with an increase in brain activity in dopamine regions, whereas expecting a loss is associated with a decrease in such activity. Both drive us to take risks. Wingsuit flying or rollercoaster riding is motivated by our expectation of reward – a thrill – but wingsuit flying may also be driven by an urge to avoid loss (in this case death). The likelihood of a thrill from base jumping or a rollercoaster is close to 100%. But while the likelihood of death from a rollercoaster ride is close to 0%, the chances of dying from base jumping are considerably higher. The closer to the extremes, 0% or 100%, the more certain the outcome; the closer to 50%, the more uncertain.
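One way to put a number on that intuition about certainty: for a yes/no outcome with probability p, the spread of possible results is p(1 − p), which is zero at 0% or 100% and largest at 50%. The snippet below simply evaluates that expression for a few probabilities; it is an illustration of the general point, not an analysis from any of the studies mentioned.

```python
# Variance of a yes/no outcome with probability p is p * (1 - p):
# zero at the extremes (complete certainty), largest at p = 0.5.
for p in (0.0, 0.05, 0.5, 0.95, 1.0):
    print(f"p = {p:.2f}: outcome variance = {p * (1 - p):.4f}")
```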

Dopamine reward pathways in the human brain. Credit: Oscar Arias-Carrión, Maria Stamelou, Eric Murillo-Rodríguez, Manuel Menéndez-González and Ernst Pöppel, CC BY-SA

 

Many, but not all, studies have found that people with a certain dopamine receptor are more likely to be thrill seeking. This gene variant is also associated with greater responses to unexpected rewards in the brain, making the unexpected thrill more thrilling. Genetic hardwiring might therefore explain the tendency towards base jumping, linking the preference for novelty and also possibly for risk and reward. But how we are brought up also has an impact. And adolescents are known to be more risk taking, partly because their brains are still developing and they are more susceptible to peer pressure.

And, of course, there may be other reasons why we enjoy bungee jumping or binge drinking than an attraction to risk and novelty. For example, this can happen in social situations where there’s peer pressure for us to conform, or if we are feeling down or stressed.

Why are we inconsistent?

But if our genes can influence whether we’re brave or fearful, how come we are so inconsistent in our behaviour? For example, we may sky dive on holiday yet buy travel insurance.

Have we all got an inner piglet? Credit: Wikimedia

 

We act differently based on whether the risk is perceived to gain reward or avoid loss – an effect known as framing. Most of us tend to avoid risky rewards – we’d rather not go sky diving – but in the case of an unlikely event with a high payout such as a lottery ticket, we’re happy to take a risk. We also normally seek risk in order to avoid huge losses. This is affected by how likely it is that the outcome might occur. In the case of an unlikely but possibly very bad outcome, such as the risk of incurring massive debt while hospitalised in a foreign country, we become risk averse and buy travel insurance.
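A small worked example with made-up figures shows the asymmetry being described. Buying a lottery ticket and buying travel insurance both have a slightly negative expected value, yet the first is a risk we seek (a small stake on an unlikely, huge gain) and the second is a risk we pay to remove (a small premium against an unlikely, huge loss). The odds, prizes and prices below are hypothetical.

```python
# Hypothetical figures only, to illustrate the framing effect described above.
lottery_ev   = (1 / 14_000_000) * 5_000_000 - 2   # ~ -£1.64 per £2 ticket
insurance_ev = 0.001 * 40_000 - 50                # ~ -£10 per £50 premium
print(f"Lottery ticket, expected value per play:    £{lottery_ev:.2f}")
print(f"Travel insurance, expected value of buying: £{insurance_ev:.2f}")
```

Both choices lose money on average, yet we gamble in one case and insure in the other, depending on whether the unlikely extreme is framed as a gain or a loss.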

People who enjoy danger or suffer from disorders of addiction have different risk tendencies. Pathological users of illegal drugs, alcohol or food all seek risk in the face of rewards – by going after the high. But those who use illegal drugs are driven by more risky high rewards whereas those that pathologically use alcohol or food are driven by less risky lower rewards.

How likely we are to take risks can also be manipulated. A study in rats showed that risk taking can be reduced by mimicking the dopamine signal providing information about the negative outcomes from previous risky choices – such as a shock to the foot or not receiving food. Risk taking in binge drinkers can also be reduced when they are explicitly exposed to a loss outcome – such as experiencing a loss of money rather than just expecting it. A night in an emergency room may therefore be enough to change their behaviour.

Also, a new and unexpected context can increase risk-taking behaviours, which could explain why we are more likely to take risks on holiday. In a recent study, my colleagues and I showed participants a series of faces – familiar or unknown ones – and asked them to choose between a risky gamble and a safe choice. When shown a new face, subjects were more likely to take the risky gamble. The study showed that those with greater brain activity in the striatum, a region involved in dopamine release, in response to the new face made riskier choices. These findings suggest that novelty increases dopamine release in this area of the brain, which then possibly enhances the expectation of reward.

But being drawn to danger isn’t necessarily a bad thing. Our society needs both risk takers and risk avoiders to function. We need those that push boundaries – to set up camp on Mars or rescue people from fires – and we need those that write the rules and enforce regulations to keep society functioning.

Valerie Voon, Honorary Consultant Neuropsychiatrist and Senior Clinical Research Associate, Department of Psychiatry, University of Cambridge

This article was originally published on The Conversation. Read the original article.

The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.


Valerie Voon (Department of Psychiatry) discusses what makes some people want to base jump off a cliff, while others don’t even enjoy a rollercoaster ride.

BASE Jumping from Sapphire Tower, Istanbul


Massive holes ‘punched’ through a trail of stars likely caused by dark matter


Researchers have detected two massive holes which have been ‘punched’ through a stream of stars just outside the Milky Way, and found that they were likely caused by clumps of dark matter, the invisible substance which holds galaxies together and makes up a quarter of all matter and energy in the universe.

The scientists, from the University of Cambridge, found the holes by studying the distribution of stars in the Milky Way. While the clumps of dark matter that likely made the holes are gigantic in comparison to our Solar System – with a mass between one million and 100 million times that of the Sun – they are actually the tiniest clumps of dark matter detected to date.

The results, which have been submitted to the Monthly Notices of the Royal Astronomical Society, could help researchers understand the properties of dark matter, by inferring what type of particle this mysterious substance could be made of. According to their calculations and simulations, dark matter is likely made up of particles more massive and more sluggish than previously thought, although such a particle has yet to be discovered.

“While we do not yet understand what dark matter is formed of, we know that it is everywhere,” said Dr Denis Erkal from Cambridge’s Institute of Astronomy, the paper’s lead author. “It permeates the universe and acts as scaffolding around which astrophysical objects made of ordinary matter – such as galaxies – are assembled.”

Current theory on how the universe was formed predicts that many of these dark matter building blocks have been left unused, and there are possibly tens of thousands of small clumps of dark matter swarming in and around the Milky Way. These small clumps, known as dark matter sub-haloes, are completely dark, and don’t contain any stars, gas or dust.

Dark matter cannot be directly measured, and so its existence is usually inferred by the gravitational pull it exerts on other objects, such as by observing the movement of stars in a galaxy. But since sub-haloes don’t contain any ordinary matter, researchers need to develop alternative techniques in order to observe them.

The technique the Cambridge researchers developed was to essentially look for giant holes punched through a stream of stars. These streams are the remnants of small satellites, either dwarf galaxies or globular clusters, which were once in orbit around our own galaxy, but the strong tidal forces of the Milky Way have torn them apart. The remnants of these former satellites are often stretched out into long and narrow tails of stars, known as stellar streams.

“Stellar streams are actually simple and fragile structures,” said co-author Dr Sergey Koposov. “The stars in a stellar stream closely follow one another since their orbits all started from the same place. But they don’t actually feel each other’s presence, and so the apparent coherence of the stream can be fractured if a massive body passes nearby. If a dark matter sub-halo passes through a stellar stream, the result will be a gap in the stream which is proportional to the mass of the body that created it.”
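The size of the disturbance a passing clump leaves behind can be estimated with the standard impulse approximation: a perturber of mass M passing a star at impact parameter b with relative speed v delivers a velocity kick of roughly Δv ≈ 2GM/(bv). The sketch below plugs in illustrative numbers (the impact parameter and speed are hypothetical, not taken from the paper) for the sub-halo mass range quoted above, showing how the kick, and hence the gap carved in the stream, scales with the perturber’s mass.

```python
# Impulse-approximation estimate of the velocity kick a stream star receives
# from a passing dark matter sub-halo: dv ~ 2 * G * M / (b * v).
# Impact parameter and relative speed are hypothetical, not from the paper.
G     = 4.301e-6   # gravitational constant in kpc * (km/s)^2 / Msun
b     = 0.5        # impact parameter in kpc, hypothetical
v_rel = 200.0      # relative speed in km/s, hypothetical

for mass_msun in (1e6, 1e7, 1e8):   # the sub-halo mass range quoted in the article
    dv = 2 * G * mass_msun / (b * v_rel)
    print(f"M = {mass_msun:.0e} Msun -> velocity kick ~ {dv:.3f} km/s")
```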

The researchers used data from the stellar stream of the Palomar 5 globular cluster to look for evidence of a sub-halo fly-by. Using a new modelling technique, they were able to observe the stream with greater precision than ever before. What they found was a pair of wrinkled tidal tails, with two gaps of different widths.

By running thousands of computer simulations, the researchers determined that the gaps were consistent with a fly-by of a dark matter sub-halo. If confirmed, these would be the smallest dark matter clumps detected to date.

“If dark matter can exist in clumps smaller than the smallest dwarf galaxy, then it also tells us something about the nature of the particles which dark matter is made of – namely that it must be made of very massive particles,” said co-author Dr Vasily Belokurov. “This would be a breakthrough in our understanding of dark matter.”

The reason that researchers can make this connection is that the mass of the smallest clump of dark matter is closely linked to the mass of the yet unknown particle that dark matter is composed of. More precisely, the smaller the clumps of dark matter, the higher the mass of the particle.

Since we do not yet know what dark matter is made of, the simplest way to characterise the particles is to assign them a particular energy or mass. If the particles are very light, then they can move and disperse into very large clumps. But if the particles are very massive, then they can’t move very fast, causing them to condense – in the first instance – into very small clumps.

“Mass is related to how fast these particles can move, and how fast they can move tells you about their size,” said Belokurov. “So that’s why it’s so interesting to detect very small clumps of dark matter, because it tells you that the dark matter particle itself must be very massive.”

“If our technique works as predicted, in the near future we will be able to use it to discover even smaller clumps of dark matter,” said Erkal. “It’s like putting dark matter goggles on and seeing thousands of dark clumps each more massive than a million suns whizzing around.”

Reference:
Denis Erkal et al. ‘A sharper view of Pal 5’s tails: Discovery of stream perturbations with a novel non-parametric technique.’ arXiv:1609.01282

The discovery of two massive holes punched through a stream of stars could help answer questions about the nature of dark matter, the mysterious substance holding galaxies together.

Artist's impression of dark matter clumps around a Milky Way-like galaxy


“Opening the skull” of patients after head injury reduces risk of death from brain swelling


Traumatic brain injury is a serious injury to the brain, often caused by road traffic accidents, assaults or falls. It can lead to dangerous swelling in the brain which, in turn, can lead to brain damage or even death.

A team led by researchers at the Department of Clinical Neurosciences, University of Cambridge, and based at Addenbrooke's Hospital, recruited over 400 traumatic brain injury patients over a ten-year period from the UK and another 19 countries worldwide. They then randomly assigned the patients to one of two groups for treatment – craniectomy or medical management.

In research published this week in the New England Journal of Medicine, the researchers report that six months after the head injury, just over one in four patients (27%) who received a craniectomy had died compared to just under a half (49%) of patients who received medical management. However, the picture was complicated as patients who survived after a craniectomy were more likely to be dependent on others for care (30.4% compared to 16.5%).

Further follow-up showed that patients who survived following a craniectomy continued improving from six to 12 months after injury. As a result, at 12 months, nearly half of craniectomy patients were at least independent at home (45.4%), as compared with one-third of patients in the medical group (32.4%).
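Turning those percentages into the measures clinicians usually quote is straightforward arithmetic; the sketch below uses the six-month mortality figures reported above, purely as a back-of-the-envelope illustration rather than the trial’s formal statistical analysis.

```python
# Back-of-the-envelope arithmetic on the six-month mortality figures above
# (27% after craniectomy vs 49% with medical management); this is not the
# trial's formal statistical analysis.
craniectomy_mortality = 0.27
medical_mortality     = 0.49

relative_risk           = craniectomy_mortality / medical_mortality   # ~0.55
absolute_risk_reduction = medical_mortality - craniectomy_mortality   # 22 percentage points
number_needed_to_treat  = 1 / absolute_risk_reduction                 # ~4.5 operations per death averted

print(f"Relative risk of death:  {relative_risk:.2f}")
print(f"Absolute risk reduction: {absolute_risk_reduction:.0%}")
print(f"Number needed to treat:  {number_needed_to_treat:.1f}")
```

A relative risk of about 0.55 is what underpins the “almost halve the risk of death” phrasing in the quote that follows.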

Peter Hutchinson, Professor of Neurosurgery at the Department of Clinical Neurosciences at Cambridge, says: “Traumatic brain injury is an incredibly serious and life-threatening condition. From our study, we estimate that craniectomies can almost halve the risk of death for patients with a severe traumatic brain injury and significant swelling. Importantly, this is the first high-quality clinical trial in severe head injury to show a major difference in outcome. However, we need to be really conscious of the quality of life of patients following this operation which ranged from vegetative state through varying states of disability to good recovery.”

Angelos Kolias, Clinical Lecturer at the Department, adds: “Doctors and families will need to be aware of the wide range of possible long-term outcomes when faced with the difficult decision as to whether to subject someone to what is a major operation. Our next step is to look in more detail at factors that predict outcome and at ways to reduce any potential adverse effects following surgery. We are planning to hold a consensus meeting in Cambridge next year to discuss these issues.”

The research was funded by the Medical Research Council (MRC) and managed by the National Institute for Health Research (NIHR) on behalf of the MRC–NIHR partnership, with further support from the NIHR Cambridge Biomedical Research Centre, the Academy of Medical Sciences, the Health Foundation, the Royal College of Surgeons of England and the Evelyn Trust.

Reference

Hutchinson, PJ et al. Trial of Decompressive Craniectomy for Traumatic Intracranial Hypertension. NEJM; 2 Sept 2016

Craniectomy – a surgical procedure in which part of the skull is removed to relieve brain swelling – significantly reduces the risk of death following traumatic brain injury, an international study led by the University of Cambridge has found. 

Firefighters Training for Operation Fodient


Stolen World War Two letters help author uncover the hidden lives of army wives


Army Wives by Midge Gillies, Academic Director for Creative Writing at the Institute of Continuing Education (ICE), uses first-hand accounts, diaries and letters to piece together some of the extraordinary stories of servicemen’s wives through history – from Crimea to the war in Afghanistan.

Exploring all aspects of army life across the centuries – from the impact of life-changing injuries to séances, public memorials and death in foreign fields – Army Wives seeks to understand the singular experience of what it means for women to be part of the ‘army family’.

But it is perhaps the wartime letters of Diana Carnegie to her husband James which provide the most personal, colourful and touching accounts of a life wedded to both the soldier she loved, and the uncertain life of a military wife.

“I wanted a distinctive voice that took me beyond the familiar stories of bombing, blackouts and barrage balloons of the Second World War,” said Gillies. “Then I read a piece in The Telegraph about a cache of letters being sold at auction which provided an uncensored account of the war that wasn’t available in Pathe newsreels: Diana talked about couples having sex outside Buckingham Palace on VE Day and ‘getting tiddly’ on the way to hear a speech by Ernest Bevin.”

However, Gillies’ joy at outbidding her rivals at auction for the letters was short-lived when the auction house phoned her to reveal that although the letters were sold in good faith, they had in fact been stolen as part of a house burglary more than a decade earlier, and that she should expect a call from Kent Constabulary.

Sophie Carnegie, one of Diana’s two surviving daughters, only learnt about the letters’ reappearance a day after the auction, thanks to a chance phone conversation with someone who mentioned the Telegraph piece in passing. Sophie and her twin sister Charlotte both realised the letters must have come from a chest stolen when their parents’ house was burgled after Diana Carnegie’s death in 1998.

“Fortunately, the family were delighted I wanted to write about their parents and would help me as much as they could,” added Gillies. “It didn’t take long to see that this was rich material. Diana wrote about the Home Guard and her fears of invasion right up until the terror of the V1 bombs and, finally, the agony of waiting for her husband to be demobbed at the war’s end. Her voice was witty, sassy and vivid – I liked her immediately.”

 

As well as drawing on Carnegie’s letters, Gillies spoke with around 30 current and former army wives and visited archives across the UK in a sometimes difficult search for the voices of the women who either were left behind or made the arduous journey to the front lines with their husbands.

Although it seems incredible today, the wives of British soldiers fighting in the Crimea were among the last of many to witness battle at close quarters; travelling with their husbands or sometimes stowing away on board Royal Navy ships in an effort not to be parted. Army wives, especially those married to lower-ranking men, often suffered terrible hardships and lived in squalor alongside their husbands, spending years in distant parts of the Empire, or accompanying their husbands from one seat of unrest to the other.

When a regiment was ordered abroad, a certain number of places were allocated for the wives of ordinary soldiers. In 1800, six women per 100 were allowed to go with their husbands. When soldiers began to travel further afield this rose to 12 per 100 men in India, China and New South Wales, and by the 1870s it was one in eight soldiers.

The wives drew lots to determine who would accompany their husbands in a tense and very public ritual that was usually left to the very last minute to avoid the risk of desertion if a man found his wife was to be left behind.

“This most cruel of lucky dips took place either in a room into which the wives filed in order of their husbands’ rank, or sometimes, at the very dock where the soldiers’ ship was waiting,” added Gillies. “This led to harrowing scenes in which distraught wives waited to find out their fate; the wrong scrap of paper or the wrong-coloured pebble meant they may not see their husband for several years – if ever again.”

In her book, Gillies recounts the experience of 24-year-old Nell Butler, who followed her husband Michael, a private in the 95th Derbyshire, to the Crimea aboard troop ships and on 20-mile-a-day marches. Watching from a ship as a major battle commenced, Nell pleaded to be allowed ashore to search for Michael, fearing he must have been injured in the fierce fighting.

Once ashore, Nell trudged her way to Balaklava, where she searched hospital ships and was mistaken for a nurse, being called into action to hold a soldier’s hand as his leg was amputated without anaesthetic. Despite fainting, she earned herself a nursing role, tearing up her petticoats as makeshift bandages to treat the most appalling battlefield injuries.

Eventually, she found the badly injured Michael and accompanied him to a hospital 300 miles away, where she is thought to have served under Florence Nightingale in the hellish conditions that became synonymous with the conflict and with the reforms of battlefield medicine and surgery.

Not that conditions for soldiers and their wives were markedly better at home. Army Wives reveals how overcrowding, poor hygiene, and a lack of basic cleaning facilities meant that diseases such as typhoid and tuberculosis were often rife, and their toll catastrophic.

In 1864 there was an outbreak of scarlet fever among army children at Aldershot, and between 1865 and 1874, 120 children living in huts on Woolwich Common died of the same disease or of diphtheria, at a much higher rate than in the civilian population.

Disease was by no means confined to home barracks. Husbands returning from service abroad often brought unwanted gifts back to their wives. The steady supply of prostitutes to army camps led to one estimate, in the middle of the 19th century, that around one quarter of the British Army had VD.

Rates of infection remained high in India, rising to 438 admissions per 1,000 men in 1890-93 – double the rate for the British Army at home, and almost six times that of the German Army. This was partly why more wives were allowed to follow their husbands to the subcontinent.

In the 20th century, two world wars produced new generations of army wives and widows who lived through separation, injury and the deaths of husbands by forging friendships that lasted into peacetime. More recently, the Cold War and the war on terror have produced a new breed of more independent women who have supported their loved ones through an evolving landscape of combat operations.

“While the roles, expectations and the day-to-day lives of army wives may have altered over time, there were constant recurring themes as I wrote the book,” added Gillies.  “Accommodation has always been a bone of contention and the state of army housing remains a real cause for concern today.

“Likewise, although communication is a lot easier than the days of letters and telegrams, our era of instant communication brings with it its own problems when husbands in difficult and demanding situations are available on a daily basis via Facebook or Skype to hear that Jonny isn’t doing his homework or that the washing machine is on the blink when there is nothing they can do about it from such a distance.”

The strain is evident in divorce rates for soldiers and their wives, which remain much higher than those for couples in civilian life. Many army wives put their careers second, becoming, in effect, single mothers for the time their husbands are deployed. Likewise, they often face the strain of uprooting their lives, and their children’s lives, time and again for new postings in the UK and overseas.

“The lot of an army wife is waiting, being there to support and almost being gagged in a sense,” said Gillies. “A lot of the wives I spoke to seemed inhibited about speaking to me either because they feared getting their husbands into trouble, or because of their fears about the war on terror after the death of Fusilier Lee Rigby.

“But on the plus side, the friendship networks they develop are fantastic and for those who throw themselves into the life, the experience can be a great one. There was a real sense of service among many of the wives I spoke to – even if their lives can sometimes be very lonely and unpredictable.”

Gillies was also struck by the importance that couples still place on letters. Lyrics for the song Wherever You Are, which was written as a result of Gareth Malone’s TV programme The Choir: Military Wives (2011) and which reached Number One, were based on letters and poems. For army families the letter is still king – even if it is delivered electronically before being printed out as an “e-bluey”. While the rest of us have abandoned letters in favour of texts and other forms of electronic communication, the Army should provide rich pickings for future historians.

A stolen chest of letters – penned by an army wife to her husband on the battlefields of the Second World War – has helped a Cambridge academic and biographer trace the history of the women behind the men in uniform.

Diana talked about couples having sex outside Buckingham Palace on VE Day and ‘getting tiddly’.
Midge Gillies
Diana Carnegie with her husband James and her children Charlotte, Sue and Sophie


Lines of Thought: Understanding Anatomy


Since March, some of the world’s most valuable books and manuscripts have been on display as Cambridge University Library celebrates its 600th birthday with a once-in-a-lifetime free exhibition of its greatest treasures.

The objects in Lines of Thought: Discoveries that Changed the World, which will close to the public on September 30, communicate 4,000 years of human thought through the Library’s unique and irreplaceable collections. More than 70 per cent of the exhibits are displayed to the public for the first time.

The exhibition investigates through six distinct themes how Cambridge University Library’s eight million books and manuscripts have transformed our understanding of life here on earth and our place among the stars.

In Understanding Anatomy, curator Anna Jones reflects on some of the different ways in which the anatomy of the body has been represented over time and for different purposes, both on the page and in 3D.

“For many people the stand-out object of Lines of Thought is Vesalius’ Epitome, published in Switzerland in 1543,” said Jones. “Vesalius specially commissioned the illustrations for the work to promote his thesis that the practice of dissection was essential to the study of anatomy.

“The University Library’s copy of the Epitome is very special because it’s hand-coloured, probably for presentation to an important patron, and the colouring really brings out the detail in the different layers of the manikin.”

Dissection had been practised during ancient times by the great Roman physician Galen, but had fallen out of use as a teaching method in the west during the middle ages until it was revived during the European renaissance. Early 16th century English students commonly travelled to the great centres of medical education in Italy or France to benefit from the influence of Vesalius and others, and eventually the techniques they learnt there were adopted at home.

“In Lines of Thought we also have the earliest-known written record of a dissection in England,” added Jones. “The book belonged to Thomas Lorkyn, who was Regius Professor of Physic in Cambridge. On 28 March, 1565, Lorkyn hosted one at Magdalene College, using the body of former criminal Richard or Ralph Tiple, recently hanged at Cambridge Castle just across the road.

“The dissection was carried out by a professional surgeon from London, while Lorkyn, the ‘instructor’, read out from a book – quite possibly the one on display here – and the students gathered round to watch and learn.”

Lorkyn left his books to the UL in his will at his death in 1591, the same year that exquisite models of a skeleton and musculature, also on display, were presented to the Library by the leading London barber surgeon John Banister. Such objects highlight that from relatively early on the University Library was a place to find current material for teaching and learning, as well as a repository for safe-keeping.

 

Learning from the body itself may be the ideal, but the limited supply of cadavers and challenges to preservation are some of the reasons people have looked to models to provide good substitutes for learning anatomy. Paper works well as a relatively cheap medium that you can fold, layer, and cut out, and the exhibits in the exhibition give a flavour of the variety across the University collections as a whole.

“The study and practice of anatomy – seeing and doing – remains an important strand in University teaching and research today,” added Jones. “Lines of Thought reminds us of some of the significant points in the development of our understanding of our bodies and how they function, and crucially how the books and objects on display have played a part in shaping the present.”

A hand-coloured copy of Vesalius’ 1543 Epitome – one of the most influential works in western medicine – and the first written record of a dissection carried out in England are among the objects in our latest film celebrating Lines of Thought at Cambridge University Library.

The study and practice of anatomy – seeing and doing – remains an important strand in University teaching and research today.
Anna Jones


Quadruple helix form of DNA may aid in the development of targeted cancer therapies


Scientists have identified where a four-stranded version of DNA exists within the genome of human cells, and suggest that it may hold a key to developing new, targeted therapies for cancer. 

In work funded by Cancer Research UK and EMBO, the researchers, from the University of Cambridge, found that these quadruple helix structures occur in the regions of DNA that control genes, particularly cancer genes, suggesting that they may play a role in switching genes on or off. The results, reported in the journal Nature Genetics, could also have implications for cancer diagnostics and the development of new targeted treatments. 

Most of us are familiar with the double helix structure of DNA, but there is also a version of the molecule which has a quadruple helix structure. These structures are often referred to as G-quadruplexes, as they form in the regions of DNA that are rich in the building block guanine, usually abbreviated to ‘G’. These structures were first found to exist in human cells by the same team behind the current research, but at the time it was not exactly clear where these structures were found in the genome, and what their role was, although it was suspected that they had a link with certain cancer genes.
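
The sequence rule behind these structures is simple enough to sketch in code. The snippet below is purely illustrative and is not the sequencing-based mapping used in the study: it scans a DNA string for the canonical putative G-quadruplex motif (four runs of three or more guanines separated by short loops), a common bioinformatic heuristic, and the example sequence and function name are invented for demonstration.

import re

# Canonical putative G-quadruplex-forming motif: four runs of three or more
# guanines separated by loops of 1-7 bases (a common heuristic, not the
# experimental mapping method described in the article).
G4_MOTIF = re.compile(r"(?:G{3,}[ACGT]{1,7}){3}G{3,}")

def find_putative_g4(sequence):
    """Return (start, end, motif) for each putative G-quadruplex in a DNA string."""
    return [(m.start(), m.end(), m.group()) for m in G4_MOTIF.finditer(sequence.upper())]

promoter_like = "TTGGGAGGGTAGGGAAGGGCTACGATCG"   # invented example sequence
for start, end, motif in find_putative_g4(promoter_like):
    print(f"putative G-quadruplex at {start}-{end}: {motif}")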

“There have been a number of different connections made between these structures and cancer, but these have been largely hypothetical,” said Professor Shankar Balasubramanian, from Cambridge’s Department of Chemistry and Cancer Research UK Cambridge Institute, and the paper’s senior author. “But what we’ve found is that even in non-cancer cells, these structures seem to come and go in a way that’s linked to genes being switched on or off.” 

Starting with a pre-cancerous human cell line, the researchers used small molecules to change the state of the cells in order to observe where the G-quadruplexes might appear. They detected approximately 10,000 G-quadruplexes, primarily in regions of DNA associated with switching genes on or off, and particularly in genes associated with cancer. 

“What we observed is that the presence of G-quadruplexes goes hand in hand with the output of the associated gene,” said Balasubramanian. This suggests that G-quadruplexes may play a similar role to epigenetic marks: small chemical modifications which affect how the DNA sequence is interpreted and control how certain genes are switched on or off. 

The results also suggest that G-quadruplexes hold potential as a molecular target for early cancer diagnosis and treatment, in particular for so-called small molecule treatments which target cancer cells, instead of traditional treatments which hit all cells. 

“We’ve been looking for an explanation for why it is that certain cancer cells are more sensitive to small molecules that target G-quadruplexes than non-cancer cells,” said Balasubramanian. “One simple reason could be that there are more of these G-quadruplex structures in pre-cancerous or cancer cells, so there are more targets for small molecules, and so the cancer cells tend to be more sensitive to this sort of intervention than non-cancer cells. 

“It all points in a certain direction, and suggests that there’s a rationale for the selective targeting of cancer cells.” 

“We found that G-quadruplexes appear in regions of the genome where proteins such as transcription factors control cell fate and function,” said Dr Robert Hänsel-Hertsch, the paper’s lead author. “The finding that these structures may help regulate the way that information is encoded and decoded in the genome will change the way we think this process works.”

Dr Emma Smith, Cancer Research UK’s science information manager, said: “Figuring out the fundamental processes that cancer cells use to switch genes on and off could help scientists develop new treatments that work against many types of the disease. And exploiting weaknesses in cancer cells could mean this approach would cause less damage to healthy cells, reducing potential side effects. It’s still early days, but promising leads like this are where the treatments of the future will come from.”

Reference:
Robert Hänsel-Hertsch et al. ‘G-quadruplex structures mark human regulatory chromatin.’ Nature Genetics (2016). DOI: 10.1038/ng.3662

Researchers have identified the role that a four-stranded version of DNA may play in cancer progression, and suggest that it may be used to develop new targeted cancer therapies.

It all points in a certain direction, and suggests that there’s a rationale for the selective targeting of cancer cells.
Shankar Balasubramanian
Crystal structure of parallel quadruplexes from human telomeric DNA.


Major global study reveals new hypertension and blood pressure genes


The discoveries include DNA changes in three genes that have much larger effects on blood pressure in the population than previously seen, providing new insights into the physiology of hypertension and suggesting new targets for treatment.

High blood pressure or hypertension is a major risk factor for cardiovascular disease and premature death. It is estimated to be responsible for a larger proportion of global disease burden and premature mortality than any other disease risk factor. However, there is limited knowledge on the genetics of blood pressure.

The teams investigated the genotypes of around 347,000 people and their health records to find links between their genetic make-up and cardiovascular health. The participants included healthy individuals and those with diabetes, coronary artery disease and hypertension, from across Europe (including the UK, Denmark, Sweden, Norway, Finland and Estonia), the USA, Pakistan and Bangladesh. The study brought together around 200 investigators from across 15 countries.

Study author Professor Patricia Munroe from QMUL said: “We already know from earlier studies that high blood pressure is a major risk factor for cardiovascular disease. Finding more genetic regions associated with the condition allows us to map and understand new biological pathways through which the disease develops, and also highlight potential new therapeutic targets. This could even reveal drugs that are already out there but may now potentially be used to treat hypertension.”

Most genetic blood pressure discoveries until now have been of common genetic variants that have small effects on blood pressure. The study, published in Nature Genetics, has found variants in three genes that appear to be rare in the population, but have up to twice the effect on blood pressure.

“The sheer scale of our study has enabled us to identify genetic variants carried by less than one in a hundred people that affect blood pressure regulation,” said study author, Dr Joanna Howson from Cambridge’s Department of Public Health and Primary Care. “While we have known for a long time that blood pressure is a risk factor for coronary heart disease and stroke, our study has shown that there are common genetic risk factors underlying these conditions.”
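
As a rough illustration of the statistics behind such findings (a sketch of a standard single-variant association test, not the study’s actual analysis pipeline), blood pressure can be regressed on the number of copies of the rare allele alongside covariates, and the per-allele coefficient is the estimated effect. The data and effect sizes below are simulated and the variable names invented.

import numpy as np

# Minimal single-variant association sketch: SBP ~ intercept + genotype + age.
# All data here are simulated for illustration only.
rng = np.random.default_rng(0)
n = 5000
genotype = rng.binomial(2, 0.01, size=n)          # rare variant, roughly 1% allele frequency
age = rng.normal(55, 10, size=n)
sbp = 120 + 0.4 * age + 2.5 * genotype + rng.normal(0, 15, size=n)

X = np.column_stack([np.ones(n), genotype, age])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, sbp, rcond=None)
residuals = sbp - X @ beta
sigma2 = residuals @ residuals / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
print(f"per-allele effect: {beta[1]:.2f} mmHg (standard error {se[1]:.2f})")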

RBM47 is a gene that encodes for a protein responsible for modifying RNA. RRAS is involved in cell cycle processes and has already been implicated in a syndrome related to ‘Noonan syndrome’ which is characterised by heart abnormalities. COL21A1 is involved in collagen formation in many tissues, including the heart and aorta. COL21A1 and RRAS warrant particular interest since both are involved in blood vessel remodelling, with relevance to hypertension.

The team also found a mutation in the gene ENPEP that affects blood pressure. This gene codes for an enzyme that plays a key role in regulating blood pressure through the dilation and constriction of blood vessels, and is currently a therapeutic target.

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation which part-funded the research, said: “Large scale genetic studies continue to expand the number of genes that may contribute to the development of heart disease, or risk factors such as high blood pressure. But so far most of the genes discovered in these studies individually have only very small effects on risk – though they may still provide valuable clues for new drug targets.

“This study has increased the number of genes implicated in control of blood pressure to almost 100 and, in the process, has also identified three genes that have larger effects on blood pressure than previously found.”

The study was also funded by the National Institute for Health Research (NIHR), the National Institutes of Health (NIH), the Wellcome Trust and the Medical Research Council.

Reference:
Surendran et al. ‘Trans-ancestry meta-analyses identify rare and common variants associated with blood pressure and hypertension’. Nature Genetics 2016. DOI: 10.1038/ng.3654

Thirty-one new gene regions linked with blood pressure have been identified in one of the largest genetic studies of blood pressure to date, involving over 347,000 people, and jointly led by Queen Mary University of London (QMUL) and the University of Cambridge. 

While we have known for a long time that blood pressure is a risk factor for coronary heart disease and stroke, our study has shown that there are common genetic risk factors underlying these conditions.
Joanna Howson
Pulmonary hypertension-associated vasculitis


Placenta plays pivotal “umpire” role to influence pregnancy outcomes


Researchers have shown for the first time how the placenta “umpires” a fight for nutrients between a pregnant mother and her unborn baby. The study suggests that the placenta will adjust the amount of nutrients transported to the foetus for growth in line with the mother’s physical ability to supply.

The findings, published in the journal PNAS, suggest that if the bodily environment that a mother provides for her baby is unfavourable, for example through small body size or metabolic dysfunction, the placenta will change the flow of nutrients to the foetus relative to her own state. This can affect foetal development, resulting in complications at birth.

It is the first time that scientists have been able to provide clear evidence that the placenta plays the decisive role in this delicate balancing act, rather than merely acting as a passive interface which enables the transfer of nutrients from mother to foetus.

The study, by researchers at the University of Cambridge, involved making a precise genetic change in mice, which caused poor growth and changed the mother’s bodily environment. They then observed how the placenta developed and acted in response, finding that in mothers in which this alteration had been made, the structure of the placenta was different, and fewer nutrients reached the foetus.

A better understanding of how the placenta manages the trade-off will eventually enable researchers to reduce pregnancy complications in both humans and other mammals.

The study was led by Dr Amanda Sferruzzi-Perri, a Research Associate at St John’s College, University of Cambridge, and is part of a five-year project in the Department of Physiology, Development and Neuroscience examining the relationship between the placenta and pregnancy complications.

“During pregnancy there is a kind of ‘tug-of-war’ going on between the mother and the foetus over who gets the nutrients that the mother ingests,” Sferruzzi-Perri said. “This work shows for the first time that the placenta is the umpire which controls that fight. Understanding more about the placenta’s role is extremely important. If nutrients cannot be divided correctly during pregnancy, it can lead to life-threatening complications for expectant mothers, and long-term health consequences for both mother and child.”

At least one in every eight pregnancies in the UK is affected by complications stemming from impairment of the placenta. In the developing world the rate is even higher, with at least one in every five pregnant women affected. The potential consequences include abnormal birth weight, premature delivery, pre-eclampsia, and maternal diabetes.

A major cause appears to be the placenta’s response to unfavourable biological changes in the mother herself. These may, for example, be the result of poor nutrition, high stress levels, metabolic dysfunction, or obesity.

How the placenta allocates nutrients in these situations, however, and the hormonal signals that the placenta may be releasing while doing so, is not fully understood. By understanding these processes better, researchers hope to identify both the biological early warning signals that a problem has arisen, and their relationship to specific causes, enabling them to develop therapeutic interventions that reduce the number of complications overall.

The new study represents a step towards those aims because researchers were able to directly influence the balancing act that the placenta performs and observe it in relation to both the physiology of the mother, and the actual growth and nutrient supply of the foetus.

To achieve this they used a model system where an enzyme called p110 alpha was genetically modified in mice. In a healthy mother, this enzyme is activated by hormones like insulin and insulin-growth factors (IGFs), kick-starting a relay race within cells which stimulates nutrient uptake and, as a result, normal growth and metabolic function. By altering this enzyme, the team reduced the mother’s overall responsiveness to such hormones, creating an unfavourable environment.

The results showed that in mothers which carried the altered form of p110 alpha, the placenta’s growth and structure was impaired. As well as being physically different, it was also found to be transporting fewer nutrients to the unborn offspring.

Because of the way in which the experiments were set up, the team were also able to see what would happen to the placenta if the foetus carried the altered form of p110 alpha, but the mother was normal. They found that in these cases, the placenta also showed defects, but was able to compensate for this by transporting more nutrients to the foetus, and thus optimising nutrition.

This shows that the placenta will fine-tune the distribution of nutrients between the mother and foetus, in response to the circumstances in which it finds itself. It also indicates that, because the mother needs to be able to support her baby both during pregnancy and after birth, the placenta will do its best to judge how much nutrition the foetus receives, so that the mother’s health is not compromised.

“The placenta is taking in signals all the time from the mother and the foetus,” Sferruzzi-Perri explained. “If the mother has some sort of defect in her ability to grow, the placenta will limit the amount of nutrients it allocates to the foetus to try and preserve her health.”

“What this tells us is that the mother’s environment is a very strong, modifiable characteristic to which we should be paying more attention, in particular to see if there are specific factors that we can change to improve the outcome of pregnancies. Being able to influence the mother’s environment through changes in p110 alpha gives us a means to study this in a controlled way, and to work out what those critical factors are.”

The next stage of the research will involve examining the signals that the placenta sends to the mother to affect the way she uses the nutrients she ingests, potentially providing important clues about biomarkers which provide an early warning of pregnancy complications.

Dr Sferruzzi-Perri’s research is supported by a Dorothy Hodgkin Fellowship from the Royal Society. Her paper, Maternal and fetal genomes interplay through phosphoinositol 3-kinase (PI3K)-p110α signalling to modify placental resource allocation, is published in PNAS. The work was supported by a Next Generation Fellowship from the Centre for Trophoblast Research.

New research provides the first clear evidence that the amount of nutrients transported to the foetus by the placenta adjusts according to both the foetal drive for growth, and the mother’s physical ability to provide.

During pregnancy there is a kind of ‘tug-of-war’ going on between the mother and the foetus over who gets the nutrients that the mother ingests. This work shows for the first time that the placenta is the umpire which controls that fight
Amanda Sferruzzi-Perri
"Pregnant".


Opinion: Imposing an arbitrary national language would only divide Pakistan further


For a country seven decades old, Pakistan is dealing with a surprisingly fundamental political and cultural problem: a struggle over what language to use for government.

The Supreme Court has ordered the government to use the constitutionally-mandated national language, Urdu, in place of English in the many contexts where English is currently used. (Ironically, the court’s order was itself written in English.) Prime Minister Nawaz Sharif has declared his enthusiasm for the transition to Urdu, and a committee was constituted to monitor its progress.

But is imposed monolingualism a good fit for South Asia – or does it in fact follow a very Eurocentric idea of how a nation-state should work?

This discussion has been rumbling on and off ever since India and Pakistan achieved independence. Both of their post-colonial constitutions required that after 15 years, English should be officially replaced by Urdu and Hindi respectively, but both countries eventually side-stepped the requirement. Pakistan continued to use English without comment alongside Urdu; India declared it a “subsidiary official language”, symbolically inferior to Hindi but nonetheless still recognised.

Today, the problem comes in how narrowly Urdu and Hindi are defined by the bodies tasked with monitoring and developing the official languages.

Pakistan’s National Language Promotion Department (formerly the National Language Authority) and India’s Department of Official Language both have a reputation for filling their respective languages with clunky neologisms. These are used to avoid common English loanwords; Hindi ones are drawn largely from Sanskrit, and Urdu’s largely from Arabic and Persian.

The people who complain about the language policy aren’t necessarily trying to maintain their English-speaking privilege; there really are genuine questions about the character of the official language. If its speakers commonly use words that aren’t recognised by governmental language bodies, is it right to have a two-track system in which there is a governmental variety of a language and a very different one that normal people use?

Taking a hard line against English as a colonial language makes little sense decades after independence, especially when it has become the language of international business and when English loanwords have become embedded in people’s everyday usage in other South Asian languages. And looking back over history, this is a very recent argument anyway.

The scorched-earth cultural politics of imposing a national language never took hold in the subcontinent before modern India and Pakistan came into being. Persian was the apex language during Mughal times and well into the era of British colonial rule, but it never overwhelmed the subcontinent’s longstanding linguistic diversity.

Many modern historians never think to question the colonial line that Persian was “thoroughly debasing and worthless” in India, but this is a fiction; I myself wrote an entire book arguing against the idea that Persian was a foreign imposition that patriotic Indians never really embraced. In reality, people used the languages available to them, making allowances for difference and freely taking words from other languages.

It was recognised, as the old Hindi saying has it, that in South Asia “kos kos par bhasha badle, do kos par pani” or “the language changes every mile, and the taste of the water every two miles”.

Overridden and overwhelmed

In Europe, where national languages are largely a foregone conclusion, we tend to forget how brutal and undemocratic their imposition was.

Languages other than English, notably Irish and Welsh, were repressed across the British Isles in early modern times. The 1536 Welsh Act of Union, for example, excluded Welsh speakers from all government posts.

Across the English Channel, the adoption of standard French involved centuries of violent confrontation with Occitan and Breton speakers. The 1539 Ordinance of Villers-Cotterêts, which replaced Latin with French in legal documents, has often been read as an act of popular liberation from the dead hand of Latin, but from the minority-language perspective it was a disaster; whereas all linguistic communities had previously used the same Latin documents, now only one community was represented.

But while Europe’s worst battles over minority and non-standard languages have been largely swept under the rug in recent centuries, radically multilingual India and Pakistan simply don’t have enough rug to do the same. An unintended consequence of decolonisation has been an almost colonial imposition of artificial, non-colloquial registers of Hindi and Urdu by Indian and Pakistani elites, who are concerned that without a unifying national language their nations will face devastating social and political disintegration.

This is misguided. Instead of repeating some of the unsavoury linguistic nationalism of early modern Europe, these elites should celebrate the wide variation in usage. They should acknowledge the ways Hindi and Urdu mix with languages like English and Punjabi, and make allowances for the complexity of language in society.

Far too little attention is routinely paid to how the citizens themselves might wish to speak. Nothing illustrates this more poignantly than a 1951 speech by the Aga Khan, in which he argued that the only possible national language for the new Islamic Republic of Pakistan could be Arabic. While he addressed the point that Urdu was the mother tongue of a tiny minority of Pakistanis and thus apparently unsuitable as a national language, he did not acknowledge the undeniable fact that Arabic was the mother tongue of precisely 0% of Pakistanis.

And of course, he gave the speech in English.

Arthur Dudney, Leverhulme Early Career Fellow, Asian and Middle Eastern Studies, University of Cambridge

This article was originally published on The Conversation. Read the original article.

The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.


Arthur Dudney (Faculty of Asian and Middle Eastern Studies) discusses Pakistan's struggle over what language to use for government.

Getting girls into school in Pakistan's Punjab region


Gaia results revealed – first data release from the most detailed map ever made of the sky


Detailed information about more than a billion stars in the Milky Way has been published in the first data release from the Gaia satellite, which is conducting the first-ever ‘galactic census.’ The release marks the first chance astronomers and the public have had to get their hands on the most detailed map ever made of the sky.

Gaia, which orbits the sun at a distance of 1.5 million kilometres from the earth, was launched by the European Space Agency in December 2013 with the aim of observing a billion stars and revolutionising our understanding of the Milky Way. During its expected five-year lifespan, Gaia will observe each of a billion stars about 70 times.

The unique mission is reliant on the work of Cambridge researchers who collect the vast quantities of data transmitted by Gaia to a data processing centre at the University, overseen by a team at the Institute of Astronomy.

“Gaia’s first major data release is both a wonderful achievement in its own right, and a taster of the truly dramatic advances to come in future years,” said Professor Gerry Gilmore from the Institute of Astronomy, who is also the UK Principal Investigator for Gaia. “Several UK teams have leading roles in Gaia’s Data Processing and Analysis efforts, which convert the huge raw data streams from the satellite into the beautiful science-ready information now made available for the global scientific and public communities. UK industry made critical contributions to the Gaia spacecraft. The UK public, including school students, as well as scientists, are sharing the excitement of this first ever galactic census.”

In addition to the work taking place at Cambridge, teams from Edinburgh, the Mullard Space Science Laboratory (MSSL) at UCL, Leicester, Bristol and the Science and Technology Facilities Council’s Rutherford Appleton Laboratory are all contributing to the processing of the vast amounts of data from Gaia, in collaboration with industrial and academic partners from across Europe.

The team in Cambridge, led by Dr Floor van Leeuwen, Dr Dafydd Wyn Evans and Dr Francesca De Angeli, processed the flux information – the amount of energy that crosses a unit area per unit time – providing the calibrated magnitudes of around 1.6 billion stars, 1.1 billion of which are now published as part of the first data release.
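
For readers curious what producing “calibrated magnitudes” from flux involves, the conversion follows the standard logarithmic relation m = ZP - 2.5 log10(flux). The sketch below is illustrative only; the zero point is a placeholder value, not the actual Gaia calibration.

import numpy as np

# Illustrative flux-to-magnitude conversion: m = ZP - 2.5 * log10(flux).
# ZERO_POINT is a placeholder, not the real Gaia G-band zero point.
ZERO_POINT = 25.0

def flux_to_mag(flux, zero_point=ZERO_POINT):
    """Convert an instrumental flux (e.g. electrons per second) to a magnitude."""
    return zero_point - 2.5 * np.log10(np.asarray(flux, dtype=float))

print(flux_to_mag([1e4, 1e5, 1e6]))   # each factor of 10 in flux is 2.5 magnitudes brighter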

The Cambridge team also check the daily photometric data for unexpected large outliers, which led to the regular publication of photometric science alerts that were ready for immediate follow-up observations from the ground.

“The sheer volume of data processed for this first release is beyond imagination: around 120 billion images were analysed, and most of these more than once, as all the processing is iterative,” said van Leeuwen, who is Gaia photometric data processing lead. “Many problems had to be overcome, and a few small ones still remain. Calibrations have not yet reached their full potential. Nevertheless, we are already reaching accuracies that are significantly better than expected before the mission, and which can challenge most ground-based photometric data in accuracy.”

“This first Gaia data release has been an important exercise for the Gaia data reduction teams, getting them to focus on deliverable products and their description,” said Evans. “But it is only the first small step towards much more substantial results.”

While today marks the first major data release from Gaia, in the two years since its launch, the satellite has been producing scientific results in the form of Gaia Alerts.

Dr Simon Hodgkin, lead of the Cambridge Alerts team, said: “The Gaia Alerts project takes advantage of the fact that the Gaia satellite scans each part of the sky repeatedly throughout the mission. By comparing successive observations of the same patch of sky, scientists can search for transients – astronomical objects which brighten, fade, change or move. These transients are then announced to the world each day as Gaia Alerts, for both professional and amateur astronomers to observe with telescopes from the ground.”
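
The comparison Hodgkin describes can be caricatured in a few lines: flag any source whose latest measurement departs sharply from its historical baseline. The threshold and the numbers below are invented for illustration and are not the actual Gaia Alerts selection criteria.

import numpy as np

# Toy transient flag: compare the newest magnitude with the historical median.
# Threshold and data are illustrative, not the real alert criteria.
def is_transient_candidate(history_mags, new_mag, threshold=0.3):
    baseline = np.median(history_mags)
    scatter = np.std(history_mags)
    return abs(new_mag - baseline) > max(threshold, 5 * scatter)

history = [18.02, 18.05, 17.98, 18.01]            # earlier epochs of the same source
print(is_transient_candidate(history, 16.9))      # True: the source has brightened sharply
print(is_transient_candidate(history, 18.03))     # False: consistent with the baseline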

The range of Gaia’s discoveries from Science Alerts is large – supernovae of various types, cataclysmic variable stars, novae, flaring stars, gravitational microlensing events, active galactic nuclei and quasars, and many sources whose nature remains a mystery.

Gaia has discovered many supernovae, the brilliant explosions of stars at the end of their lives. Many of these have been ‘Type Ia’ supernovae, which can be used to measure the accelerating expansion of the Universe. But among these apparently typical supernovae there have been some rarer events. Gaia16ada was spotted by Gaia in the nearby galaxy NGC4559, and appears to be an eruption of a very massive, unstable star. The Hubble Space Telescope observed this galaxy some years ago, allowing astronomers to pinpoint the precise star which erupted.

Another lucky catch for Gaia was the discovery of Gaia16apd – a supernova which is nearly a hundred times brighter than normal. Astronomers still don't know what the missing ingredient in these ultra-bright supernovae is, and candidates include exotic rapidly spinning neutron stars, or jets from a black hole. Cambridge astronomer Dr Morgan Fraser is trying to understand these events, saying, “We have only found a handful of these exceptionally bright supernovae, compared to thousands of normal supernovae. For Gaia to spot one so nearby is a fantastic result.”

Many of the Gaia Alerts found so far are bright enough to be observable with a small telescope. Amateur astronomers have taken images of supernovae found by Gaia, while schoolchildren have used robotic telescopes including the Faulkes Telescopes in Australia and Hawaii to do real science with transients.

Dr Hodgkin said: “Since the announcement of the first transients discovered with Gaia in September 2014, over one thousand alerts have been released. With Gaia continually relaying new observations to ground, and our team working on finding transients continually improving their software, the discoveries look set to continue well into the future!”

For the UK teams the future means providing improvements in the pre-processing of the data and extending the processing to cover the photometric Blue and Red prism data. Also data from the Radial Velocity Spectrometer, with major involvement from MSSL, will be included in future releases. The photometric science alerts will continue to operate throughout the mission, and summaries of the results will be included in future releases. “Despite the considerable amount of data, the first Gaia data release provides just a taste of the accuracy and completeness of the final catalogue,” said De Angeli.

The Cambridge Gaia team has also released a dedicated smartphone app, which will allow anyone worldwide to follow the Gaia Alerts discoveries as they happen. Real spacecraft data will be available to the world as soon as it is processed, with users able to follow the discoveries and see what they are. Information to access the app is available at https://gaia.ac.uk.

The Gaia data processing teams in the UK have been and are being supported by the UK Space Agency and the STFC. STFC helped to set up the data applications centre, and its current support involves the UK exploitation of the scientific data yielded by the mission. Industrial partners include Airbus, MSSL and e2v Technologies.

The first results from the Gaia satellite, which is completing an unprecedented census of more than one billion stars in the Milky Way, are being released today to astronomers and the public.

Gaia’s first major data release is both a wonderful achievement in its own right, and a taster of the truly dramatic advances to come in future years.
Gerry Gilmore
Gaia’s first sky map


A tight squeeze for electrons – quantum effects observed in ‘one-dimensional’ wires


Scientists have controlled electrons by packing them so tightly that they start to display quantum effects, using an extension of the technology currently used to make computer processors. The technique, reported in the journal Nature Communications, has uncovered properties of quantum matter that could pave a way to new quantum technologies.

The ability to control electrons in this way may lay the groundwork for many technological advances, including quantum computers that can solve problems fundamentally intractable by modern electronics. Before such technologies become practical however, researchers need to better understand quantum, or wave-like, particles, and more importantly, the interactions between them.

Squeezing electrons into a one-dimensional ‘quantum wire’ amplifies their quantum nature to the point that it can be seen, by measuring at what energy and wavelength (or momentum) electrons can be injected into the wire.

“Think of a crowded train carriage, with people standing tightly packed all the way down the centre of the carriage,” said Professor Christopher Ford of the University of Cambridge’s Cavendish Laboratory, one of the paper’s co-authors. “If someone tries to get in a door, they have to push the people closest to them along a bit to make room. In turn, those people push slightly on their neighbours, and so on. A wave of compression passes down the carriage, at some speed related to how people interact with their neighbours, and that speed probably depends on how hard they were shoved by the person getting on the train. By measuring this speed, one could learn about the interactions.”

“The same is true for electrons in a quantum wire – they repel each other and cannot get past, so if one electron enters or leaves, it excites a compressive wave like the people in the train,” said the paper’s first author Dr Maria Moreno, also from the Cavendish Laboratory.

However, electrons have another characteristic, their angular momentum or ‘spin’, which also interacts with their neighbours. Spin can also set off a wave carrying energy along the wire, and this spin wave travels at a different speed to the charge wave. Measuring the wavelength of these waves as the energy is varied is called tunnelling spectroscopy. The separate spin and charge waves were detected experimentally by researchers from Harvard and Cambridge Universities.
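
In the simplest textbook picture of such a wire (a Luttinger-liquid sketch offered here for orientation, not the specific dispersions reported in the paper), the charge and spin waves disperse linearly but at different speeds,

\epsilon_{\mathrm{charge}}(k) \approx \hbar v_c |k|, \qquad \epsilon_{\mathrm{spin}}(k) \approx \hbar v_s |k|, \qquad v_c \neq v_s,

so electrons tunnelling into the wire trace out two distinct lines in the energy-momentum map, one for each type of wave.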

Now, in the paper published in Nature Communications, the Cambridge researchers have gone one stage further, to test the latest predictions of what should happen at high energies, where the original theory breaks down.

A flurry of theoretical activity in the past decade has led to new predictions of other ways of exciting waves among the electrons — it’s as if the person entering the train pushes so hard some people fall over and knock into others much further down the carriage. These new ‘modes’ are weaker than the spin and charge waves and so are harder to detect.

The collaborators of the Cambridge researchers from the University of Birmingham predicted that there would be a hierarchy of modes corresponding to the variety of ways in which the interactions can affect the quantum-mechanical particles, and the weaker modes should be strongest in very short wires.

To make a set of such short wires, the Cambridge group set about devising a way of making contact to a set of 6000 narrow strips of metal that are used to create the quantum wires from the semiconducting material gallium arsenide (GaAs). This required an extra layer of metal in the shape of bridges between the strips.

By varying the magnetic field and voltage, the tunnelling from the wires to an adjacent sheet of electrons could be mapped out, and this revealed evidence for the extra curves predicted, one of which can be seen as an upside-down replica of the spin curve.

These results will now be applied to better understand and control the behaviour of electrons in the building blocks of a quantum computer.

Reference:
Moreno et al. ‘Nonlinear spectra of spinons and holons in short GaAs quantum wires.’ Nature Communications (2016). DOI: 10.1038/ncomms12784

Researchers have observed quantum effects in electrons by squeezing them into one-dimensional ‘quantum wires’ and observing the interactions between them. The results could be used to aid in the development of quantum technologies, including quantum computing. 

Regime of a single 1D wire subband filled


South Asian patients have worse experiences of GP interactions, study suggests


Patients’ evaluations of doctors’ interpersonal skills are used to assess quality of care. In both the UK and the US, certain minority ethnic groups report lower patient experience scores compared to the majority population. For example, the English General Practice Patient Survey found that South Asian groups report particularly low scores compared to the White British majority, with Bangladeshi and Pakistani groups providing the lowest scores.

Several potential explanations have been proposed for these lower ratings.  These mainly relate to whether South Asian patients receive lower quality care, or whether they receive similar care, but rate this more negatively.

To explore whether the low scores reflect a genuinely poor experience, researchers at the Centre for Health Services Research, University of Cambridge, showed 564 White British and 564 Pakistani adults a series of films showing typical clinical scenarios. They were asked to rate how good the GP was on various measures: giving sufficient time and listening to the patient in the film, explaining the tests and treatment, involving the patient in decisions about their care, and treating them with care and concern.

Based on the participants’ responses, the researchers then gave a score out of 100 for how positively the participants had judged the GP’s performance in the vignettes. The results of the study, funded by the National Institute for Health Research, are published in the journal BMJ Open.

The scores from Pakistani participants were typically higher than those from White British participants who had seen the same video. The mean communication score from Pakistani participants was 67 out of 100, ten points higher than the mean score from White British participants. When adjusted for age, gender, deprivation, self-rated health and video, the difference increased to 11 points. The largest differences were seen among participants over 55 years old.
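
The “adjusted” figure corresponds to fitting a regression of the communication score on participant group plus the listed covariates. A minimal sketch of that kind of adjustment, using simulated data and invented variable names rather than the study’s dataset, might look like this:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the vignette data; only the structure is meaningful.
rng = np.random.default_rng(1)
n = 1128
df = pd.DataFrame({
    "group": rng.choice(["White British", "Pakistani"], size=n),
    "age": rng.normal(50, 15, size=n),
    "female": rng.integers(0, 2, size=n),
})
df["score"] = (57 + 10 * (df["group"] == "Pakistani")
               + 0.05 * (df["age"] - 50) + rng.normal(0, 12, size=n))

# Adjusted group difference: the coefficient on the Pakistani indicator.
fit = smf.ols("score ~ C(group, Treatment('White British')) + age + female", data=df).fit()
print(fit.params.filter(like="Pakistani"))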

“Given that Pakistani adults tend to have a more positive take on the same vignettes viewed by their White British counterparts, we can only conclude that the low scores they give in national surveys do genuinely reflect worse care,” says Dr Jenni Burt from the Cambridge Centre for Health Services Research at the University of Cambridge.

“To some extent, this may reflect challenges arising from language barriers and poorer health literacy, but this is unlikely to explain all of the variations in care. These findings very clearly show that there are major inequalities in care for minority ethnic groups.”

Professor Martin Roland, Emeritus Professor of Health Services Research at the University of Cambridge, adds: “Understanding why minority ethnic groups often give poorer evaluations of care is critical to helping health services improve the services they offer to their patients. We need more research now that focuses on how factors such as language barriers, health literacy, discrimination and system-level failures combine to create inequalities that affect South Asian people.”

Reference
Burt, J et al. Understanding negative feedback from South Asian patients: experimental vignette study. BMJ Open; 8 Sept 2016; DOI: 10.1136/bmjopen-2016-011256

Communication between doctors and South Asian patients is poor, according to national GP surveys, but a question has been raised about whether this reflects genuinely worse experiences or differences in responding to questionnaires. Now, a new study led by researchers at the University of Cambridge has shown that it is in fact the former – South Asian patients do experience poorer communication with their GP than the White British majority.

Given that Pakistani adults tend to have a more positive take on the same vignettes viewed by their White British counterparts, we can only conclude that the low scores they give in national surveys do genuinely reflect worse care
Jenni Burt
Stethoscope


Lines of Thought: Telling the Story of History


Since March, some of the world’s most valuable books and manuscripts have been on display as Cambridge University Library celebrates its 600th birthday. This fortnight is the last chance to see this once-in-a-lifetime free exhibition of its greatest treasures.

The objects in Lines of Thought: Discoveries that Changed the World, which will close to the public on September 30, communicate 4,000 years of human thought through the Library’s unique and irreplaceable collections. More than 70 per cent of the exhibits are displayed to the public for the first time.

The exhibition investigates through six distinct themes how Cambridge University Library’s eight million books and manuscripts have transformed our understanding of life here on earth and our place among the stars.

In Telling the Story of History, curator Dr John Wells traces the way in which literature has treated the monarchs and heroes of history.

Long before the development of evidence-based history, this was done through story-telling. Stories that elaborate on myths, legends and folk memories accumulate down the years, connecting successive ages with their past, and influencing writers of the present. In the Western European tradition fables inherited from classical antiquity have been passed down the centuries to inspire countless reinventions and retellings. Themes and characters from Homer’s Odyssey, for example, surface again and again in literature, from James Joyce’s Ulysses to Margaret Drabble’s novel The gates of ivory.

"Homer stands at the head of the Western European tradition of narrative, and there are no epics older than the Homeric epics – the influence that these texts have had is really quite incalculable," says Wells.

The plays of William Shakespeare, gathered here in the ‘First Folio’ of 1623, are a high-water mark of imaginative literature. Their fictional depiction of real people and real events, such as Henry V at the Battle of Agincourt, can shape our understanding of historical events.

"The 'First Folio' of Shakespeare, the collection of his plays which was published soon after his death by his friends John Heminges and Henry Condell, collects many plays which never saw print in Shakespeare's lifetime," explains Wells. "If it hadn't been for the work of Heminges and Condell, so many plays which are at the peak of English literary tradition would simply not be known to us."

"Shakespeare's views and interpretations of his characters really have affected the way in which we now think of historical figures," says Wells.

Fantastical fictional writings such as Dante’s Divine comedy also draw on figures of the past for their protagonists, or use allusion to pass on subtle messages. Folk stories, whether written on ancient papyri or in a modern novel, weave their way through Cambridge University Library’s collections and through our collective imaginations.

"Our line of thought, which we began with a papyrus fragment of Homer, leads right to the end of the 20th century now with Cambridge-educated novelist Margaret Drabble," says Wells.

"In her novel The gates of ivory, Drabble sets her characters against the great sweep of history, and in particular the revolution in Cambodia in the 1970s. The University Library is actively acquiring the archives of literary authors because we know that they are going to be subjects of study in the years to come - the notes and drafts which are accumulated are sources of scholarship in their own right."

Inset image: Homer, Fragments of the Odyssey, XII, ll. 250–304, Second century CE.

Shakespeare's 'First Folio', Dante's Divine Comedy, and fragments of Homer's Odyssey from the second century CE, are among the objects in our final film celebrating Lines of Thought at Cambridge University Library.

The influence that these texts have had is really quite incalculable
John Wells
Mr. William Shakespeares comedies, histories, & tragedies: published according to the true originall copies (the ‘First Folio’) London: printed by Isaac Jaggard and Edward Blount, 1623
