Articles on this Page
- 01/13/17--08:16: Call of duty: fighting Ebola in Sierra Leone
- 01/13/17--08:00: Opinion: How we can make super-fast hyperloop travel a reality
- 01/17/17--02:58: Opinion: Four ways to understand Theresa May’s Hard Brexit Speech
- 01/17/17--06:51: Opinion: The Full Brexit
- 01/18/17--00:58: Opinion: Brexit and workers' rights
- 01/18/17--00:35: How bright is your digital future?
- 01/18/17--03:30: Graduate, get a job … make a difference #3
- 01/19/17--02:00: Graphene’s sleeping superconductivity awakens
- 01/19/17--04:47: Darwin Lectures go to extremes
- 01/19/17--06:42: Solar storms could cost USA tens of billions of dollars
- 01/20/17--07:03: Opinion: Mozambique's unexpected truce still hangs in the balance
- 01/23/17--00:53: Opinion: How dangerous is burnt toast?
On the windowsill of Professor Ian Goodfellow’s office sit photographs of him with his children, and just down the corridor, his wife is carrying out research in the same department. Even at work, he is surrounded by constant reminders of the special things in his life, providing a sense of security.
His work, too – apart from the treadmill of seeking funding – is a secure, safe environment. Goodfellow is a basic scientist, carrying out lab-based studies into viruses such as norovirus, the winter vomiting virus. He doesn’t even come into contact with norovirus patients, so is at no particular risk of contracting this unpleasant, but relatively harmless, infection.
Yet in December 2014, Goodfellow chose to leave all of this security behind – for several months at a time – to join a taskforce fighting one of the most hazardous and frightening emerging infections of recent times, the Ebola outbreak in Sierra Leone. Between the start of the epidemic in West Africa in 2013 and its being declared over in March 2016, the virus infected more than 28,000 people and killed over 11,000.
Goodfellow was one of over 30 people from Cambridge, coordinated by Dr Tim Brooks at Public Health England, who lent their support. Goodfellow helped set up one of the first diagnostic laboratories in an Ebola Treatment Centre near Makeni, in northern Sierra Leone, with support from the UK government. This was physically demanding and at times potentially dangerous work. “We had to move several tons of equipment and reagents by hand, in 35°C heat with over 90% humidity on a rather dangerous and very active building site,” he recalls. During their stay they encountered fires, electric shocks, and one of his own postdocs was bitten by both a spider and a snake.
Since the start of the epidemic, Goodfellow and colleagues have sequenced over 600 Ebola genomes, helping provide information about how the virus is evolving in, and how its evolution has been affected by, unprecedented levels of human–human transmission.
Towards the tail end of the epidemic, sequencing allowed researchers to trace the origin of new cases. “To end the epidemic, you need to make sure that any new cases are in transmission chains that are being monitored and are geographically contained, so you can pinpoint where this virus is coming from.”
The Tonkolili District, for example, had been Ebola-free for several months when a new case occurred. “We needed to know if this new case had come from a new introduction from an animal host, from a neighbouring country, or if it was part of a chain of transmission that had been hidden from the healthcare providers.
There’s a lot of stigma around Ebola, so it was possible there was a whole cluster in a village and that no-one was reporting the cases. That would be a disaster: all of a sudden, you don’t go from one to two cases, you go from one to tens or even hundreds.”
By sequencing the virus, in a very short time they were able to trace the source back to a survivor in whom the virus had persisted, and to take appropriate measures to prevent further spread. In fact, their work showed that Ebola can persist in survivors for over 15 months after infection and be transmitted through unprotected sex, and possibly even from a mother to her child through breastmilk.
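The tracing step can be caricatured in a few lines: the genome from a new case is compared against genomes from known transmission chains, and the closest match (fewest mutations) suggests the likeliest source. This is only an illustrative sketch – the sequences, chain names and simple Hamming distance below are stand-ins for the full phylogenetic analysis the team actually performed on genomes of roughly 19,000 bases.

```python
# Toy nearest-neighbour source tracing. Real outbreak genomics uses
# maximum-likelihood phylogenetics on full Ebola genomes; this sketch
# only shows the underlying idea of matching a new case to the
# genetically closest known case.

def hamming(a: str, b: str) -> int:
    """Number of positions at which two aligned sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b))

def likeliest_source(new_genome: str, known: dict[str, str]) -> tuple[str, int]:
    """Return (chain_id, distance) of the known genome closest to the new case."""
    return min(((cid, hamming(new_genome, g)) for cid, g in known.items()),
               key=lambda t: t[1])

# Hypothetical, heavily shortened sequences for illustration only.
known_chains = {
    "chain_A_2015": "ACGTACGTAC",
    "chain_B_2015": "ACGTTCGGAC",
    "survivor_X":   "ACGAACGTAC",
}
new_case = "ACGAACGTAT"

print(likeliest_source(new_case, known_chains))  # closest known genome wins
```

In this made-up example the new case sits one mutation from the "survivor" genome but two or more from the active chains, mirroring the kind of result that pointed the team towards persistent infection in a survivor.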
Now that the emergency has passed, the treatment centre has closed down, but its equipment is being used at the University of Makeni Infectious Disease Research Laboratory in a building donated by the country’s president, Ernest Bai Koroma. The laboratory was kitted out with support from the Wellcome Trust and the Cambridge-Africa Programme, and now functions as a base for local and visiting scientists to carry out research. Goodfellow and his postdoc Dr Luke Meredith have helped train local technicians and researchers in some of the latest techniques in surveillance and sequencing of pathogens such as HIV and hepatitis B.
“We need to avoid ‘parachute science’, where scientists fly in, take samples and leave,” he insists. “It should be about developing sustainable partnerships, about developing local capacity. With training and support, local researchers have the ability to respond to these outbreaks; they just need the equipment and the infrastructure.”
This has already shown its value. A new case arose in January 2016 while neither Goodfellow nor any of his colleagues were in the country, but local scientists were able to use the techniques to trace the source of the infection.
Going to Sierra Leone was not an easy decision for Goodfellow, but he feels that he had a duty to respond. “The academic virology community had a responsibility to offer support. We couldn’t just sit back and watch this massive epidemic explode in front of our eyes with the knowledge that we have skills that could be useful.”
Many of the scientists who went out have struggled to return to their normal work, he says – some even quit their jobs on returning to take up more front-line roles or to undertake more translational research. For Goodfellow, it has certainly made him appreciate the contribution that basic science makes.
“Basic science can often feel removed from real world applications,” he says, “but the skills you gain from running a laboratory are actually very useful in these kinds of environments. The ability to think on your feet and to figure out solutions is invaluable.”
It has also given him some perspective about what he does. “The satisfaction you get from being involved in a response like this and in capacity building is orders of magnitude better than publishing academic papers.”
Cambridge graduate Charlotte Dixon (Churchill), BA (2014) Modern and Medieval Languages, was also part of the Ebola crisis response in Sierra Leone in 2015 while working with the Department for International Development on their Graduate Scheme.
Working in a lab as a basic scientist can often seem far removed from the real world. A year since the World Health Organization declared the Ebola outbreak over, one researcher tells how the skills he learned working in a lab in Cambridge turned out to be surprisingly useful in fighting one of the most terrifying disease outbreaks of recent times.
Dr Caroline Trotter works on an infectious disease that has killed even more than Ebola. It occurs periodically right across the ‘midriff’ of Africa from Senegal to Ethiopia, in the so-called ‘Meningitis Belt’.
In the last major meningitis outbreak, in 1996, some 250,000 were infected and 25,000 people died. It was at this point that the global health community came together to fight back.
The Meningitis Vaccine Project (MVP) was launched, a partnership between international health organisation PATH and the World Health Organization (WHO). Working with the Serum Institute of India, MVP developed and rolled out the meningococcal A conjugate vaccine in just 10 years to combat the particular strain that affected the African belt. Since its introduction in 2010, 265 million people have been vaccinated. In Burkina Faso, where the vaccine was first used, a mass vaccination campaign saw 10 million people vaccinated in 10 days.
But even campaigns as huge as this aren’t enough to eliminate the infection, as Dr Caroline Trotter from the Department of Veterinary Medicine explains: “You get a honeymoon period, but then you see a resurgence of cases.”
Trotter and her team used mathematical modelling to predict the best strategies for ensuring that this did not happen. The WHO, which funded her work, used it to shape their guidelines and ensure that the vaccine was introduced into routine vaccination programmes across sub-Saharan Africa.
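The “honeymoon then resurgence” dynamic can be sketched with a toy calculation – this is emphatically not Trotter’s model, and the birth-rate and threshold figures are invented – showing why a one-off mass campaign fades as unvaccinated newborns replenish the susceptible pool, while routine infant immunisation pushes the danger point back by decades:

```python
# Deliberately minimal sketch of population-level immunity after a
# vaccination campaign. Assumed, made-up parameters: each year a
# fraction of the population is born susceptible, and outbreaks become
# likely once the susceptible fraction crosses a threshold.

BIRTH_RATE = 0.04   # assumed fraction of population born each year
THRESHOLD = 0.20    # assumed susceptible fraction above which outbreaks occur

def years_until_resurgence(routine_coverage: float, s0: float = 0.05) -> float:
    """Years after a campaign until susceptibility exceeds THRESHOLD.

    routine_coverage: fraction of each birth cohort vaccinated routinely.
    Returns float('inf') if the threshold is not crossed within 100 years.
    """
    s = s0  # susceptible fraction left immediately after the campaign
    for year in range(1, 101):
        s += BIRTH_RATE * (1.0 - routine_coverage)  # unvaccinated newborns
        if s > THRESHOLD:
            return year
    return float("inf")

print(years_until_resurgence(0.0))  # campaign only: resurgence within years
print(years_until_resurgence(0.9))  # routine immunisation: decades later
```

Even this crude model reproduces the qualitative message of the real work: without routine immunisation the honeymoon ends quickly, whereas high routine coverage keeps the susceptible pool suppressed far longer.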
She and Goodfellow were part of a Cambridge-Africa delegation to The Gambia in 2014 – a trip that inspired Goodfellow to lend support to combatting Ebola – and as a result Trotter is now working with collaborators at the Medical Research Council Unit in the country to look at the effect of the vaccine on pregnant women and their babies.
Meanwhile, she continues working with the African Meningococcal Carriage Consortium, a global research effort to study how meningococcal meningitis is spread in Africa, with the hope of gradually tightening the belt on this devastating disease.
Across Europe and parts of Asia, travellers can enjoy some of the fastest rail services in the world. From Málaga to Madrid, Tokyo to Osaka, high-speed electric trains condense the travel times between major hubs by racing along at some 300kph. The fastest commercial service in the world is the Shanghai maglev – short for magnetic levitation, the method of propulsion it uses to glide along its tracks as rapidly as 430kph.
Of course, air travel is still much faster: an Airbus A380 aircraft has a cruising speed of over 1,000kph. But at a time when reducing emissions is a top priority across the globe, there’s an urgent demand for cleaner, more energy-efficient alternatives – especially in the US, which is by far the world’s biggest user of air travel, with almost 800m passengers each year. Enter the Hyperloop – a train-like technology which has the potential to match air travel for speed.
Hyperloop is the brainchild of US business magnate Elon Musk. First proposed in 2013, the Hyperloop system consists of “pods”, which are suspended inside a tube by magnetic levitation and propelled using a linear electric motor. The environment inside the tube is almost a complete vacuum, allowing the pods to travel at great speeds without being slowed by air resistance. The tubes themselves can be placed underground, or run above ground, elevated by columns.
The race begins
Musk originally intended the Hyperloop to cover the 600km route from Los Angeles to San Francisco at an average speed of about 960kph, reducing what’s currently a 12-hour train journey to just 35 minutes. Although funding has since been channelled into a bullet train service for this route, the idea of the hyperloop has attracted interest elsewhere.
The wealthy city-state of Dubai has agreed to conduct a feasibility study for a 150km link with Abu Dhabi. There’s also a proposal to connect Vienna with Budapest and Bratislava. And US start-up Hyperloop One recently announced a shortlist of 35 potential hyperloop test projects, which included proposals for routes linking Sydney with Melbourne, London with Edinburgh and Mumbai with Delhi.
While these developments have sparked much excitement, some remain sceptical about whether they can work in the real world.
Too fast to function?
Hyperloop pods are designed to reach their top speed of 1,220kph (slightly less than the speed of sound) in about 70 seconds, when accelerating at 0.5G (one “G” being the acceleration due to gravity, the usual yardstick for measuring acceleration).
To put this in context, at 1G we are pushed into the back of our seat with a force equal to our body weight – it would be uncomfortable. But the acceleration of an aircraft during takeoff is typically around 0.4G, and most people are happy with that.
We also experience G-forces when we go around a curve. This “centrifugal force” is what flings you from side to side on fairground rides. Again, about 0.5G is the limit for comfort. Travelling at speeds of 1,220kph sets the minimum curve radius to about 23km, which means that the track has to be pretty straight. It must be very level, too, because vertical hills and bumps also give rise to G-forces.
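These figures are easy to check with the standard kinematics relations v = a·t for the acceleration time and a = v²/r (so r = v²/a) for the centripetal acceleration on a curve. The short sketch below reproduces the quoted numbers:

```python
# Back-of-envelope check of the hyperloop figures quoted above.

G = 9.81                 # gravitational acceleration, m/s^2
top_speed = 1220 / 3.6   # 1,220 km/h converted to m/s
accel = 0.5 * G          # 0.5G comfort limit, m/s^2

time_to_top_speed = top_speed / accel      # t = v / a, in seconds
min_curve_radius = top_speed ** 2 / accel  # r = v^2 / a, in metres

print(f"time to top speed: {time_to_top_speed:.0f} s")            # ≈ 69 s
print(f"minimum curve radius: {min_curve_radius / 1000:.0f} km")  # ≈ 23 km
```

The result matches the article’s figures: roughly 70 seconds to top speed, and a minimum curve radius of about 23km at 0.5G.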
With the right site, these constraints could be manageable. The real challenge for hyperloop will be dealing with earth movements. In all large-scale engineering, allowances are made for thermal expansion, ground water and seismic activity – things that make the ground shift around. Normally, these aren’t too much of a problem. There are expansion joints in bridges and pavements, and even when subsidence causes cracks to appear in a wall, we shrug our shoulders and say “so what?”.
But movement in the hyperloop track could cause real problems, when the pods are travelling at such high speeds. That’s why Musk favours a track on columns, so that it can be adjusted and realigned in the event of ground movement. Indeed, we already do this kind of realignment with conventional railway tracks: the rails on sleepers are loosely supported on ballast and regular “tamping” ensures that the track is kept straight.
With such demanding specifications, actually constructing a hyperloop will not be cheap. But the days of aircraft and ships are numbered, unless we can find a way to power them with electricity or hydrogen fuel. Perhaps we could even learn to live with nuclear-powered ships. Hyperloop offers a novel vision of the future of long-distance travel – one that might just catch on.
Trains are getting ever faster, but as Professor Hugh Hunt from the Department of Engineering explains, the 'super-fast hyperloop' could soon see them matching air travel for speed.
Handwritten letters, in a digital world, are increasingly rare. But, on 18 November 2016, John sat down to write to his friend Jakub. His message begins in capitals: “YES, JAKUB” and goes on to congratulate Jakub on the latest developments in his career. He writes: “I now consider myself your friend, who is so proud of you.”
John’s words are inscribed in biro on lined paper: the notepaper of Her Majesty’s Prison Service. Writer and recipient of this letter could hardly be more different. A former addict, John is serving a lengthy sentence at HM Prison Grendon in Buckinghamshire. Thousands of miles away, Jakub is starting a PhD in criminology in the Czech Republic while working for the Constitutional Court in Prague. With a Masters in criminology from Cambridge University, his future looks bright.
Jakub and John are just two of more than 100 people who have been brought together by an ambitious scheme run by academics at Cambridge’s Institute of Criminology. Taught in prisons, Learning Together gives university students and prisoners the chance to study alongside each other. They sit in the same classrooms, engage with the same topics, and carry out the same assignments.
Learning Together was piloted at HMP Grendon in 2015. An eight-week criminology course was taken by 24 learners, half of them graduate students and half of them prisoners. The programme is now expanding to other prisons and subject areas. Its remarkable success stems from the passionate belief of its creators – criminologists Drs Ruth Armstrong and Amy Ludlow – in the power of education to capacitate, unlock potential and transform society for the better.
This term, prisoners at Grendon have the opportunity to sign up for a course in literary criticism led by Dr Stacey McDowell from Cambridge’s Faculty of English. Meanwhile, prisoners at HM Prison Whitemoor in Cambridgeshire are offered a course on ‘The Good Life and the Good Society’ run by Drs Ryan Williams (Centre of Islamic Studies) and Elizabeth Phillips (Divinity Faculty).
Religious, political and social differences are high on the public agenda, yet theological and religious education is often taught in a way that’s disconnected from the real world. Williams suggests that this gap between theoretic and real-life perspectives represents a valuable opportunity. “While carrying out my research, I observed that people are guided on a daily basis by ethical and theological questions of what constitutes the ‘good’,” he says.
“Our course finds a middle ground, and provides a chance for students to sharpen their own understanding of what is right and ‘good’ in their own life and in society by having meaningful contact with, and learning alongside, people from a diversity of backgrounds. Yes, we’re taking a risk in that we're exploring questions of difference often seen as sources of conflict, but we believe it’s a crucial one to take.”
Universities and prisons might seem poles apart but both communities set out to transform lives for the benefit of society. “While teaching on access-to-university courses, aimed at students from less advantaged backgrounds, we realised that the students we were meeting had a lot in common with the prisoners we’d encountered in the course of our research,” say Armstrong and Ludlow.
“Many came from similar backgrounds and had been brought up on similar streets. The access students tended to have punitive views of people who commit crime – while many prisoners thought they had nothing in common with ‘clever’ people who were destined for university. We saw the same potential brimming in many of them.”
Teaching in prisons is nothing new. However, Learning Together has a broader objective. It sets out to create enduring ‘communities of learning’ in which students from universities and prisons realise how much they have to learn from, and with, each other.
The shared endeavour of structured learning forges friendships and shatters stereotypes. As a prison-based Learning Together student called Adam put it in an article about his experiences: “I had my fears about the course. Will I be judged? Will I be up to it socially? Can I really learn with Cambridge students without looking stupid?”
Adam found the learning environment to be “inclusive and enabling” and wrote that “my confidence has soared and I come out of each session buzzing with new knowledge, new friendships and knowing that I’ve contributed way more than I thought I could”. Since completing the course he has won a scholarship that will enable him to take a Masters in English Literature. He has also trained as a mentor for Learning Together students.
Many prisoners have negative experiences of school and gain few formal qualifications. For their part, many university students have relatively narrow life experiences. “Going into a prison, I expected to find immaturity,” said one Cambridge student in a film made by prisoners at HMP Springhill, another prison involved in the project. “Instead, I discovered that I was the immature one.”
At the heart of Learning Together is an approach described by Armstrong and Ludlow as ‘dialogical learning’ – learning through dialogue with fellow students and teachers in an environment of trust. In a blog for an online magazine, a prisoner at Grendon called Anthony shares his thoughts about the liberating nature of this approach.
Anthony writes: “Every session … gave me the feeling that I had been free for a few hours, although not free in the sense that I had been outside the prison, but free in a deeper sense. I could be a better version of myself, which my incarceration, past and fears did not dictate to and smother. It was warmth, compassion and the exchange of ideas – alongside the acceptance of others – that created this released version of me.”
If you are interested in learning more about how your university or department could get involved in working in partnership with a local prison, please contact Ruth Armstrong and Amy Ludlow on email@example.com
A pioneering project to teach university students alongside prisoners, so that they learn from each other, has proved remarkably successful. The creators of Learning Together, Drs Ruth Armstrong and Amy Ludlow, are now expanding the scheme and seeking to widen participation across university departments.
Profanity is obscene language which, in some social settings, is considered inappropriate and unacceptable. It often refers to language that contains sexual references, blasphemy or other vulgar terms. It’s usually related to the expression of emotions such as anger, frustration or surprise. But profanity can also be used to entertain and win over audiences.
There are conflicting attitudes to profanity, and its social impact has changed over the decades. In 1939, Clark Gable’s utterance of the memorable line “Frankly, my dear, I don’t give a damn” in the film Gone with the Wind was enough to land the producers a $5,000 fine. Nowadays our movies, TV shows and books are peppered with profane words and, for the most part, we are more tolerant of them.
As dishonesty and profanity are both considered deviant, they are often viewed as evidence of low moral standards. On the other hand, profanity can be positively associated with honesty: it is often used to express unfiltered feelings and sincerity. The researchers cite the example of President-elect Donald Trump, who used swear words in some of his speeches while campaigning in last year’s US election and was considered, by some, to be more genuine than his rivals.
Dr David Stillwell, a lecturer in Big Data Analytics at the University of Cambridge, and a co-author on the paper, says: “The relationship between profanity and dishonesty is a tricky one. Swearing is often inappropriate but it can also be evidence that someone is telling you their honest opinion. Just as they aren’t filtering their language to be more palatable, they’re also not filtering their views.”
The international team of researchers set out to gauge people’s views about this sort of language in a series of questionnaires which included interactions with social media users.
In the first questionnaire 276 participants were asked to list their most commonly used and favourite swear words. They were also asked to rate their reasons for using these words and then took part in a lie test to determine whether they were being truthful or simply responding in the way they thought was socially acceptable. Those who wrote down a higher number of curse words were less likely to be lying.
A second survey involved collecting data from 75,000 Facebook users to measure their use of swear words in their online social interactions. The research found that those who used more profanity were also more likely to use language patterns that have been shown in previous research to be related to honesty, such as using pronouns like “I” and “me”. The Facebook users were recruited from across the United States and their responses highlight the differing views to profanity that exist between different geographical areas. For example, those in the north-eastern states (such as Connecticut, Delaware, New Jersey and New York) were more likely to swear whereas people were less likely to in the southern states (South Carolina, Arkansas, Tennessee and Mississippi).
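A toy version of this kind of text analysis – with tiny placeholder word lists standing in for the dictionary-based measures (such as LIWC-style categories) that studies like this typically rely on – might look as follows:

```python
# Illustrative sketch: per-word rates of profanity and first-person
# singular pronouns in a post. The word lists below are placeholders,
# not the dictionaries used in the actual study.
import re

PROFANITY = {"damn", "hell"}             # placeholder profanity lexicon
FIRST_PERSON = {"i", "me", "my", "mine"}  # first-person singular pronouns

def text_features(post: str) -> dict[str, float]:
    """Rates (per word) of profanity and first-person pronouns in a post."""
    words = re.findall(r"[a-z']+", post.lower())
    n = len(words) or 1  # avoid division by zero on empty posts
    return {
        "profanity_rate": sum(w in PROFANITY for w in words) / n,
        "first_person_rate": sum(w in FIRST_PERSON for w in words) / n,
    }

print(text_features("Frankly, I don't give a damn"))
```

Aggregating rates like these across a user’s posts gives the per-user profanity and pronoun-use measures whose correlation the study examined.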
Gilad Feldman et al., “Frankly, we do give a damn: The relationship between profanity and honesty”, Social Psychological and Personality Science. DOI: 10.1177/1948550616681055
It’s long been associated with anger and coarseness, but profanity can have another, more positive connotation. Psychologists have found that people who frequently curse tend to be more honest. Writing in the journal Social Psychological and Personality Science, a team of researchers from the Netherlands, the UK, the USA and Hong Kong report that people who use profanity are less likely to be associated with lying and deception.
A naturally-occurring compound has been found to block a molecular process thought to underlie Parkinson’s Disease, and to suppress its toxic products, scientists have reported.
The findings, although only preliminary, suggest that the compound, called squalamine, could be exploited in various ways as the basis of a potential treatment for Parkinson’s Disease. The compound has previously been used in clinical trials for cancer and eye conditions in the United States, and a trial in Parkinson’s Disease patients is now being planned by one of the researchers involved in the study.
Squalamine is a steroid which was discovered in the 1990s in dogfish sharks. It is, however, impossible to derive any medical benefits from shark tissue, and the form used by scientists is a safer and more reliable synthetic analogue. To date, it has been extensively investigated as a potential anti-infective and anticancer therapy.
But in the new study, researchers discovered that squalamine also dramatically inhibits the early formation of toxic aggregates of the protein alpha-synuclein – a process thought to start a chain reaction of molecular events eventually leading to Parkinson’s Disease. Remarkably, they also then found that it can suppress the toxicity of these poisonous particles.
The researchers tested squalamine in both cell cultures in the lab, and in an animal model using nematode worms. While their findings therefore only represent a step towards a treatment for Parkinson’s Disease in humans, they described the results as representing significant progress.
The study was led by academics from the Centre for Misfolding Diseases, based in the Chemistry Department at the University of Cambridge in the United Kingdom, and Georgetown University and the National Institutes of Health in the United States. Scientists from the Netherlands, Italy and Spain also played key roles. The findings are published in Proceedings of The National Academy of Sciences.
Professor Christopher Dobson, who is one of the authors and Master of St John’s College, as well as a Professor in the Chemistry Department at the University of Cambridge, said: “To our surprise, we found evidence that squalamine not only slows down the formation of the toxins associated with Parkinson’s Disease, but also makes them less toxic altogether.”
“If further tests prove to be successful, it is possible that a drug treating at least some of the symptoms of Parkinson’s Disease could be developed from squalamine. We might then be able to improve on that incrementally, by searching for better molecules that augment its effects.”
Professor Michele Vendruscolo, from the Department of Chemistry at the University of Cambridge and a co-author, said: “This is an encouraging step forward in our efforts to discover potential drugs against Parkinson’s Disease. Squalamine can prevent alpha-synuclein from malfunctioning, essentially by normalising its binding to lipid membranes. If there are going to be ways to beat the disease, it seems likely that this is one that may work.”
The study stemmed from research led by Dr Michael Zasloff, professor of surgery and pediatrics at Georgetown University School of Medicine in the USA. Zasloff, who also co-authored the latest study, discovered squalamine in 1993 and has since led extensive work exploring its potential as a treatment for conditions including cancer.
In the new study, the researchers explored squalamine’s capacity to displace alpha-synuclein from cell membranes – a phenomenon that was first observed in the laboratory headed by another co-author, Dr Ad Bax, in the National Institutes of Health in Bethesda, USA. This finding has significant implications for Parkinson’s Disease, because alpha-synuclein works by binding to the membranes of tiny, bubble-like structures called synaptic vesicles, which help to transfer neurotransmitters between neurons.
Under normal circumstances, the protein thus aids the effective flow of chemical signals, but in some instances, it malfunctions and instead begins to clump together, creating toxic particles harmful to brain cells. This clustering is the hallmark of Parkinson’s Disease.
The researchers carried out a series of experiments which analysed the interaction between squalamine, alpha-synuclein and lipid vesicles, building on earlier work from Cambridge scientists which showed the vital role that vesicles play in initiating the aggregation. They found that squalamine inhibits the aggregation of the protein by competing for binding sites on the surfaces of synthetic vesicles. By displacing the protein in this way, it significantly reduces the rate at which toxic particles form.
Further tests, carried out with human neuronal cells, then revealed another key factor – that squalamine also suppresses the toxicity of these particles.
Finally, the group tested the impact of squalamine in an animal model of Parkinson’s Disease, by using nematode worms genetically programmed to over-express alpha-synuclein in their muscle cells. As the worms develop, alpha-synuclein aggregation causes them to become paralysed, but squalamine prevented the paralysis from taking effect. “We could literally see that the oral treatment of squalamine did not allow alpha-synuclein to cluster, and prevented muscular paralysis inside the worms,” Zasloff said.
Together, the results imply that squalamine could be used as the basis of a treatment targeting at least some of the symptoms of Parkinson’s Disease. Zasloff says he is now planning a clinical trial with squalamine in Parkinson’s Disease patients in the US.
Further research is, however, needed to determine what the precise benefits of squalamine would be – and what form any resulting drug might take. In particular, it is not yet clear whether squalamine can reach the specific regions of the brain where the main molecular processes determining Parkinson’s Disease take place.
The researchers suggest that it would be particularly interesting to start investigating the efficacy of squalamine as a means to alleviate certain symptoms. If taken orally, for instance, the compound may perhaps relieve the severe constipation many patients experience, by targeting the gastrointestinal system and affecting alpha-synuclein in the gut.
It is also conceivable that a treatment of that sort could “cascade” signals to other parts of the body. “Targeting alpha-synuclein in the gut may perhaps in some cases be sufficient to delay the progress of other aspects of Parkinson’s Disease, at least for symptoms concerning the peripheral nervous system,” Vendruscolo said.
“In many ways squalamine gives us a lead rather than a definitive treatment,” Professor Dobson added. “Parkinson’s Disease has many symptoms and we hope that either this compound, or a derivative of it with a similar mechanism of action, could alleviate at least some of them.”
“One of the most exciting prospects is that, subject to further tests, we might be able to use it to make improvements to patients’ lives, while also studying other compounds with the aim of developing a more powerful treatment in the future.”
The paper, A natural product inhibits the initiation of α-synuclein aggregation and suppresses its toxicity, is published in Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1610586114
Squalamine, a natural product studied for its anticancer and anti-infective properties, could also lead to future treatments for Parkinson’s Disease.
Depression is one of the leading causes of disability worldwide. Symptoms such as difficulty concentrating or indecisiveness contribute to the disability associated with depression. Almost all patients with depression experience problems with concentration, memory, and attention. At least half of all patients with depression show cognitive deficits that can be measured objectively. These deficits tend to persist in the recovery phase. Patients with persistent cognitive problems have poorer outcomes such as impaired work functioning and increased risk for relapse. Depression is often a relapsing condition, returning periodically in episodes that can last several months at a time.
Depression is associated with taking time off work, but also, in some cases, with ‘presenteeism’ in the workplace, where employees may not be able to work as well as usual. People often feel distressed when they have difficulty achieving their previous level of work performance on return to work after experiencing depression.
However, currently available treatments do not specifically address cognitive deficits in depression. Recent reports have highlighted the importance of defining cognition as a target for treatment in depression.
In a study funded by the Medical Research Council (MRC) and Wellcome, researchers from the Department of Psychiatry and the Behavioural and Clinical Neuroscience Institute at the University of Cambridge investigated the potential of modafinil to treat cognitive dysfunction in depression. Modafinil has already been shown in other studies to have beneficial effects on cognitive function in psychiatric disorders such as schizophrenia.
Sixty patients aged between 18 and 65 years with remitted depression completed computerised memory, attention and planning tasks after receiving modafinil or a placebo. The results showed that patients given a dose of modafinil experienced improvements in memory functions, compared to those patients on placebo. Specifically, patients had benefits in two types of memory – episodic memory and working memory, both of which are important in our day-to-day activities.
“We use episodic memory when we are remembering where we left our keys in the house, or remembering where we parked our car,” explains Professor Barbara Sahakian, the study’s senior author. “Working memory, on the other hand, is the ability we use when we are rehearsing a new telephone number while we are trying to find a pen and paper to write it down, for example.”
The study demonstrated that patients receiving modafinil made fewer errors than those who received a placebo. For example, in one task, which involved remembering the location of a particular pattern among an increasing number of boxes, patients receiving modafinil made fewer than half as many mistakes at the most difficult level as those receiving the placebo.
“These results are very promising,” says lead author Dr Muzaffer Kaser from the Department of Psychiatry at the University of Cambridge. “GPs or psychiatrists often hear complaints of concentration or memory difficulties from patients with depression, but we are not good enough at treating these symptoms. Our study shows that modafinil may be a feasible option to tackle persistent cognitive problems in depression.”
It is not clear from the study whether the same effects would be seen over the long term, say the researchers. Professor Sahakian adds: “We now need a longer term study using modafinil to see if the drug, which improves cognition and motivation, can facilitate successful return to work following depression.”
Dr Kathryn Adcock, Head of Neurosciences and Mental Health at the MRC, added: “Preventing relapse is an integral part of any ongoing treatment strategy for depression, and some people can understandably feel hampered if they find it hard to get back to their previous capacity when they go back to work after experiencing depression. These results suggest there may be a way to help these people in their recovery from depression and that’s really encouraging.”
Kaser M, et al. Modafinil Improves Episodic Memory and Working Memory Cognition in Patients with Remitted Depression: A Double-Blind, Randomized, Placebo Controlled Study. Biological Psychiatry: Cognitive Neuroscience and Neuroimaging; 17 Jan 2017; DOI: 10.1016/j.bpsc.2016.11.009.
Modafinil, a drug used to treat narcolepsy – excessive daytime sleepiness – can improve memory in patients recovering from depression, according to new research from the University of Cambridge. The findings, published today in the journal Biological Psychiatry: CNNI, result from a randomised, double-blind, placebo-controlled study and offer hope of a treatment for some of the cognitive symptoms of depression.
Hard Brexit. Clean Brexit. Full Brexit. Naked Brexit. Whatever you want to call it, we now know what Brexit really means. Or do we? Theresa May's long awaited speech isn't quite as clear cut as it seems. There are, in fact, four very different ways to interpret the Prime Minister's carefully crafted words, underpinned by what could be seen as the four motivating factors behind what would now appear to be Britain’s hard Brexit stance: the economy, politics, society and the art of negotiation.
Firstly, it could be a matter of “it’s the economy, stupid". Economists are well known for predicting economic calamity as a result of the referendum decision - and the latest research suggests that, if anything, they were too optimistic: factoring in the adverse effects of reduced immigration doubles the economic losses. But there are those who argue that a clean break could bring economic benefits in the longer run. According to Shanker Singham, Chairman of the Special Trade Commission at the Legatum Institute, these benefits arise from the potential for Britain to become more global: to, for example, eradicate the agricultural protectionism associated with the Common Agricultural Policy, and to make independent trade deals with the wider world. According to Singham, this can only be achieved by a "full Brexit". Whilst I have myself identified a number of issues with this type of "globalising Brexiteer" story, it may be that it has nevertheless persuaded the Prime Minister.
Of course, even if such hoped-for economic benefits really are on offer, there is still a rather big fly in the ointment: the leap of faith that is required to read the Brexit vote as a vote for more - rather than less - globalisation. As I’ve argued elsewhere, Brexit supporters span the whole spectrum of economic views from the left-leaning to the free-market right. Global Britain might satisfy some but not all, leading us down a yellow brick road of disappointment. And, even if the majority really are in support of a more global Britain, Theresa May’s strategy seems to involve pressing the de-globalisation button alongside the button for globalisation. To the rest of the world, Britain is not leading the way with free trade, as it did centuries ago; it’s doing precisely the opposite.
Having been a Remainer herself, the Prime Minister is likely to be well aware of the possible economic fallout of Brexit (even if she won’t admit it). In that case, her speech is more a matter of "it's politics, stupid" than it is of "the economy, stupid". To date, Europe has made it clear that Britain can't have its cake and eat it, leaving May with a choice: remain in the single market for the benefit of the economy, thereby accepting freedom of movement, or aim instead for a little England, with control over migration, whatever the economic cost. May's speech will be widely interpreted as prioritising the issue of immigration whilst minimising the economic trade-off involved – or suggesting that we will be pedalling hard globally to do whatever we can to compensate for losses associated with reduced European trade. Even if that pedalling does get us somewhere, we cannot assume that more trade will mean a stronger economy. The reality is that it takes a stronger economy to create more trade, not vice versa.
Aside from the economic and the political interpretations, there is, however, a third interpretation of May's speech: that society is being brought in from the cold. Over the last hundred years, a battle has been raging between the state and the market. On the left and the right, politicians and economists have imagined the economic pie as being divided into two pieces: the slice that represents government and the slice that represents market activity. The idea of a direct trade-off between the two naturally follows - more of one must mean less of the other. However, many of the concerns expressed by voters, whether in the UK's Brexit vote or the US election, don't fit neatly into this two-slice division of the economic pie. That's because there is a third slice of the pie that economists and politicians have been neglecting and, here, it is a matter of "it's society, stupid".
Only by bringing society into our picture can we truly understand the Brexit (and Trump) vote: why, for example, so many Conservatives (or Republicans) who would normally be pro-market and anti-state are in favour of the state “regaining” control over immigration, and why some on the Left are in favour of remaining in what is a free-trade zone, despite not exactly being enamoured with free markets. Such apparent inconsistencies require us to think about people's fears and hopes for society, beyond either the frontiers of the market or the state.
Perhaps the Prime Minister understands that concerns about society are one of the key drivers of dissatisfaction amongst voters: that what we have experienced is not simply a backlash against markets and a desire for the state to do more, but, rightly or wrongly, worries about social breakdown. If that is the case, then her speech falls short. What we desperately need is a wider public debate that engages with two big questions: firstly, to what extent has society really deteriorated, and, secondly, to what extent have globalisation and the free market model – including free movement – really been bad for society?
However, this third interpretation might be giving the Prime Minister too much credit, which brings me to my fourth and, some might say, most likely interpretation of May's hard Brexit tactics: that they are nothing more than a negotiation strategy – a game of asking Europe for more than the government is truly happy to accept. If that's really the case, we need to take what May says with a pinch of salt. Perhaps we can’t infer much at all.
Economics first, politics first, society first, or the art of negotiation. There are four distinct ways to interpret May's hard Brexit stance. Between all four, you could be left wondering whether we really are any the wiser about Britain’s future. However, whatever happens, May’s image of a stronger, richer and more global Britain cannot yet be taken for granted.
Dr Victoria Bateman is a Fellow in Economics and economic historian, Gonville & Caius College
An economic historian offers her initial reaction to the Prime Minister's address
The Prime Minister’s much-anticipated speech on her Government’s objectives for the UK’s withdrawal from the European Union confirms what was increasingly likely to be the political direction of travel. The UK will not be seeking a relationship with the EU like that giving rise to the European Economic Area agreement between the EU and three EFTA states.
Indeed, it will not seek any type of ‘association agreement’ – helpful, given that such agreements require the unanimous consent of all Member States’ governments as well as ratification in all EU states. Instead, what the Prime Minister wants is something ‘bespoke’ and British, in tune with her central theme of building a truly ‘global Britain’.
What is noteworthy about the speech is that it purports to map an exciting new future for the UK that encompasses not just its future trading relationship with the EU, but also the Commonwealth, the Gulf states and – inspired by the recent words of President-elect Trump – the United States. And a stronger Britain is not to be at the expense of the EU, with the UK wanting the EU to be a success, just not with the UK as a member. All of which is remarkably resonant of UK policy in the 1950s, when the UK decided not to join the fledgling EEC because it sought the bigger prize of global trade rather than a compromise of regional economic integration. In some respects it is distinctly Churchillian: happy to let true Europeans forge an economic, and maybe even political, union just as long as the UK looks on rather than participates.
Important details remain to be settled, including what type of customs arrangements would reduce customs barriers while still permitting the UK to enter into its own free trade deals with non-EU countries. The type, scope and duration of any transitional arrangements seem likely to form a key strand of future negotiations.
The Prime Minister made clear that the final deal will be presented to both Houses of Parliament and will be voted upon. This engagement with Parliament is important in seeking to restore the authority of Parliament as the body to which government accounts. This is especially significant given calls by some for a second referendum to endorse the final deal. By rejecting another referendum, and by giving Parliament a vote at the end of the process, Theresa May is trying to bring domestic political institutions back to the centre of decision-making and, in so doing, to put the populist genie back in the referendum bottle.
What is also striking about the speech is Theresa May’s clear intention of steadying the ship with the iron grip of Unionism. Indeed, her speech started with the pledge to put ‘the preservation of our precious Union at the heart of everything we do’. So no special deal for Scotland and no differentiated Brexit. All that is on the menu is the Full British Brexit, complete with HP sauce and a solid 1950s Formica table.
The Director of Cambridge's Centre for European Legal Studies offers his initial reaction to the Prime Minister's address
Perhaps the best news for workers in Theresa May’s Brexit speech is that she committed herself to ensuring that ‘workers’ rights are fully protected and maintained’. She indicated this before in the speech she gave at the Conservative party conference in October. So the Working Time and Agency Work Regulations are safe - for now.
But, as always, the devil is in the detail (or lack of it).
'Indeed, under my leadership, not only will the government protect the rights of workers set out in European legislation, we will build on them. Because under this government, we will make sure legal protection for workers keeps pace with the changing labour market – and that the voices of workers are heard by the boards of publicly-listed companies for the first time.'
So the UK government will build on the rights in European legislation. Does that mean she will update UK rights derived from EU law in line with changes introduced by the EU? Will ECJ case law interpreting EU law continue to be applied, even if only as persuasive authority?
The reference to legal protection for workers keeping pace with ‘the changing labour market’ is more perplexing. On the one hand, the reference to workers’ voices on boards suggests a progressive direction (although this proposal has already run into difficulties), as does the subsequent reference to ‘Enhancing rights for workers’. But what if the economy goes into recession following Brexit? Would such changes in the labour market in fact mean deregulation of workers’ rights?
And then there is the threat. If the negotiations lead to a ‘punitive deal’ that ‘punishes Britain’, Theresa May said ‘no deal for Britain is better than a bad deal for Britain’, freeing the UK ‘to set the competitive tax rates and embrace the policies that would attract the world’s best companies and biggest investors to Britain’.
Her Chancellor was more explicit. In his interview with Die Welt, he said that ‘We are now objectively a European-style economy … with a social model that is recognizably the European social model that is recognizably in the mainstream of European norms, not U.S. norms’. He concluded: ‘I personally hope we will be able to remain in the mainstream of European economic and social thinking. But if we are forced to be something different, then we will have to become something different.’ Workers' rights may be less secure than first appears.
Catherine Barnard is Professor of European Union Law and Jean Monnet Chair of EU Law at the University of Cambridge. She is also Senior Tutor of Trinity College. She specializes in European Union law, labour and discrimination law, and competition law.
Cambridge's Professor of European Union Law offers her initial reaction to the Prime Minister's Brexit speech
The combination of new technologies, IT infrastructures and data analytics holds out an alluring possibility of a world in which the end-to-end supply chain is utterly transformed – highly connected, flexible, efficient, resilient and truly responsive to customer needs. Each of those attributes sounds attractively incremental but put them together and you have a completely new way of doing business and one in which customers are not just on the receiving end of a product or service but are central to it.
A good example of this is the pharmaceutical sector. As part of the REMEDIES project, we are working with the major players in the UK pharmaceutical supply chain to address some of the challenges they face, such as tackling the hundreds of days of inventory sitting in the supply chain and the vast quantities of waste caused by patients not taking the drugs they are prescribed.
Using digital technologies and data-rich systems to make the pharmaceutical supply chain much more efficient is one thing but we are also mapping an entirely new business model in which drugs can be manufactured to order – possibly at the local pharmacy. Not only would this meet a patient’s individual medical needs, but the consumption and effects of those drugs can be continuously monitored to help doctors better support their patients.
A brave new world, in other words, of personalised medicine enabled by digital manufacturing processes, digital infrastructures and lots of data. But realising this vision of a digital future remains elusive, particularly for the largest global businesses.
Many of these companies recognise the need to digitalise aspects of their supply chain, often in response to particular challenges. They may, for example, as in the pharmaceutical sector, have a pressing need to solve the intransigent inventory management issues that bedevil many supply chains. They may have an issue with quality and see digitalisation as the best way to ensure their products are of a consistently high quality and their provenance is traceable.
Or they may be losing competitive advantage through poor customer service and see a digital agenda as a way of regaining market share, possibly while supporting their ambitions to reduce environmental impact.
But developing an end-to-end digital supply chain involves a major transformation both at a conceptual level and in execution. And while thought leaders and change agents within big companies may see the prize, CEOs and shareholders will be much more cautious given the levels of investment and organisation-wide disruption it entails. This is particularly the case for the global giants with a history of merger and acquisition (M&A) and an array of legacy systems to integrate. Even without the complication of M&A, all large companies have to organise themselves into manageable structures, which have a natural tendency to turn into silos and hence become obstacles to organisational change.
There is also the wider question of a lack of digital skills and attitudes across the board – at senior and middle management levels as well as within day-to-day factory operations. Companies may be able to see the opportunity, acquire the technology and capture the data but a shortage of both skills and mindset presents a significant barrier.
One of the challenges with the digital supply chain vision is the sheer scale and ambition of it. At the Centre for International Manufacturing, we have begun to conceptualise what a digital supply chain might look like and break it down into key areas, to help companies understand the ways in which digitalisation can impact on their organisation. We have been doing this by talking to companies both individually and as a non-competitive group.
Having identified the key areas, we have been developing ‘maturity models’ against which companies can benchmark their current performance, identify where the greatest opportunities lie and start to think about where to prioritise their efforts.
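To make the benchmarking idea concrete, here is a minimal, purely illustrative sketch of how a maturity-model comparison might work in code. The area names, the 1–5 scale and the target profile below are hypothetical examples, not the Centre for International Manufacturing's actual framework:

```python
# Illustrative maturity-model benchmark. The areas and the 1 (ad hoc)
# to 5 (optimised) scale are hypothetical, for demonstration only.

AREAS = [
    "factory_design",
    "data_analytics",
    "system_connectivity",
    "last_mile_logistics",
]

def biggest_gaps(current, target):
    """Return areas sorted by the gap between target and current maturity,
    largest first - a simple way to see where to prioritise effort."""
    gaps = {a: target[a] - current[a] for a in AREAS}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Example benchmark: where a (fictional) company is today vs. its ambition.
current = {"factory_design": 2, "data_analytics": 3,
           "system_connectivity": 1, "last_mile_logistics": 2}
target = {"factory_design": 4, "data_analytics": 4,
          "system_connectivity": 4, "last_mile_logistics": 3}

for area, gap in biggest_gaps(current, target):
    print(f"{area}: gap of {gap} maturity level(s)")
```

In this toy profile the largest gap (system connectivity) surfaces first, which is exactly the kind of prioritisation signal a maturity model is meant to give.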
Factory design and production processes
Digital developments in factory design and production processes underpin the extended supply chain. The flexible factory is an important concept in this rapidly moving environment: how can you design and configure a factory for technologies which you don’t yet know? In this context, factories need to be modular and reconfigurable. One of the questions our framework helps companies consider is this: it is relatively straightforward to design a state-of-the-art, highly flexible plug-and-play factory – but is it cost-effective? Is it where companies will be able to create and capture most value?
Making the most of data
Some companies are already very good at gathering product and customer data but the challenge is how to integrate that data and use it to make better decisions about, for example, product lifecycle management, sales forecasting and designing products and services in response to customer needs. Data ownership is fast becoming an important issue in the supply chain and service delivery context. When partners are involved, who owns and can access the data is a critical question. Data sharing and connectivity also raise the question of open source versus ‘black box’ systems, and of developing common international data standards across sectors. In this area we must also consider the resilience of these digital supply chains and understand the cyber security challenges they may present.
Flexibility versus connectivity
One of the conceptual and practical challenges for organisations is whether to build monolithic, enterprise-wide systems that can connect supply chains. Clearly, for many companies – particularly those with a history of M&A – it would require a huge act of organisational will, not to mention significant investment, to move to a common platform. And, would doing so actually deliver a sufficiently flexible and reconfigurable solution? Instead, companies are talking about developing a ‘digital backbone’ that can interface with other systems to provide more networked and flexible approaches to optimising the end-to-end supply chain. And this digital backbone is more than an IT system – it should embody the critical touch points and interfaces between organisations as well as the data architectures and analytics. It also signifies a cultural shift to digital.
The last leg
Using web-based systems to fulfil orders and manage the complexity of last-mile logistics is something that we have seen business-to-consumer companies do with impressive levels of sophistication and achieve corresponding levels of competitive advantage. For many large manufacturers there is still work to be done in developing systems that can support product delivery to multiple points of sale and ultimately direct to the end customer. But the opportunities are clear and create a virtuous circle. By delivering better customer service you not only attract new customers (and retain the old ones) but you also get access to better customer data which in turn can improve both the product and the service you offer. There are also many efficiencies to be had from digitalising this last leg of the supply chain through better stock management and reduced transport costs.
Towards the digital supply chain
By breaking down the digital supply chain into distinct but connected scenarios against which companies can measure their performance and aspirations, we believe we have created a powerful framework that will help them develop their digital supply chain capabilities. The scenarios help to clarify thinking and develop a strategic approach to digitalisation which is both deliverable and will create maximum value for the company.
The next step is to put the strategy into action.
First published in IfM Review.
Dr Jag Srai, Head of Cambridge's Centre for International Manufacturing, and his team have developed a new way to help companies embrace the challenges and opportunities of digitalising the extended supply chain. Here, he provides a glimpse of this digital future.
The study, published in Lancet Psychiatry, found that 14-year-old adolescents who had contact with mental health services had a greater decrease in depressive symptoms than those with similar difficulties but without contact. By the age of 17, the odds of reporting clinical depression were more than seven times higher in individuals without contact than in service users who had been similarly depressed at baseline.
Researchers from the Department of Psychiatry recruited 1,238 14-year-old adolescents and their primary caregivers from secondary schools in Cambridgeshire, and followed them up at the age of 17. Their mental state and behaviour were assessed by trained researchers, while the teenagers self-reported their depressive symptoms. Of the participants, 126 (11%) had a current mental illness at the start of the study – and only 48 (38%) of these had had contact with mental health services in the year prior to recruitment.
Contact with mental health services appeared to be of such value that after three years the levels of depressive symptoms of service users with a mental disorder were similar to those of 996 unaffected individuals.
“Mental illness can be a terrible burden on individuals, but our study shows clearly that if we intervene at an early stage, we can see potentially dramatic improvements in adolescents’ symptoms of depression and reduce the risk that they go on to develop severe depressive illness,” says Sharon Neufeld, first author of the study and a research associate in the Department of Psychiatry.
The Cambridge study is believed to be the first study in adolescents to support the role of contact with mental health services in improving mental health by late adolescence. Previous studies have reported that mental health service use has provided little or no benefit to adolescents, but the researchers argue that this may be because the design of those studies did not consider whether service users had a mental disorder or not. The approach taken in this new study enabled it to come as close as possible to a randomised controlled trial, comparing statistically balanced groups of treated and untreated individuals with a mental disorder.
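One standard way to build statistically balanced treated and untreated comparison groups from observational data of this kind is propensity-score matching. The sketch below is a generic, self-contained illustration of that technique (it is not necessarily the published study's exact method; the toy data and tiny hand-rolled logistic regression are purely for demonstration):

```python
# Generic sketch of propensity-score matching: estimate each participant's
# probability of receiving treatment from baseline covariates, then pair
# each treated unit with the untreated unit whose score is closest.
# Illustrative only - not the published study's exact method.
import math
import random

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Tiny logistic regression via gradient descent, estimating
    P(treated | covariates) - the propensity score model."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(steps):
        gw, gb = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def match_nearest(X, treated):
    """Nearest-neighbour matching on propensity scores."""
    w, b = fit_logistic(X, treated)
    scores = [1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))
              for x in X]
    untreated = [i for i, t in enumerate(treated) if t == 0]
    pairs = []
    for i, t in enumerate(treated):
        if t == 1:
            j = min(untreated, key=lambda u: abs(scores[u] - scores[i]))
            pairs.append((i, j))
    return pairs

# Toy data: 60 participants, 2 baseline covariates; treatment probability
# depends on the first covariate, mimicking non-random service contact.
random.seed(1)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(60)]
treated = [1 if x[0] + random.gauss(0, 1) > 0 else 0 for x in X]
pairs = match_nearest(X, treated)
```

Comparing outcomes within the matched pairs then approximates the treated-versus-untreated contrast a randomised trial would give, provided the covariates capture the factors driving who received treatment.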
The researchers say their study highlights the need to improve access to mental health services for children and adolescents. Figures published in 2015 show that NHS spending on children’s mental health services in the UK has fallen by 5.4% in real terms since 2010 to £41 million, despite an increase in demand. This has led to an increase in referrals and waiting times and an increase in severe cases that require longer stays in inpatient facilities.
On 9 January this year, the Prime Minister announced plans to transform the way we deal with mental illness in the UK at every stage of a person’s life – not just in our hospitals, but in our classrooms, at work and in our communities – adding: “This starts with ensuring that children and young people get the help and support they need and deserve – because we know that mental illness too often starts in childhood and that when left untreated, can blight lives, and become entrenched.”
Professor Ian Goodyer, who led the study, has cautiously welcomed the commitment from the Prime Minister and her Government. “The emphasis going forward should be on early detection and intervention to help mentally-ill teens in schools, where there is now an evidence base for psychosocial intervention,” he says. “We need to ensure, however, that there is a clear pathway for training and supervision of school-based psychological workers and strong connections to NHS child and adolescent mental health services for those teens who will need additional help.
“As always, the devil is in the detail. The funding of services and how the effectiveness of intervention is monitored will be critical if we are to reduce mental illness risks over the adolescent years. With the right measures and school-based community infrastructure, I believe this can be achieved.”
The research was funded by Wellcome and the National Institute for Health Research.
Neufeld, S et al. Reduction in adolescent depression after contact with mental health services: a longitudinal cohort study in the UK. Lancet Psychiatry; 10 Jan 2017; DOI: 10.1016/S2215-0366(17)30002-0
Young people with mental health problems who have contact with mental health services are significantly less likely to suffer from clinical depression later in their adolescence than those with equivalent difficulties who do not receive treatment, according to new research from the University of Cambridge. This comes as Prime Minister Theresa May announced measures to improve mental health support at every stage of a person’s life, with an emphasis on early intervention for children and young people.
Cambridge graduates enter a wide range of careers but making a difference tops their career wish lists. In this series, inspiring graduates from the last three years describe Cambridge, their current work and their determination to give back.
Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor – meaning that it can be made to carry an electrical current with zero resistance.
The finding, reported in Nature Communications, further enhances the potential of graphene, which is already widely seen as a material that could revolutionise industries such as healthcare and electronics. Graphene is a two-dimensional sheet of carbon atoms and combines several remarkable properties; for example, it is very strong, but also light and flexible, and highly conductive.
Since its discovery in 2004, scientists have speculated that graphene may also have the capacity to be a superconductor. Until now, superconductivity in graphene has only been achieved by doping it with, or by placing it on, a superconducting material - a process which can compromise some of its other properties.
But in the new study, researchers at the University of Cambridge managed to activate the dormant potential for graphene to superconduct in its own right. This was achieved by coupling it with a material called praseodymium cerium copper oxide (PCCO).
Superconductors are already used in numerous applications. Because they generate large magnetic fields they are an essential component in MRI scanners and levitating trains. They could also be used to make energy-efficient power lines and devices capable of storing energy for millions of years.
Superconducting graphene opens up yet more possibilities. The researchers suggest, for example, that graphene could now be used to create new types of superconducting quantum devices for high-speed computing. Intriguingly, it might also be used to prove the existence of a mysterious form of superconductivity known as “p-wave” superconductivity, which academics have been struggling to verify for more than 20 years.
The research was led by Dr Angelo Di Bernardo and Dr Jason Robinson, Fellows at St John’s College, University of Cambridge, alongside collaborators Professor Andrea Ferrari, from the Cambridge Graphene Centre; Professor Oded Millo, from the Hebrew University of Jerusalem, and Professor Jacob Linder, at the Norwegian University of Science and Technology in Trondheim.
“It has long been postulated that, under the right conditions, graphene should undergo a superconducting transition, but can’t,” Robinson said. “The idea of this experiment was, if we couple graphene to a superconductor, can we switch that intrinsic superconductivity on? The question then becomes how do you know that the superconductivity you are seeing is coming from within the graphene itself, and not the underlying superconductor?”
Similar approaches have been taken in previous studies using metallic-based superconductors, but with limited success. “Placing graphene on a metal can dramatically alter the properties so it is technically no longer behaving as we would expect,” Di Bernardo said. “What you see is not graphene’s intrinsic superconductivity, but simply that of the underlying superconductor being passed on.”
PCCO is an oxide from a wider class of superconducting materials called “cuprates”. It also has well-understood electronic properties, and using a technique called scanning tunnelling microscopy, the researchers were able to distinguish the superconductivity in PCCO from the superconductivity observed in graphene.
Superconductivity is characterised by the way the electrons interact: within a superconductor electrons form pairs, and the spin alignment between the electrons of a pair may be different depending on the type - or “symmetry” - of superconductivity involved. In PCCO, for example, the pairs’ spin state is misaligned (antiparallel), in what is known as a “d-wave state”.
By contrast, when graphene was coupled to superconducting PCCO in the Cambridge-led experiment, the results suggested that the electron pairs within graphene were in a p-wave state. “What we saw in the graphene was, in other words, a very different type of superconductivity than in PCCO,” Robinson said. “This was a really important step because it meant that we knew the superconductivity was not coming from outside it and that the PCCO was therefore only required to unleash the intrinsic superconductivity of graphene.”
It remains unclear what type of superconductivity the team activated, but their results strongly indicate that it is the elusive “p-wave” form. If so, the study could transform the ongoing debate about whether this mysterious type of superconductivity exists, and – if so – what exactly it is.
In 1994, researchers in Japan used a material called strontium ruthenate (SRO) to fabricate a triplet superconductor that may have p-wave symmetry. The p-wave symmetry of SRO has never been fully verified, partly because SRO is a bulky crystal, which makes it challenging to fabricate into the type of devices necessary to test theoretical predictions.
“If p-wave superconductivity is indeed being created in graphene, graphene could be used as a scaffold for the creation and exploration of a whole new spectrum of superconducting devices for fundamental and applied research areas,” Robinson said. “Such experiments would necessarily lead to new science through a better understanding of p-wave superconductivity, and how it behaves in different devices and settings.”
The study also has further implications. For example, it suggests that graphene could be used to make a transistor-like device in a superconducting circuit, and that its superconductivity could be incorporated into molecular electronics. “In principle, given the variety of chemical molecules that can bind to graphene’s surface, this research can result in the development of molecular electronics devices with novel functionalities based on superconducting graphene,” Di Bernardo added.
The study, “p-wave triggered superconductivity in single layer graphene on an electron-doped oxide superconductor”, is published in Nature Communications (DOI: 10.1038/NCOMMS14024).
Since its discovery in 2004, scientists have believed that graphene may have the innate ability to superconduct. Now Cambridge researchers have found a way to activate that previously dormant potential.
Each series of the Darwin College Lectures is built around a single theme, approached in a multi-disciplinary way, and each lecture is prepared for a general audience by a leading authority on his or her subject. The theme for this year’s lecture series, now in its 32nd year, is ‘Extremes’. The lectures are free and open to the public, and are held on Friday evenings during Lent Term at Lady Mitchell Hall on the University’s Sidgwick Site.
The first lecture of the 2017 series is ‘Extreme Weather’ and will be given by Darwin Fellow Dr Emily Shuckburgh, who is also deputy head of the Polar Oceans Team at the British Antarctic Survey. In her lecture, she will discuss the scientific evidence surrounding the causes and consequences of climate change and the prospects for the future. Dr Shuckburgh is co-author of a recently-published Ladybird book on Climate Change which has been written with co-authors HRH The Prince of Wales and Tony Juniper, former Executive Director of Friends of the Earth.
Next week’s speaker is Nassim Nicholas Taleb from New York University, author of the bestseller The Black Swan. Taleb will speak on the theme of ‘Extreme Events and How to Live with Them.’ His research shows where conventional statistical tools, such as the law of large numbers, fail, and how supposedly robust statistics are not robust at all.
Other speakers this term include Professor David Runciman on Dealing with Extremism; ocean rower Roz Savage on her story of rowing solo across the Atlantic and Pacific; Professor Andy Fabian on Extremes of the Universe; Oxford’s Professor Sarah Harper on Extreme Ageing; and the BBC’s Lyse Doucet on reporting from extreme environments. Full details of the series are available at: www.dar.cam.ac.uk/lectures.
“We have again attracted a mix of outstanding speakers, representing the natural and the social sciences, as well as the humanities and the world beyond academia,” said Julius Weitzdörfer, who convened the series with Duncan Needham. “All of them are not only highly interesting people, but also excellent communicators.”
Admission to the lectures is free and open to all; however, those interested in attending should arrive early in order to secure a place in the main hall (lectures start at 5.30pm). An adjacent overflow theatre (with a live TV feed) is provided for those who cannot be seated in the main hall.
From climate change and extending the human lifespan to political extremism and reporting from war zones, this year’s Darwin College Lecture Series will focus on some of the extremes faced by society.
Previous studies have focused on direct economic costs within the blackout zone, failing to take account of indirect domestic and international supply chain loss from extreme space weather.
According to the study, published in the journal Space Weather, on average the direct economic cost incurred from disruption to electricity represents just under half of the total potential macroeconomic cost.
The paper was co-authored by researchers from the Cambridge Centre for Risk Studies at University of Cambridge Judge Business School, British Antarctic Survey, the British Geological Survey and the University of Cape Town.
Under the study’s most extreme blackout scenario, affecting two-thirds of the US population, the daily domestic economic loss could total $41.5 billion plus an additional $7 billion loss through the international supply chain.
Electrical engineering experts are divided on the possible severity of blackouts caused by “Coronal Mass Ejections,” or magnetic solar fields ejected during solar flares and other eruptions. Some believe that outages would last only hours or a few days because electrical collapse of the transmission system would protect electricity generating facilities, while others fear blackouts could last weeks or months because those transmission networks could in fact be knocked out and need replacement.
Extreme space weather events occur often, but only sometimes affect Earth. The best-known geomagnetic storm affected Quebec in 1989, sparking the electrical collapse of the Hydro-Quebec power grid and causing a widespread blackout for about nine hours.
There was a very severe solar storm in 1859 known as the “Carrington event”, named after the British astronomer Richard Carrington. A widely cited 2012 study by Pete Riley of Predictive Sciences Inc. said that the probability of another Carrington event occurring within the next decade is around 12 per cent; a 2013 report by insurer Lloyd’s, produced in collaboration with Atmospheric and Environmental Research, said that while the probability of an extreme solar storm is “relatively low at any given time, it is almost inevitable that one will occur eventually.”
“We felt it was important to look at how extreme space weather may affect domestic US production in various economic sectors, including manufacturing, government and finance, as well as the potential economic loss in other nations owing to supply chain linkages,” says study co-author Dr Edward Oughton of the Cambridge Centre for Risk Studies.
“It was surprising that there had been a lack of transparent research into these direct and indirect costs, given the uncertainty surrounding the vulnerability of electrical infrastructure to solar incidents.”
The study looks at three geographical scenarios for blackouts caused by extreme space weather, depending on the latitudes affected by different types of incidents.
If only extreme northern states are affected, with 8 per cent of the US population, the economic loss per day could reach $6.2 billion supplemented by an international supply chain loss of $0.8 billion. A scenario affecting 23 per cent of the population could have a daily cost of $16.5 billion plus $2.2 billion internationally, while a scenario affecting 44 per cent of the population could have a daily cost of $37.7 billion in the US plus $4.8 billion globally.
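As a minimal sketch, the per-scenario figures quoted above (together with the most extreme scenario mentioned earlier) can be combined into daily totals. All numbers are those reported in the text; the dictionary layout is purely illustrative.

```python
# Daily blackout losses per scenario, as quoted in the article:
# (US domestic loss, international supply-chain loss), both in $bn/day.
scenarios = {
    "8% of population":  (6.2, 0.8),
    "23% of population": (16.5, 2.2),
    "44% of population": (37.7, 4.8),
    "66% of population": (41.5, 7.0),  # the study's most extreme scenario
}

for share, (domestic, intl) in scenarios.items():
    total = domestic + intl
    print(f"{share} affected: ${total:.1f}bn per day in total")
```

For the most extreme scenario this gives a combined loss of $48.5 billion per day.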
Manufacturing is the US economic sector most affected by those solar-induced blackouts, followed by government, finance and insurance, and property. Outside of the US, China would be most affected by the indirect cost of such US blackouts, followed by Canada and Mexico as these countries provide a greater proportion of raw materials, and intermediate goods and services, used in production by US firms.
Oughton, EJ et al. Quantifying the daily economic impact of extreme space weather due to failure in electricity transmission infrastructure. Space Weather; 18 Jan 2017; DOI: 10.1002/2016SW001491
Adapted from a press release by the Cambridge Judge Business School.
The daily economic cost to the USA from solar storm-induced electricity blackouts could be in the tens of billions of dollars, with more than half the loss from indirect costs outside the blackout zone, according to a new study led by University of Cambridge researchers.
Christmas tidings of peace and goodwill in Mozambique seemed almost too good to be true after four years of sporadic but escalating civil conflict.
On December 26, Afonso Dhlakama, leader of the Renamo opposition movement, told the media that he and President Filipe Nyusi had spoken by phone and agreed to a provisional ceasefire.
A week later they agreed to extend the truce by a further 60 days. The good news was unexpected given that international mediators had recently packed up and left Mozambique after six months of stop-start talks that made almost no progress.
A further oddity of the conflict is that both Renamo’s guerrillas and its parliamentarians are under one man’s leadership: Dhlakama commands a guerrilla force while also leading the parliamentarians who oppose the Frelimo government in the National Assembly.
Dhlakama was brought into electoral politics as a result of the 1992 peace accord. But by 2009 he was disillusioned with his party’s declining performance at the polls and relocated to the northern city of Nampula, a place where Renamo has long had solid support.
It was there that his bodyguards – a force he was allowed to retain in the terms of the peace accord – exchanged fire with the police. Following the shootout, Dhlakama moved again, this time to Satungira, his old wartime redoubt in the Gorongosa National Park in Sofala Province of central Mozambique.
Renamo soldiers, mostly ageing civil war veterans who had not received the demobilisation benefits they expected in 1992, began to gather and form encampments at locations across central and northern Mozambique.
Renamo ambushes on the main roads and exchanges of fire with government forces became more frequent through 2013 and 2014.
Elections in October 2014 brought a truce. But from late 2015 government forces started attacking Renamo positions and targeting civilians suspected of supporting Renamo. During 2016 at least eight Renamo officials were assassinated. Renamo in turn became less restrained in attacking civilian targets, including local government officials.
Dialogue mediated by Mozambican civil society groups secured the truce before the 2014 elections, but failed to find a more enduring settlement. Renamo had been pushing for international mediators and foreign teams arrived in Mozambique in July 2016.
The government and Renamo each got to pick members of the mediation team. The government called on the Southern African Development Community and on Jonathan Powell, a former chief-of-staff to British Prime Minister Tony Blair.
Renamo got the Catholic Church and the European Union on board. When the mediators left in December after half-a-year of stop-start talks, they made it clear that there was little point in them being there when little progress had been made towards common ground.
The main sticking point involved a political demand put on the table by Renamo: that it be granted the power to appoint provincial governors in the provinces where it claims to have won an electoral majority. Which provinces Renamo won is a further matter of dispute.
This solution would involve a shift away from today’s centralised politics, whereby Frelimo, as the winner of the elections at national level, gets to appoint all the provincial governors.
But it’s also not exactly a gain for democracy: the proposal is not for the provinces to choose their own leaders, but for Renamo, rather than Frelimo, to appoint governors in certain provinces on the basis of previous election results.
At one point during the negotiations, it looked as though the government might be about to make concessions on the crucial issue of decentralisation, only to backtrack. This apparent dithering reflects differing opinions within the party.
On the one hand, a centralised state is an article of faith for party hawks, who also fear that Renamo appointments to provincial governorships would create centres of patronage for Renamo and represent cracks in Frelimo’s dominance of state power.
But another tendency within the party, likely including Nyusi himself, believes Frelimo has little to fear from decentralisation. This more flexible position on Nyusi’s side could explain why a couple of ad-hoc phone chats between him and Dhlakama have managed to keep alive the idea of a peace just weeks after the mediation process fizzled out.
A question of sovereignty
Yet the terms of the ceasefire remain unresolved, and this poses an immediate threat to the truce. Renamo has promised to continue operating patrols within a 3km radius of its bases. The government refuses to keep its distance from Renamo bases, which Renamo sees as provocation.
This is not a trivial issue, but goes to the heart of a question about sovereignty and political legitimacy. The same disagreement over where government forces can and cannot go derailed the peace talks in August 2016 when the mediators tried to negotiate a security corridor for them to visit Dhlakama at Satungira.
The government has maintained the position that its sovereign prerogatives allow it to deploy its forces wherever it will, and that there is no such thing as Renamo territory.
Renamo, on the other hand, portrays the war as a conflict between equals and insists that it has the right to defend its positions against what it speaks of as government aggression. As things stand now, a skirmish between soldiers of the two sides could easily be seized upon by Dhlakama or by a Frelimo hawk as a reason to declare the truce null and void.
Whether or not the ceasefire holds, Mozambique’s leaders are only starting to face up to the consequence of the country’s financial crisis. In October, Mozambique acknowledged it could not pay its debts.
This is the outcome of a mounting scandal that broke earlier in 2016, when Mozambique revealed that it had US$1 billion in undeclared debt: the result of government bailouts for two partly state-owned companies.
Major lenders promptly halted loans. The government also continues to be haunted by the disappearance of $600 million in bonds issued by the state fishing company supposedly to buy new boats. There are suspicions that the missing money was channelled into the war effort.
This squandering of state resources has had consequences for Mozambique’s development indicators: half of rural people live below the poverty line, a figure barely reduced since 2003. Although Renamo is in no position to give farmers a better deal, it has won some sympathy for its cause by exploiting a sense of resentment in the largely rural provinces of the centre and north.
After four years of escalating civil conflict, a truce has unexpectedly arisen in Mozambique. But what are the chances of this ceasefire lasting, asks Justin Pearce, Leverhulme Early Career Fellow in Politics and International Studies & Research Associate of St John's College.
The Food Standards Agency (FSA) today launched its Go for Gold campaign, encouraging us not to burn our roast or fried vegetables and to keep our oven chips at a nice golden colour. The idea is to reduce people’s intake of acrylamide, a chemical that is “created when many foods, particularly starchy foods like potatoes and bread, are cooked for long periods at high temperatures, such as when baking, frying, grilling, toasting and roasting.” (FSA)
Acrylamide can be, in large doses, a very nasty substance. It is used as an industrial sealant, and workers with very high exposures suffered serious neurotoxicity. Very high doses have been shown to increase the risk of mice getting cancer. The IARC (International Agency for Research on Cancer) considers it a ‘probable human carcinogen’, putting it in the same category as many chemicals, red meat, being a hairdresser and shift-work.
However, there is no good evidence of harm from humans consuming acrylamide in their diet: Cancer Research UK say that “At the moment, there is no strong evidence linking acrylamide and cancer.”
This is not for want of trying. A massive report from the European Food Safety Authority (EFSA) lists 16 studies and 36 publications, but concludes:
In the epidemiological studies available to date, AA intake was not associated with an increased risk of most common cancers, including those of the GI or respiratory tract, breast, prostate and bladder. A few studies suggested an increased risk for renal cell, and endometrial (in particular in never-smokers) and ovarian cancer, but the evidence is limited and inconsistent. Moreover, one study suggested a lower survival in non-smoking women with breast cancer with a high pre-diagnostic exposure to AA but more studies are necessary to confirm this result. (p185)
Remember that each study is testing an association with a long list of cancers, so using the standard criteria for statistical significance, we would expect 1 in 20 of these associations to be positive by chance alone.
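The point about chance findings can be made concrete with a short calculation, assuming (illustratively) 20 independent tests per study and no true effect at all:

```python
# Probability of at least one chance "association" when testing many
# cancer sites at the conventional 5% significance level.
# The figure of 20 tests is illustrative, not taken from any one study.
alpha = 0.05    # conventional significance threshold
n_tests = 20    # number of cancer sites examined per study

p_any_false_positive = 1 - (1 - alpha) ** n_tests
print(f"P(at least one chance finding) ≈ {p_any_false_positive:.2f}")
```

Under these assumptions there is roughly a two-in-three chance that a study reports at least one “significant” association even when acrylamide has no effect whatsoever.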
A standard response might be the over-used cliché: ‘absence of evidence is not evidence of absence’. If there has been a huge effort to find an association, and none has been found, it’s true that this may not be direct evidence of the absence of an effect (though absence can never be proved anyway). But it can be considered evidence that any effect is not very important.
Given the numbers provided by EFSA and the FSA, it is perhaps unsurprising that no association has been shown in large studies. EFSA calculated a BMDL10 of 170 µg/kg body weight/day — this means it is unlikely that exposures at this level would cause tumours in mice (technically it is the lower end of a confidence interval for the dose that would cause a 10% increase in tumours). They then compare this with human acrylamide exposure obtained from multiple detailed dietary surveys, which for adults has an average of 0.56 and a ‘high’ of 1.1 µg/kg/day, in the sense that 97.5% of people consume less than this. The BMDL10 is then divided by these exposures to give the ‘margin of exposure’, which, rather confusingly, ends up being high for low risks and low for high risks.
So, for example, adults with the highest consumption of acrylamide could consume 160 times as much and still only be at a level that toxicologists think unlikely to cause increased tumours in mice (that's essentially what the ‘margin of exposure’ means).
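The margin-of-exposure arithmetic can be sketched in a few lines, using the figures quoted above. (Dividing 170 by 1.1 gives roughly 155; the article rounds this to about 160.)

```python
# Worked sketch of the EFSA margin-of-exposure (MoE) calculation
# described above. All figures are those quoted in the text;
# the variable names are illustrative.

BMDL10 = 170.0  # µg/kg body weight/day: lower confidence bound on the
                # dose causing a 10% increase in tumours in mice

exposures = {   # estimated adult dietary intake, µg/kg bw/day
    "average adult": 0.56,
    "high consumer (97.5th percentile)": 1.1,
}

for group, intake in exposures.items():
    moe = BMDL10 / intake   # high MoE = low risk, low MoE = high risk
    print(f"{group}: margin of exposure ≈ {moe:.0f}")
```

This shows why the campaign is hard to justify on the numbers: even the highest consumers sit far below the dose regarded as risky in mice, though still below the 10,000-fold margin that toxicology committees demand for carcinogens.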
This all seems rather reassuring, and may explain why it’s been so difficult to observe any effect of acrylamide in diet. But, for cancer, toxicology committees demand a rather arbitrary margin of exposure of 10,000 before considering the chemical essentially acceptable.
Reactions to the FSA’s Go for Gold campaign may range from the extremes of encouraging obsessive concern in the worried-well, to irate editorials on yet another intrusion from the ‘nanny state’. More worrying, people may just consider this yet another scare story from scientists, and lead them to dismiss truly important warnings about, say, the harms from obesity.
Cancer Research UK say that “researchers estimate that overweight and obesity are behind around 18,000 cases of cancer each year in the UK”. In stark contrast, the FSA provide no estimate of the current harm caused by acrylamide, nor the benefit from any reduction due to people following their advice. To be honest, I am not convinced it is appropriate to launch a public campaign on this basis.
The Winton Centre for Risk and Evidence Communication is a new centre hosted within the Department of Pure Mathematics and Mathematical Statistics.
A new campaign is warning people that burning some food, such as toast, is a potential cancer risk. Here, the evidence for this claim is explored by David Spiegelhalter, Professor of the Public Understanding of Risk at the new Winton Centre for Risk and Evidence Communication.
Self-righteousness, gratitude, sympathy, sincerity, and guilt – what if these social behaviours are biologically influenced, encoded within our genes and shaped by the forces of evolution to promote the survival of the human species? Does free will truly exist if our genes are inherited and our environment is a series of events set in motion before we are born?
American biologist E O Wilson made these arguments when he published Sociobiology: The New Synthesis in 1975 and On Human Nature in 1978. Wilson is the father of sociobiology, a field that holds that social behaviour in animals, including humans, is biologically determined – partially shaped by genes and the forces of evolution. Time magazine picked up on the emerging field, dedicating its August 1977 cover to “Sociobiology: A New Theory of Behavior.”
Today, it is a field still shrouded with controversy, but one which is offering new views on how our environment influences who we are and what we do.
Likened to eugenics
At its conception, sociobiology ignited heated criticism from prominent biologists including Stephen Jay Gould and Robert Lewontin. They argued that the field was biologically determinist and perpetuated eugenic ideologies that sought to legitimise racial and social hierarchies. As critics pointed out, while “sociobiology” as a formal field did not come into existence until the 1970s, research that used biological explanations to justify social phenomena was not new.
To figures such as Gould and Lewontin, this “biosocial” scientific language lived in the fields of physical anthropology and eugenics. In the early 20th century, eugenicists like Madison Grant had used this kind of language to explain and justify class and race hierarchies. Supporters of such ideas used them to advocate for social policies prohibiting class and racial mixing, and restrictions on immigration.
Biosocial science was soon used as a guise for the eugenics movement. The American Eugenics Society changed its name in 1972 to the Society for the Study of Social Biology, three years before the field of “sociobiology” was formally established. The society’s official journal Eugenics Quarterly, whose first volume in 1954 focused heavily on IQ differences between population groups, changed its name to Social Biology in 1969. It continues to exist today under the name of Biodemography and Social Biology.
Social life in ‘molecular terms’
Sociobiology has also influenced the development of “sociogenomics” – a term coined in 2005 by molecular biologist Gene Robinson whose work examines the genetic mechanisms governing social behaviour in the honeybee. Though early sociogenomics work focused primarily on insect populations, the field has moved to include an examination of human populations.
Sociogenomics is a field driven by two desires. The first is to identify the genes and pathways that regulate aspects of development, physiology and behaviour that in turn influence the way animals or humans develop social links and form cooperative communities. The second is to determine how these genes and pathways themselves are influenced by social life and social evolution. Yet in practice, these two main components of sociogenomics research seem to be in conflict.
One side tries to identify genetic markers associated with behaviours commonly thought to be shaped by social interactions. Researchers have looked at everything from political orientation to educational attainment and antisocial behaviour linked to criminality.
Some studies have sought to find genetic variations linked to social phenomena like social deprivation and household income. One study claimed to have identified common genetic variations that can explain up to 21% of the observed differences in social deprivation between individuals.
Nature and nurture
The other side of sociogenomics examines how the environment moderates what’s called “gene expression”. This is the process by which genes are “activated” to synthesise proteins that allow the genotype (an individual’s genetic makeup) to give rise to a phenotype (an observed behaviour or trait).
In this form of sociogenomics, the classical argument of “nature versus nurture” becomes more clearly a matter of both “nature and nurture”. Social or environmental conditions such as low social status, social isolation or low socioeconomic status have been found to change the expression of hundreds of genes in both animals and humans.
This is now considered by some to be potentially transformative in our approach to addressing inequality. For example, biosocial research which shows how structural or environmental aspects influence biological processes could throw much needed weight behind socially-oriented policies. On the other hand, biosocial researchers might argue that rather than fix what’s happening in society, we could focus on trying to treat biological deficits.
“Gene x environment” studies, as they are called, have found that in the US, low socioeconomic status represses an individual’s genetic potential. This means, for example, that the high estimates for genetic influence on educational attainment may only fully apply to those living in well-off circumstances, where money, status, and comfort are not pressing concerns.
Mixing the hard and social sciences
Some advocates for the biosocial sciences believe the social sciences will become more robust and more highly regarded with the incorporation of genetics research. There are sociologists, economists, and political scientists who are already beginning to bring genetic analyses into their work. They argue that this additional data may help the social sciences “better understand patterns of human behavior, enhance individuals’ self-understanding, and design optimal public policy”.
Such mixing of the traditionally hard and social sciences has produced studies in sociogenomics examining how high taxation of tobacco products, meant to discourage people from purchasing harmful products, may not be effective for those with a particular variant of the nicotine receptor that might make them willing to pay more for tobacco. It has also contributed to research looking at cortisol levels in young people from ethnic minorities as they experience racism or discrimination. This work has highlighted how everyday micro-aggressions and social inequality can have real and harmful biological consequences.
These studies point to the continued desire to explain social phenomena through biology. As the biosocial sciences continue the journey to analyse everyday human life and behaviour, they have the potential to have a profound impact – both positive and negative – on our understandings of how we as individuals and we as a society operate.
The idea that social behaviours are biologically influenced is controversial, but may provide new views on how our environment influences who we are and what we do, writes Daphne Martschenko from the Faculty of Education.
In medicine, vaccinating against a virus involves exposing a body to a weakened version of the threat, enough to build a tolerance.
Social psychologists believe that a similar logic can be applied to help “inoculate” the public against misinformation, including the damaging influence of ‘fake news’ websites propagating myths about climate change.
A new study compared reactions to a well-known climate change fact with those to a popular misinformation campaign. When presented consecutively, the false material completely cancelled out the accurate statement in people’s minds – opinions ended up back where they started.
Researchers then added a small dose of misinformation to delivery of the climate change fact, by briefly introducing people to distortion tactics used by certain groups. This “inoculation” helped shift and hold opinions closer to the truth, despite the follow-up exposure to ‘fake news’.
The study on US attitudes found the inoculation technique shifted the climate change opinions of Republicans, Independents and Democrats alike.
Published in the journal Global Challenges, the study was conducted by researchers from the University of Cambridge in the UK and Yale and George Mason universities in the US. It is one of the first studies of ‘inoculation theory’ to try to replicate a ‘real world’ scenario of conflicting information on a highly politicised subject.
“Misinformation can be sticky, spreading and replicating like a virus,” says lead author Dr Sander van der Linden, a social psychologist from the University of Cambridge and Director of the Cambridge Social Decision-Making Lab.
“We wanted to see if we could find a ‘vaccine’ by pre-emptively exposing people to a small amount of the type of misinformation they might experience. A warning that helps preserve the facts.
“The idea is to provide a cognitive repertoire that helps build up resistance to misinformation, so the next time people come across it they are less susceptible.”
Fact vs. Falsehood
To find the most compelling climate change falsehood currently influencing public opinion, van der Linden and colleagues tested popular statements from corners of the internet on a nationally representative sample of US citizens, with each one rated for familiarity and persuasiveness.
The winner: the assertion that there is no consensus among scientists, apparently supported by the Oregon Global Warming Petition Project. This website claims to hold a petition signed by “over 31,000 American scientists” stating there is no evidence that human CO2 release will cause climate change.
The study also used the accurate statement that “97% of scientists agree on manmade climate change”. Prior work by van der Linden has shown this fact about scientific consensus is an effective ‘gateway’ for public acceptance of climate change.
In a disguised experiment, researchers tested the opposing statements on over 2,000 participants across the US spectrum of age, education, gender and politics using the online platform Amazon Mechanical Turk.
In order to gauge shifts in opinion, each participant was asked to estimate current levels of scientific agreement on climate change throughout the study.
Those shown only the fact about climate change consensus (in pie chart form) reported a large increase in perceived scientific agreement – an average of 20 percentage points. Those shown only misinformation (a screenshot of the Oregon petition website) dropped their belief in a scientific consensus by 9 percentage points.
Some participants were shown the accurate pie chart followed by the erroneous Oregon petition. The researchers were surprised to find the two neutralised each other (a tiny difference of 0.5 percentage points).
“It’s uncomfortable to think that misinformation is so potent in our society,” says van der Linden. “A lot of people’s attitudes toward climate change aren’t very firm. They are aware there is a debate going on, but aren’t necessarily sure what to believe. Conflicting messages can leave them feeling back at square one.”
Alongside the consensus fact, two groups in the study were randomly given ‘vaccines’:

- A general inoculation, consisting of a warning that “some politically-motivated groups use misleading tactics to try and convince the public that there is a lot of disagreement among scientists”.
- A detailed inoculation that picks apart the Oregon petition specifically: for example, by highlighting that some of its signatories are fraudulent, such as Charles Darwin and members of the Spice Girls, and that fewer than 1% of signatories have backgrounds in climate science.
For those ‘inoculated’ with this extra data, the misinformation that followed did not cancel out the accurate message.
The general inoculation saw an average opinion shift of 6.5 percentage points towards acceptance of the climate science consensus, despite exposure to fake news.
When the detailed inoculation was added to the general one, the shift was almost 13 percentage points – two-thirds of the effect seen when participants were given only the consensus fact.
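As a quick sanity check (not part of the study’s own analysis), the “one-third” and “two-thirds” fractions follow directly from the figures reported above:

```python
# Effect sizes reported in the study, in percentage points
# (positive = shift towards accepting the scientific consensus).
fact_only = 20.0       # consensus fact shown alone
misinfo_only = -9.0    # Oregon petition shown alone
general_inoc = 6.5     # fact + general inoculation, then misinformation
both_inocs = 13.0      # fact + general + detailed inoculation, then misinformation

# Fraction of the fact-only effect preserved by each inoculation.
print(general_inoc / fact_only)  # 0.325 – roughly one-third
print(both_inocs / fact_only)    # 0.65 – roughly two-thirds
```

The near-zero net shift when fact and misinformation were shown without inoculation (0.5 points) is consistent with the two messages roughly cancelling out.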
The research team point out that tobacco and fossil fuel companies have used psychological inoculation in the past to sow seeds of doubt, and to undermine scientific consensus in the public consciousness.
They say the latest study demonstrates that such techniques can be partially “reversed” to promote scientific consensus, and work in favour of the public good.
The researchers also analysed the results in terms of political parties. Before inoculation, the fake negated the factual for both Democrats and Independents. For Republicans, the fake actually overrode the facts by 9 percentage points.
However, following inoculation, the positive effects of the accurate information were preserved across all parties to match the average findings (around a third with just general inoculation; two-thirds with detailed).
“We found that inoculation messages were equally effective in shifting the opinions of Republicans, Independents and Democrats in a direction consistent with the conclusions of climate science,” says van der Linden.
“What’s striking is that, on average, we found no backfire effect to inoculation messages among groups predisposed to reject climate science; they didn’t seem to retreat into conspiracy theories.
“There will always be people completely resistant to change, but we tend to find there is room for most people to change their minds, even just a little.”
New research finds that misinformation on climate change can psychologically cancel out the influence of accurate statements. However, if legitimate facts are delivered with an “inoculation” – a warning dose of misinformation – some of the positive influence is preserved.
Professor Neely, currently Head of the Institute for Manufacturing and the Manufacturing and Management Division of the Engineering Department, and a fellow of Sidney Sussex College, will take up the position of Pro-Vice-Chancellor for Enterprise and Business Relations in March 2017.
He joined the University of Cambridge in the early 1990s, initially as a researcher, before taking up the institution’s first joint lectureship between the Engineering Department and the Cambridge Judge Business School. In 2000 he was appointed to a chair at Cranfield School of Management, and between 2003 and 2012 he was deputy director of the Advanced Institute of Management (AIM) Research, then the UK’s largest-ever investment in management research. On his return to the University of Cambridge, he founded the Cambridge Service Alliance – a research partnership with leading firms to explore service business model innovation in manufacturing.
As Pro-Vice-Chancellor for Enterprise and Business Relations, Professor Neely will lead a strategy to enhance and develop the University’s engagements and partnerships with industry and commerce, and the wider enterprise economy in the region.
Professor Neely said: “I am delighted to have been offered the role of Pro-Vice-Chancellor for Enterprise and Business Relations. Universities make a difference in the world through their research, education and engagement and I am looking forward to working with colleagues from across the University to help strengthen our relationships with large and small firms alike.”
The University of Cambridge Vice-Chancellor, Professor Sir Leszek Borysiewicz, said: “I am pleased to announce the appointment of Andrew as Pro-Vice-Chancellor for Enterprise and Business Relations. He has an impressive track record of working in higher education and with business and industry. This will help strengthen the University’s efforts to consolidate, as well as develop, business partnerships and enterprise opportunities.”
Professor Andrew Neely has been appointed as the University of Cambridge’s Pro-Vice-Chancellor for Enterprise and Business Relations