
Teaching celebrated across the University

Dr Martin Ruehl, Faculty of Modern and Medieval Languages
The winners include a veterinary anaesthetist praised for developing an acclaimed Clinical Skills Centre, a pioneer of interdisciplinary Gender Studies programmes, and a Classicist as passionate about outreach as Ovid.
 
While the prizes reveal the diversity of teaching at Cambridge, certain themes emerge, in particular a focus on the individual student, the value of research-led teaching, the continuing importance of one-to-one teaching, innovation in teaching practice, interdisciplinary approaches, and above all, dedication to students.
 
The Pilkington Prizes were initiated by Sir Alastair Pilkington – graduate of Trinity College, engineer, businessman and the first Chairman of the Cambridge Foundation – who passionately believed that teaching excellence was crucial to Cambridge’s future success.
 
The Pilkington Prizes are organised by The Cambridge Centre for Teaching and Learning, which supports staff by providing training, developing networks, hosting events and encouraging and funding innovation. The Centre also provides a focus for strategic priorities within Cambridge and for engaging with national and international developments in higher education.
 
New films about teaching at Cambridge
 
The University has produced a series of films about five of this year’s Pilkington Prize winners. These films go behind the scenes to show Cambridge teaching in action as well as inviting winners to explain their passion for teaching and reveal some of their trade secrets. The films feature Lecturer in German Thought, Martin Ruehl; Physics Lecturer Lisa Jardine-Wright; Sociologist Mónica Moreno Figueroa; Zoologist Andrew Balmford; and Design Engineer James Moultrie. Watch the films here.
 
Dr Martin Ruehl said:
 
"I was an undergraduate here myself so I want to give back some of what I received. I had a number of very charismatic teachers who inspired me back then. I think the trick is always to find something that’s growing out of your own research."
 
Dr Lisa Jardine-Wright said:
 
"It is only when you start teaching a subject that you really start to understand it and all of its nuances. The most important thing for me is that my students are willing to make mistakes, and learn from them."
 
This year’s winners in full are:
 
Dr Anthony Ashton (Department of Mathematics)
Dr Jackie Brearley (Department of Veterinary Medicine)
Dr Jude Browne (Department of Politics and International Studies)
Dr Menna Clatworthy (School of Clinical Medicine)
Dr Richard Davies (School of Clinical Medicine)
Dr Ingo Gildenhard (Faculty of Classics)
Dr Nigel Kettley (Institute of Continuing Education)
Professor Jochen Runde (Judge Business School)
Dr Martin Ruehl (Faculty of Modern and Medieval Languages)
Dr Lisa Jardine-Wright (Department of Physics)
Dr Mónica Moreno Figueroa​ (Department of Sociology)
Professor Andrew Balmford (Department of Zoology)
Dr James Moultrie (Department of Engineering)
 

Thirteen Cambridge academics have been recognised for their outstanding teaching in the University’s 24th Pilkington Prizes.

Dr Martin Ruehl, Faculty of Modern and Medieval Languages, University of Cambridge

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Computer-designed antibodies target toxins associated with Alzheimer’s disease


The researchers used computer-based methods to develop antibodies – the star players of the body’s natural defence system – to target the deposits of misfolded proteins that are a hallmark of Alzheimer’s disease. Early tests of the antibodies in test tubes and in nematode worms showed an almost complete elimination of these toxic aggregates.

The antibodies were designed by systematically scanning the sequence of amyloid-beta, the main component of the toxic deposits associated with Alzheimer’s disease. By targeting specific regions, or epitopes, of the amyloid-beta sequence, the different antibodies were able to block amyloid-beta’s ability to stick together, or aggregate. The results are reported in the journal Science Advances.
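The scanning idea can be pictured as a sliding window moving along the amyloid-beta sequence, with each window a candidate epitope for an antibody to target. The sketch below is only illustrative: the seven-residue window length and the plain enumeration are assumptions for the example, not the study’s actual design pipeline, which also scores and filters candidates for properties such as stability and solubility.

```python
# Illustrative sketch of epitope scanning over amyloid-beta (Abeta 1-42).
# The window length (7) is an assumption made for this example.

ABETA42 = "DAEFRHDSGYEVHHQKLVFFAEDVGSNKGAIIGLMVGGVVIA"  # human Abeta 1-42

def scan_epitopes(sequence: str, window: int = 7):
    """Yield (start_position, epitope) for every overlapping window."""
    for i in range(len(sequence) - window + 1):
        yield i + 1, sequence[i:i + window]  # 1-based residue numbering

epitopes = list(scan_epitopes(ABETA42))
print(len(epitopes), epitopes[0])  # 36 candidate 7-residue epitopes
```

In the real method, each such epitope would then be used to design a complementary antibody fragment, and the resulting panel compared for its effect on aggregation.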

Alzheimer’s disease is the most common form of dementia, which affects nearly one million people in the UK and about 50 million worldwide. One of the hallmarks of Alzheimer’s disease is the build-up of protein deposits, known as plaques and tangles, in the brains of affected individuals. These deposits, which accumulate when naturally-occurring proteins in the body fold into the wrong shape and aggregate, are formed primarily of two proteins: amyloid-beta and tau.

The process of protein aggregation also creates smaller clusters called oligomers, which are highly toxic to nerve cells and are thought to be responsible for brain damage in Alzheimer’s disease. Researchers around the world have spent decades attempting to unravel the processes that cause Alzheimer’s disease, and to target the misfolding proteins before they are able to aggregate.

Antibodies are specialised proteins that help defend the body against harmful pathogens by recognising specific targets, known as antigens. The power of antibodies can be harnessed to make effective treatments, but to date no antibody has been approved to treat Alzheimer’s or any other neurodegenerative disease, although several antibody-based treatments for Alzheimer’s disease are currently in clinical trials.

“Developing antibody-based therapies is costly and time-consuming, but if we can find better and cheaper ways of producing antibodies, we would increase the chances of finding treatments for patients – making them by design can create opportunities to achieve this goal,” said Professor Michele Vendruscolo from the Centre for Misfolding Diseases in Cambridge, and the paper’s senior author.

To date, there have been two main ways of producing antibodies. The first, which has been in use for about 50 years, is to inject animals with the relevant antigen. The antigen stimulates the immune system to produce antibodies to attack the alien substance, and those antibodies can then be extracted as a therapeutic. The second method, developed in the 1990s, does not require the use of animals and instead relies on the screening of large laboratory-constructed libraries to isolate the relevant antibodies.

“In the past few years, thanks to increasingly powerful computers and large structural databases, it has become possible to design antibodies in a computer, which substantially lowers the time and cost required,” said study co-author Dr Pietro Sormanni, a postdoctoral researcher in the Centre for Misfolding Diseases. “It also allows us to target specific regions within the antigen, as well as to control for other properties critical for clinical applications, such as antibody stability and solubility.”

One of the advantages of the antibodies used in this study is their very small size. In these smaller antibodies, called single-domain antibodies, the ‘trigger’ for an immune response is stripped off, thereby blocking the inflammatory reactions that have so far prevented the widespread adoption of antibody-based therapies for Alzheimer’s disease.

A major advantage of these designed antibodies is that they can be systematically produced to bind to the different regions of the target protein. In this way researchers can extensively and inexpensively explore a variety of mechanisms of action, and select the most effective one for blocking the production of toxins.

“Since the designed antibodies can selectively target oligomers, which are present in low numbers relative to the total amounts of amyloid-beta, we expect them to be effective even when administered in low doses,” said Dr Francesco Aprile, a Senior Research Fellow of the Alzheimer's Society in the Centre for Misfolding Diseases and the study’s first author.

Not only are these antibodies designed to not stimulate an immune response, but they are also much smaller than standard antibodies, so they could be delivered more effectively to the brain through the blood-brain barrier. Aprile has recently been awarded the 2017 ‘Outstanding early-career contribution to dementia’ award by the Alzheimer’s Society for his work.

“The innovative approach taken by Dr Aprile and his colleagues tackles the issue of developing drugs for Alzheimer’s disease from a new angle, by using advanced computer techniques to design drugs that specifically block a crucial aspect of the disease process,” said James Pickett, Head of Research at the Alzheimer’s Society. “Over the last 50 years, advances in antibody technology have delivered radical new treatments for a wide range of common diseases including rheumatoid arthritis, multiple sclerosis and some forms of cancer. While the research is still in the early stages, we are excited by the potential of this work and hope it can do the same for Alzheimer’s disease.”

“These results indicate that computational methods are becoming ready to be used alongside existing antibody discovery methods, enabling the exploration of new ways of treating a range of human diseases,” said Vendruscolo.

Reference:
Francesco A. Aprile et al. ‘Selective targeting of primary and secondary nucleation pathways in Aβ42 aggregation using a rational antibody scanning method.’ Science Advances (2017). DOI: 10.1126/sciadv.1700488

Researchers at the University of Cambridge have designed antibodies that target the protein deposits in the brain associated with Alzheimer’s disease, and stop their production. 

Brain showing hallmarks of Alzheimer’s disease (plaques in blue)


Cambridge BRAINFest 2017 kicks off a weekend celebrating the wonders of the brain


The festival, which runs until Sunday 25 June, will allow audiences to quiz more than 130 leading Cambridge neuroscientists on everything from why we get fat to how to repair a ‘broken’ brain.

“We’re really excited by the opportunity to share the cutting-edge brain research taking place at Cambridge with the public,” says Dr Dervila Glynn, coordinator of Cambridge Neuroscience, who is organising the event. “This is a chance for everyone to exercise their brain cells in a fun and engaging way. And along the way, I think everyone – adults, children and our professors – will learn something new.”

Tonight at the Babbage Lecture Theatre, BBC Horizon presenter Dr Giles Yeo will reveal why we are all getting fatter, while Professor Usha Goswami will explain how dyslexic brains may be “in tune but out of time”. Poet Lavinia Greenlaw will perform a moving poem about dementia, while Cambridgeshire-based Dance Ensemblé explores the story of Parkinson’s disease through the medium of dance – before Professor Roger Barker describes how to repair the diseased brain.

On Saturday and Sunday, the Cambridge Corn Exchange transforms into an interactive tour of the brain, with themes including ‘Development’, ‘Brain & Body’, ‘Pain & Pleasure’, ‘Perception & Imagination’ and ‘Learning & Forgetting’, spanning research from molecules to man. Visitors, adults and children alike, will have the opportunity to take part in experiments across 30 interactive exhibits, build their own brain, and see a series of films looking at conditions such as dementia and OCD. Café Scientifique will explore the breadth of brain science, from body clocks and brain networks to the weird and wonderful world of the naked mole-rat.

On Saturday night, a panel of experts from the University of Cambridge and Cambridgeshire & Peterborough NHS Foundation Trust will explore ways to help us better understand and treat mental health disorders and look at how we can bridge the existing gap between neuroscience research and current practice in the health service. The panel, chaired by Professor Sir Simon Wessely, President of the Royal College of Psychiatrists, will look at issues including how the brain and body interact, the stigma surrounding mental health problems, and the transition between child and adult psychiatry.

Those wishing to stretch their legs as well as their minds can pick up a ‘Neurotrail’ map at the Corn Exchange, which will lead them around the places, people, and discoveries that have put our city at the heart of our understanding of the brain. Explorers can discover scientific instruments, first editions of old manuscripts and stories surrounding some of the brain scientists throughout Cambridge’s history at a number of special ‘pop up brain-themed exhibitions’: Cambridge University Library and the Whipple Museum will open to visitors on Saturday 24 June, and the Old Library at Christ’s College will be open to visitors on Sunday 25 June.

School children from in and around Cambridge have been busy creating brain-inspired art, which will transform the foyer of the Corn Exchange into the BRAINArt exhibition.

All events are free, but booking is recommended for the evening events at the Babbage Lecture Theatre. Further details, including how to book, can be found on the Cambridge BRAINFest 2017 website.

Join the #CambridgeBRAINfest conversation on Twitter @CamNeuro and on Facebook

Cambridge today (23 June) begins a three-day celebration of the wonders of the brain, with talks, hands-on activities and a ‘secret cinema’ – all part of Cambridge BRAINFest 2017, a free public festival celebrating the most complex organ in the body.

Exercise Plays Vital Role Maintaining Brain Health (edited)


How to train your drugs: from nanotherapeutics to nanobots


Chemotherapy benefits a great many patients but the side effects can be brutal.

When a patient is injected with an anti-cancer drug, the idea is that the molecules will seek out and destroy rogue tumour cells. However, relatively large amounts need to be administered to reach the target in high enough concentrations to be effective. As a result of this high drug concentration, healthy cells may be killed as well as cancer cells, leaving many patients weak, nauseated and vulnerable to infection.

One way that researchers are attempting to improve the safety and efficacy of drugs is nanotherapeutics, a relatively new area of research that aims to target drug delivery to just the cells that need it.

Professor Sir Mark Welland is Head of the Electrical Engineering Division at Cambridge. In recent years, his research has focused on nanotherapeutics, working in collaboration with clinicians and industry to develop better, safer drugs. He and his colleagues don’t design new drugs; instead, they design and build smart packaging for existing drugs.

Nanotherapeutics come in many different configurations, but the easiest way to think about them is as small, benign particles filled with a drug. They can be injected in the same way as a normal drug, and are carried through the bloodstream to the target organ, tissue or cell. At this point, a change in the local environment, such as pH, or the use of light or ultrasound, causes the nanoparticles to release their cargo.

Nano-sized tools are increasingly being looked at for diagnosis, drug delivery and therapy. “There are a huge number of possibilities right now, and probably more to come, which is why there’s been so much interest,” says Welland. Using clever chemistry and engineering at the nanoscale, drugs can be ‘taught’ to behave like a Trojan horse, or to hold their fire until just the right moment, or to recognise the target they’re looking for.

“We always try to use techniques that can be scaled up – we avoid using expensive chemistries or expensive equipment, and we’ve been reasonably successful in that,” he adds. “By keeping costs down and using scalable techniques, we’ve got a far better chance of making a successful treatment for patients.”

In 2014, he and collaborators demonstrated that gold nanoparticles could be used to ‘smuggle’ chemotherapy drugs into cancer cells in glioblastoma multiforme, the most common and aggressive type of brain cancer in adults, which is notoriously difficult to treat. The team engineered nanostructures containing gold and cisplatin, a conventional chemotherapy drug. A coating on the particles made them attracted to tumour cells from glioblastoma patients, so that the nanostructures bound and were absorbed into the cancer cells. 

Once inside, these nanostructures were exposed to radiotherapy. This caused the gold to release electrons that damaged the cancer cell’s DNA and its overall structure, enhancing the impact of the chemotherapy drug. The process was so effective that 20 days later, the cell culture showed no evidence of any revival, suggesting that the tumour cells had been destroyed. 

While the technique is still several years away from use in humans, tests have begun in mice. Welland’s group is working with MedImmune, the biologics R&D arm of pharmaceutical company AstraZeneca, to study the stability of drugs and to design ways to deliver them more effectively using nanotechnology. 

“One of the great advantages of working with MedImmune is they understand precisely what the requirements are for a drug to be approved. We would shut down lines of research where we thought it was never going to get to the point of approval by the regulators,” says Welland. “It’s important to be pragmatic about it so that only the approaches with the best chance of working in patients are taken forward.” 

The researchers are also targeting diseases like tuberculosis (TB). With funding from the Rosetrees Trust, Welland and postdoctoral researcher Dr Íris da luz Batalha are working with Professor Andres Floto in the Department of Medicine to improve the efficacy of TB drugs. 

Their solution has been to design and develop nontoxic, biodegradable polymers that can be ‘fused’ with TB drug molecules. As polymer molecules have a long, chain-like shape, drugs can be attached along the length of the polymer backbone, meaning that very large amounts of the drug can be loaded onto each polymer molecule. The polymers are stable in the bloodstream and release the drugs they carry when they reach the target cell. Inside the cell, the pH drops, which causes the polymer to release the drug. 

In fact, the polymers worked so well for TB drugs that another of Welland’s postdoctoral researchers, Dr Myriam Ouberaï, has formed a start-up company, Spirea, which is raising funding to develop the polymers for use with oncology drugs. Ouberaï is hoping to establish a collaboration with a pharma company in the next two years.

“Designing these particles, loading them with drugs and making them clever so that they release their cargo in a controlled and precise way: it’s quite a technical challenge,” adds Welland. “The main reason I’m interested in the challenge is I want to see something working in the clinic – I want to see something working in patients.”

Could nanotechnology move beyond therapeutics to a time when nanomachines keep us healthy by patrolling, monitoring and repairing the body? 

Nanomachines have long been a dream of scientists and public alike. But working out how to make them move has meant they’ve remained in the realm of science fiction.

But last year, Professor Jeremy Baumberg and colleagues in Cambridge and the University of Bath developed the world’s tiniest engine – just a few billionths of a metre in size. It’s biocompatible, cost-effective to manufacture, fast to respond and energy efficient.

The forces exerted by these ‘ANTs’ (for ‘actuating nano-transducers’) are nearly a hundred times larger than those for any known device, motor or muscle. To make them, tiny charged particles of gold, bound together with a temperature-responsive polymer gel, are heated with a laser. As the polymer coatings expel water from the gel and collapse, a large amount of elastic energy is stored in a fraction of a second. On cooling, the particles spring apart and release energy.

The researchers hope to use this ability of ANTs to produce very large forces relative to their weight to develop three-dimensional machines that swim, have pumps that take on fluid to sense the environment and are small enough to move around our bloodstream.

Working with Cambridge Enterprise, the University’s commercialisation arm, the team in Cambridge's Nanophotonics Centre hopes to commercialise the technology for microfluidics bio-applications. The work is funded by the Engineering and Physical Sciences Research Council and the European Research Council.

“There’s a revolution happening in personalised healthcare, and for that we need sensors not just on the outside but on the inside,” explains Baumberg, who leads an interdisciplinary Strategic Research Network and Doctoral Training Centre focused on nanoscience and nanotechnology.

“Nanoscience is driving this. We are now building technology that allows us to even imagine these futures.” 

Read more about research on future therapeutics in Research Horizons magazine. 

Nanotechnology is creating new opportunities for fighting disease – from delivering drugs in smart packaging to nanobots powered by the world’s tiniest engines. 

Artist's impression of a nanobot


World War II bombing associated with resilience, not ‘German Angst’


Germans have been stereotyped as being industrious and punctual, but also as being more likely to be anxious and worried, a phenomenon described as ‘German Angst’. Former German Chancellor Helmut Schmidt, widely regarded as one of Germany’s leading post-war intellectuals, once claimed, “The Germans have a tendency to be afraid. This has been part of their consciousness since the end of the Nazi period and the war”.

This personality type is characterised by high levels of neurotic personality traits (more likely to be in a negative emotional state), as opposed to traits of openness, agreeableness, extraversion, or conscientiousness, which together make up the ‘Big Five’ personality traits. It has been suggested that the heavy bombing of German cities in World War II, and the resulting destruction and trauma experienced by residents, may have been a contributory factor in this proposed higher incidence of neurotic traits.

In a study published this week in the European Journal of Personality, an international team of researchers from the UK, Germany, USA and Australia analysed the neurotic personality traits and mental health of over 33,500 individuals across 89 regional German cities that experienced wartime bombing, and investigated whether people in cities that experienced heavier bombing were more likely to display neurotic traits. The researchers measured neurotic traits using the Big Five Inventory personality test as part of an online questionnaire, and focused on measures of neuroticism, anxiety, and depression.

“If the idea of ‘German Angst’ is true, then we’d expect people from cities that were heavily bombed during the war to be more anxious and less resilient to new stresses such as economic hardship,” says study author Dr Jason Rentfrow from the Department of Psychology, University of Cambridge. “Ours is the first study to investigate this link.”

The researchers found that in fact, residents of heavily bombed cities were less likely to display neurotic traits, suggesting that wartime bombing is not a factor in German Angst. The results indicate that residents of heavily bombed German cities instead recorded higher levels of mental resilience and were better able to cope in times of stress.

“We’ve seen from other studies that when people experience difficulties in life, these can provide them with a broader perspective on things and perhaps make more trivial stresses seem unimportant,” explains Dr Rentfrow. “It’s possible that this is what we are seeing here.”

The researchers also looked at how Germany compared to 107 other countries for neurotic traits, to see whether there really was evidence of ‘German Angst’. They found that Germany ranks 20th, 31st, and 53rd for depression, anxiety, and neuroticism respectively. Additionally, other countries that have experienced significant trauma due to warfare, such as Japan, Afghanistan, and Vietnam, also did not score highly for neurotic traits, further suggesting that such traumatic events are not associated with increased neuroticism.

“Germany didn’t stand out as high in anything resembling angst compared with other countries, which suggests that maybe this stereotype of ‘German Angst’ isn’t entirely valid,” says Dr Rentfrow. “Clearly we need to be careful about national stereotypes.”

The researchers emphasise that their findings show only an association, and that this data does not show whether more severe bombing caused greater mental resilience, or whether other factors were at play.

Although this research may have implications for other war-torn countries, including the current situation in Syrian cities, the study did not investigate potential neuroticism or resilience in those countries, so no wider conclusions can be drawn from this data.

Study participants filled out online questionnaires provided by the global Gosling-Potter Internet Project, including 44 questions to assess their personality and mental state. Of the sample, just under 60% were female and the mean age was 30 years old. Almost all (96%) of the respondents were White/Caucasian while just under one in three (30%) had a bachelor’s degree or higher, and overall the sample was broadly representative of the populations of the cities assessed. Although the researchers tried to control for the movement of people between different cities, there were limitations with the data available from the online survey and so this movement may have affected the results.
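As a rough illustration of the kind of regional analysis described, one could compute the correlation between per-city bombing intensity and mean neuroticism score. The figures below are invented purely for this example (the study’s actual data covered over 33,500 respondents across 89 cities, with controls for migration between cities); a negative correlation corresponds to the study’s finding that residents of more heavily bombed cities scored lower on neurotic traits.

```python
# Toy sketch of a city-level association analysis. The data points are
# fabricated for illustration and do not come from the study.
from math import sqrt
from statistics import mean

# (bombing intensity index, mean Big Five neuroticism score) per city
cities = [(0.9, 2.8), (0.7, 2.9), (0.5, 3.1), (0.3, 3.2), (0.1, 3.4)]

def pearson_r(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    xs, ys = zip(*pairs)
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(cities)
print(round(r, 2))  # negative: heavier bombing, lower neuroticism in this toy data
```

A real analysis would, as the article notes, also have to control for confounders such as movement of residents between cities, and a correlation alone cannot establish causation.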

The data also could not tell whether increased resilience was associated with a recent event, or whether it was associated with an event from many years or even decades ago. However, there is broader literature to support the notion of traumas increasing resilience in individuals, and more research in this area would shed further light on the relationship and potential mechanisms at play. 

Experiencing traumatic events may be associated with greater mental resilience among residents rather than causing widespread angst, suggests a study published this week that investigated the effect of World War II bombing on the mental health of citizens in German cities.

A house in Darmstadt destroyed by an Allied bombing raid.


Opinion: Surprising ways to beat anxiety and become mentally strong – according to science


Do you have anxiety? Have you tried just about everything to get over it, but it just keeps coming back? Perhaps you thought you had got over it, only for the symptoms to return with a vengeance? Whatever your circumstances, science can help you to beat anxiety for good.

Anxiety can present as fear, restlessness, an inability to focus at work or school, finding it hard to fall or stay asleep at night, or getting easily irritated. In social situations, it can make it hard to talk to others; you might feel like you’re constantly being judged, or have symptoms such as stuttering, sweating, blushing or an upset stomach.

It can appear out of the blue as a panic attack, when sudden spikes of anxiety make you feel like you’re about to have a heart attack, go mad or lose control. Or it can be present all the time, as in generalised anxiety disorder, when diffuse and pervasive worry consumes you and you look to the future with dread.

Most people experience it at some point, but if anxiety starts interfering with your life, sleep, ability to form relationships, or productivity at work or school, you might have an anxiety disorder. Research shows that if it’s left untreated, anxiety can lead to depression, early death and suicide. And while it can indeed lead to such serious health consequences, the medication that is prescribed to treat anxiety doesn’t often work in the long-term. Symptoms often return and you’re back where you started.

How science can help

The way you cope or handle things in life has a direct impact on how much anxiety you experience – tweak the way you’re coping, therefore, and you can lower your anxiety levels. Here are some of the top coping skills that have emerged from our study at the University of Cambridge, which will be presented at the 30th European Congress of Neuropsychopharmacology in Paris, and other scientific research.

Do you feel like your life is out of control? Do you find it hard to make decisions – or get things started? Well, one way to overcome indecision or get going on that new project is to “do it badly”.

This may sound strange, but the writer and poet GK Chesterton said: “Anything worth doing is worth doing badly.” And he had a point. The reason this works so well is that it speeds up your decision-making process and catapults you straight into action. Otherwise, you could spend hours deciding how you should do something or what you should do, which can be very time-consuming and stressful.

People often want to do something “perfectly” or to wait for the “perfect time” before starting. But this can lead to procrastination, long delays or even prevent us from doing it at all. And that causes stress – and anxiety.

Instead, why not just start by “doing it badly”, without worrying about how it’s going to turn out. This will not only make it much easier to begin, but you’ll also find that you’re completing tasks much more quickly than before. More often than not, you’ll also discover that you’re not doing it that badly after all – and even if you are, you can always fine-tune it later.

Using “do it badly” as a motto gives you the courage to try new things, adds a little fun to everything, and stops you worrying too much about the outcome. It’s about doing it badly today and improving as you go. Ultimately, it’s about liberation.

Just jump right in … (Image: The National Guard via Flickr, CC BY)

Forgive yourself and ‘wait to worry’

Are you particularly critical of yourself and the blunders you make? Well, imagine if you had a friend who constantly pointed out everything that was wrong with you and your life. You’d probably want to get rid of them right away.

But people with anxiety often do this to themselves so frequently that they don’t even realise it anymore. They’re just not kind to themselves.

So perhaps it’s time to change and start forgiving ourselves for the mistakes we make. If you feel like you’ve embarrassed yourself in a situation, don’t criticise yourself – simply realise that you have this impulse to blame yourself, then drop the negative thought and redirect your attention back to the task at hand or whatever you were doing.

Another effective strategy is to “wait to worry”. If something went wrong and you feel compelled to worry (because you think you screwed up), don’t do this immediately. Instead, postpone your worry – set aside 10 minutes each day during which you can worry about anything.

If you do this, you’ll find that you won’t perceive the situation which triggered the initial anxiety to be as bothersome or worrisome when you come back to it later. And our thoughts actually decay very quickly if we don’t feed them with energy.

Find purpose in life by helping others

It’s also worth considering how much of your day is spent with someone else in mind. If it’s very little or none at all, then you’re at high risk of poor mental health. Regardless of how much we work or the amount of money we make, we can’t be truly happy until we know that someone else needs us and depends on our productivity or love.

This doesn’t mean that we need people’s praise, but doing something with someone else in mind takes the spotlight off us (and our anxieties and worries) and places it onto others – and how we can make a difference to them.

Being connected to people has regularly been shown to be one of the most potent buffers against poor mental health. The neurologist Viktor Frankl wrote: "For people who think there’s nothing to live for, nothing more to expect from life … the question is getting these people to realise that life is still expecting something from them."

Knowing that someone else needs you makes it easier to endure the toughest times. You’ll know the “why” for your existence and will be able to bear almost any “how”.

So how can you make yourself important in someone else’s life? It could be as simple as taking care of a child or elderly parent, volunteering, or finishing work that might benefit future generations. Even if these people never realise what you’ve done for them, it doesn’t matter, because you will know. And this will make you realise the uniqueness and importance of your life.

This article was originally published on The Conversation. Read the original article.

The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

Most people experience anxiety at some point in their lives, but for some it can be a crippling condition. Writing for The Conversation, Olivia Remes, a PhD candidate at the Cambridge Institute of Public Health, looks at what science tells us about beating the disorder.

Anxiety centred

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Drugs: how to pick a winner in clinical trials


“Did not meet primary endpoint.”

Prosaic words, but they can mean a billion dollar failure has just happened.

The average cost of taking a scientific discovery all the way through to a drug on a shelf is enormous – last year it was estimated at $2.6 billion by the Tufts Center for the Study of Drug Development.

One reason the figure is so high is that it also includes the cost of failure. Recent years have seen some very high-profile failures of drug candidates that either did not meet the ‘primary endpoint’ (they didn’t work) or had their trials halted owing to serious side effects.

“It’s only natural that some drugs will fail in clinical trials – the process exists to ensure that treatments are safe and effective for patients,” says Professor Ian Wilkinson, Director of the Cambridge Clinical Trials Unit (CCTU) on the Cambridge Biomedical Campus. “But what’s unexpected is the high number of drugs that fail in phase III. You’d think that by this stage the molecule would be a sufficiently good candidate to make it through.”

He explains that failures in phases I and II – when the drug is tested for safety and dosage in healthy volunteers and patients – are inevitable. However, a great many molecules don’t make it through phase III, the stage at which the drug’s effectiveness is tested in large numbers of patients before regulatory approval is given. In fact only 10–20% of drugs that enter phase I are ultimately licensed.
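Taken together, those figures explain the headline cost: the winners have to pay for the losers. As a back-of-the-envelope illustration – all per-phase costs and success rates below are hypothetical, chosen only so the overall phase I-to-licence rate lands in the 10–20% range quoted above – the expected spend per licensed drug can be sketched like this:

```python
# Hypothetical per-phase costs (in $m) and probabilities of advancing.
# These numbers are illustrative, not from the article or any real pipeline.
phases = [
    ("phase I",   25,  0.65),
    ("phase II",  60,  0.40),
    ("phase III", 500, 0.55),
]

p_reach = 1.0        # probability a molecule reaches the current phase
expected_cost = 0.0  # expected spend per molecule entering phase I
for name, cost, p_advance in phases:
    expected_cost += p_reach * cost  # a phase's cost is only paid if it is reached
    p_reach *= p_advance

print(f"overall success rate: {p_reach:.1%}")
print(f"expected spend per molecule entering phase I: ${expected_cost:.0f}m")
print(f"expected spend per licensed drug: ${expected_cost / p_reach:.0f}m")
```

Even with these generous odds, each licensed drug carries the cost of several failures – which is why an early indication of efficacy is so valuable.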

“The problem with failing at phase III is it’s very expensive – a single drug trial can cost around $500m.”

“There’s a human impact for the thousands of patients who enrolled on the trial. For patients with cancer, it’s sometimes their last available treatment option,” Wilkinson continues. “It’s also really unhelpful economically. Pharma companies have less money to put back into R&D, and it becomes even harder to fund drug development.”

This is why Wilkinson, along with a team of clinicians, scientists and pharmaceutical collaborators, together with statisticians at the Medical Research Council Biostatistics Unit, has been taking a hard look at the early phases of clinical trials. Their aim is to ask what can be done to get an early indication that a potential drug will make it to market.

“Traditionally, clinical trials have been organised to test safety first and efficacy last,” he explains. “It’s a cautious step-by-step approach adopted to ensure that pharma companies can satisfy regulators that the drug is safe.

“For many drugs this has worked well. But we are in a landscape where drug targets are more challenging – think for instance of conditions like psychiatric disorders and dementia. Leaving questions of whether a drug is effective to the final stages is now too risky and expensive.”

On any one day, the CCTU (one of the UK units accredited by the National Institute for Health Research) might be coordinating up to 20 trials in various phases for potential treatments for cancer, stroke, infections, dementia, heart attack, and so on.

Many of the trials are now designed with what Wilkinson calls “added value” built in at very early stages to give indications of whether the drug might work. This could include a biomarker that shows a drug for cirrhosis is reaching the liver, or a drug for heart disease is lowering cholesterol. “These are read-outs. They don’t show the drug works for the disease, but if the results are negative then there’s no point in progressing to later stages.”

The trials are also run ‘adaptively’. “We look at data for each person as it comes in… once we have enough information to guide us, we make a decision that might change the trial. It’s a quite different approach to the traditional rigidity of trials. It maximises the value of information a trial can yield.”
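What such an adaptive look at accumulating data can mean in practice is easiest to see in a toy example. The sketch below is not a CCTU design – the response rates, thresholds and stopping rule are all invented – but it shows the mechanics of a simple Bayesian futility monitor that reviews each patient’s outcome as it arrives:

```python
import random

random.seed(0)  # reproducible Monte Carlo draws

TARGET = 0.30    # response rate the drug must beat to be worth pursuing
FUTILITY = 0.05  # stop if P(rate > TARGET | data) falls below this

def monitor(outcomes, draws=20_000):
    """Review each patient's outcome as it arrives; return the patient
    index at which the trial stops for futility, or None if it completes."""
    successes = failures = 0
    for i, responded in enumerate(outcomes, start=1):
        successes += responded
        failures += 1 - responded
        # Posterior on the response rate is Beta(1+s, 1+f) under a flat
        # prior; estimate P(rate > TARGET) by sampling from it.
        hits = sum(
            random.betavariate(1 + successes, 1 + failures) > TARGET
            for _ in range(draws)
        )
        if i >= 10 and hits / draws < FUTILITY:  # no stopping before 10 patients
            return i
    return None

# A drug with a true response rate of only 10%: one responder in every ten.
outcomes = ([1] + [0] * 9) * 6
print("stopped for futility at patient", monitor(outcomes))
```

Run against a 30% target, the monitor halts this trial well before all 60 patients are enrolled – the kind of early decision that limits spend on a likely failure.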

In recent years, pharmaceutical companies like GSK and AstraZeneca (AZ) have championed the need for rigorous trial design to weed out likely failures earlier in the process.

GSK has its only trials unit in the UK in the same building as the CCTU. There, GSK researchers work alongside Cambridge clinicians and scientists on first-in-man studies. A more targeted approach to testing medicines in patients is a key component of a Strategic Partnership between GSK, the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust (CUH), which has the long-term ambition of jointly delivering new medicines to patients in the next five to ten years.

A few years ago, AZ analysed its drug pipeline before embarking on a major revision of its R&D strategy to increase the chance of successful transition to phase III and beyond. One area AZ identified as being crucial to success is to identify a causal relationship between target and disease. This might seem obvious but so-called mistaken causation has led to late failures right across the drugs industry. The usual cause is confounding – where a factor that does not itself cause a disease is associated with factors that do increase disease risk.

Professor John Danesh and colleagues at the Department of Public Health and Primary Care have pioneered a new way of finding evidence for causality before a patient is ever involved. Called ‘Mendelian randomisation’, it’s akin to a trial carried out by nature itself.

“Misinterpreting correlation as causation is a big problem,” explains Dr James Peters, who works with Danesh. “An increase in a protein biomarker in patients with atherosclerosis might suggest it’s important in the disease, but it’s not a valid drug target unless it plays a causal role. The conventional way to test this is to block the protein with a drug in a clinical trial, which is expensive, time-consuming and not always ethical.

“In phase III trials, the randomisation of participants helps to average out all differences apart from whether they are receiving the drug. Instead, we take advantage of the natural randomisation of genetic variants that occurs during reproduction.”

Some genetic variants can increase or decrease certain proteins that have been linked to a disease. If these variants can be identified – by computationally analysing enormous genetic datasets – then researchers can compare groups of people to see whether having the variant also increases the risk of a disease.
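A toy version of that comparison makes the logic concrete. The snippet below simulates entirely synthetic data (the numbers are invented, not from the study): a variant raises a protein biomarker, the biomarker causally raises disease liability, and the Wald ratio – the variant’s effect on the disease divided by its effect on the biomarker – recovers the causal effect without any drug being administered:

```python
import random
import statistics as st

random.seed(1)

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = st.fmean(x), st.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

n = 50_000
g = [random.choice([0, 0, 1, 1, 2]) for _ in range(n)]  # copies of the variant
protein = [0.5 * gi + random.gauss(0, 1) for gi in g]   # variant raises the biomarker
# Causal scenario: disease liability depends on the protein itself
disease = [0.8 * p + random.gauss(0, 1) for p in protein]

beta_gp = slope(g, protein)  # variant -> biomarker
beta_gd = slope(g, disease)  # variant -> disease
# The ratio should land near the true causal effect of 0.8
print(f"causal effect estimate (Wald ratio): {beta_gd / beta_gp:.2f}")
```

If the protein were merely correlated with disease through a confounder rather than causal, the variant would still shift the biomarker but not the disease, and the ratio would collapse towards zero – the genetic signature of an invalid drug target.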

The team has used this method to look retrospectively at why two phase III trials for a potential cardiovascular drug failed. “The genetic evidence showed that the drug target was not valid,” says Peters. “We would have advised against taking this drug to a clinical trial.”

But it’s not just about predicting failures: Danesh’s team is also picking winners. Evidence for the role of an inflammatory protein in atherosclerosis has now resulted in a clinical trial to see if an arthritis drug can be repurposed for atherosclerosis.

The researchers are helping industrial collaborators to prioritise potential drug targets and predict side effects. They also hope to expand their capabilities to test large numbers of variants for different potential targets in an automated fashion – a high-throughput approach to therapeutic target prioritisation.

Meanwhile, Wilkinson is planning ahead to avoid a different type of limitation: expertise. “There is a lack of individuals trained to design and deliver innovative clinical trials, and this is now impacting on drug development,” he explains.

Last year, an Experimental Medicine Training Initiative was launched to train medics how to run innovative clinical trials. Wilkinson is its Director and it’s supported by the University in partnership with CUH, Cambridge Biomedical Research Centre, and AZ/MedImmune and GSK.

“We all believe that the failure rate for drug candidates making it through phase III is unacceptably high,” he says. “Less than one in a thousand molecules discovered in the lab make it through to being a drug. We want to be sure that we can answer the billion dollar question of which are most likely to be winners.”

Read more about research on future therapeutics in Research Horizons magazine. 

When a drug fails late on in clinical trials it’s a major setback for launching new medicines. It can cost millions, even billions, of research and development funds. Now, an ‘adaptive’ approach to clinical trials and a genetic tool for predicting success are increasing the odds of picking a winner. 

We all believe that the failure rate for drug candidates making it through phase III is unacceptably high. We want to be sure that we can answer the billion dollar question of which are most likely to be winners.
Ian Wilkinson
Medication


Study reveals mysterious equality with which grains pack it in


At the moment they come together, the individual grains in materials like sand and snow appear to have exactly the same probability of combining into any one of their many billions of possible arrangements, researchers have shown.

The finding, by an international team of academics at the University of Cambridge, UK, and Brandeis University in the US, appears to confirm a decades-old mathematical theory which has never been proven, but provides the basis for better understanding granular materials – one of the most industrially significant classes of material on the planet.

A granular material is anything that comprises solid particles that can be seen individually with the naked eye. Examples include sand, gravel, snow, coal, coffee, and rice.

If correct, the theory demonstrated in the new study points to a fact of remarkable – and rather mysterious – mathematical symmetry. It means, for example, that every single possible arrangement of the grains of sand within a sand dune is exactly as probable as any other.

The study was led by Stefano Martiniani, who is based at New York University but undertook the research while completing his PhD at St John’s College, University of Cambridge.

“Granular materials are so widely-used that understanding their physics is very important,” Martiniani said. “This theory gives us a very simple and elegant way to describe their behaviour. Clearly, something very special is happening in their physics at the moment when grains pack together in this way.”

The conjecture that Martiniani tested was first proposed in 1989 by the Cambridge physicist Sir Sam F. Edwards, in an effort to better understand the physical properties of granular materials.

Globally, these are the second-most processed type of material in industry (after water) and staples of sectors such as energy, food and pharmaceuticals. In the natural world, vast granular assemblies, such as sand dunes, interact directly with wind, water and vegetation. Yet the physical laws that determine how they behave in different conditions are still poorly understood. Sand, for example, behaves like a solid when jammed together, but flows like a liquid when loose.

Understanding more about the mechanics of granular materials is of huge practical importance. When they jam during industrial processing, for example, it can cause significant disruption and damage. Equally, the potential for granular materials to “unjam” can be disastrous, such as when soil or snow suddenly loosens, causing a landslide or avalanche.

At the heart of Edwards’ proposal was a simple hypothesis: If one does not explicitly add a bias when preparing a jammed packing of granular materials – for example by pouring sand into a container – then any possible arrangement of the grains within a certain volume will occur with the same probability.

This is the analogue of the assumption that is at the heart of equilibrium statistical mechanics – that all states with the same energy occur with equal probability. As a result the Edwards hypothesis offered a way for researchers to develop a statistical mechanics framework for granular materials, which has been an area of intense activity in the last couple of decades.
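Written out, the correspondence is direct. The notation below is the standard textbook rendering of Edwards’ proposal (with volume playing the role of energy), rather than symbols taken from the paper itself:

```latex
% Equilibrium (microcanonical) ensemble: every microstate of energy E
% is equally likely, and entropy counts them
P_{\mathrm{eq}}(\text{microstate}) = \frac{1}{\Omega(E)},
\qquad S(E) = k_B \ln \Omega(E)

% Edwards' analogue: every jammed packing of N grains in volume V
% is equally likely, with \lambda playing the role of Boltzmann's constant
P_{\mathrm{Edw}}(\text{packing}) = \frac{1}{\Omega(N, V)},
\qquad S(N, V) = \lambda \ln \Omega(N, V)
```

Testing the conjecture then amounts to checking whether the measured probabilities of individual packings really are flat at $1/\Omega$ – which is what the study did numerically at the jamming transition.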

But the hypothesis was impossible to test – not least because above a handful of grains, the number of possible arrangements becomes unfathomably huge. Edwards himself died in 2015, with his theory still the subject of heated scientific debate.

Now, Martiniani and colleagues have been able to put his conjecture to a direct test, and to their surprise they found that it broadly holds true. Provided that the grains are at the point where they have just jammed together (or are just about to separate), all possible configurations are indeed equally likely.

Helpfully, this critical point – known as the jamming transition – is also the point of practical significance for many of the granular materials used in industry. Although Martiniani modelled a system comprising soft spheres, a bit like sponge tennis balls, many granular materials are hard grains that cannot be compressed further once in a packed state.

“Apart from being a very beautiful theory, this study gives us the confidence that Edwards’ framework was correct,” Martiniani said. “That means that we can use it as a lens through which to look at a whole range of related problems.”

Aside from informing existing processes that involve granular materials, there is a wider significance to better understanding their mechanics. In physics, a “system” is anything that involves discrete particles operating as part of a wider network. Although bigger in scale, the way in which icebergs function as part of an ice floe, or the way that individual vehicles move within a flow of traffic (and indeed sometimes jam), can be studied using a similar theoretical basis.

Martiniani’s study was undertaken during his PhD, while he was a Gates Scholar, under the supervision of Professor Daan Frenkel from the Department of Chemistry. It built on earlier research in which he developed new methods for calculating the probability of granular systems packing into different configurations, despite the vast numbers involved. In work published last year, for example, he and colleagues used computer modelling to work out how many ways a system containing 128 tennis balls could potentially be arranged. The answer turned out to be ten unquadragintilliard – a number so huge that it vastly exceeds the total number of particles in the universe.

In the new study, the researchers employed a sampling technique which attempts to compute the probability of different arrangements of grains without actually looking at the frequency with which these arrangements occur. Rather than taking an average from random samples, the method involves calculating the limits of the possibility of specific arrangements, and then calculating the overall probability from this.

The team applied this to a computer model of 64 soft spheres - an imaginary system which could therefore be “over-compressed” after reaching the jamming transition point. In an over-compressed state, the different arrangements were found to have different probabilities of occurrence. But as the system decompressed to the point of the jamming transition, at which the grains were effectively just touching, the researchers found that all probabilities became equal – exactly as Edwards predicted.

“In 1989, we didn’t really have the means of studying whether Edwards was right or not,” Martiniani added. “Now that we do, we can understand more about how granular materials work; how they flow, why they get stuck, and how we can use and manage them better in a whole range of different situations.”

The study, Numerical test of the Edwards conjecture shows that all packings become equally probable at jamming is published in the journal Nature Physics. DOI: 10.1038/nphys4168.

For the first time, researchers have been able to test a theory explaining the physics of how substances like sand and gravel pack together, helping them to understand more about some of the most industrially-processed materials on the planet.

Granular materials are so widely-used that understanding their physics is very important. Clearly, something very special is happening at the moment when grains pack together in this way.
Stefano Martiniani
A huge range of materials are classified as granular – including sand, gravel, snow, nuts, coal, rice, barley, coffee and cereals. Globally, they are the second-most processed type of material in industry, after water.


Cambridge museums recognised with substantial Arts Council England funding


Kettle’s Yard, which works in partnership with UCM, has also been awarded £1,163,028 as part of ACE’s National Portfolio, further enhancing the role of the University’s Museums and Botanic Garden as the largest cultural provider in the region.

Today’s announcement will enable the University museums and collections to continue their mission of connecting more people with world-class collections of more than eight million objects, reaching new audiences who may face barriers to enjoying and participating in the museums, and facilitating and sharing exceptional international research for both Cambridge academics and a global community of researchers and scholars.

The total economic impact of Cambridge University’s Museums and Botanic Garden was estimated to be at least £16m in 2015-16. The collections contribute to major academic studies at home and abroad, and last year the museums and Botanic Garden welcomed nearly one million visitors through their doors.

Professor Eilis Ferran, Pro-Vice-Chancellor for Institutional and International Relations at Cambridge, said: “This significant announcement from Arts Council England recognises the important role that the University’s Museums and Botanic Garden play in research, learning, understanding and enjoyment. The University of Cambridge Museums exist for all of us, and are open free to the public, and this funding will enable them to continue to share this resource for the future.

“Audience-focused programmes like India Unboxed are a perfect example of our museums’ engagement with communities not just at home, but many thousands of miles away, too. We are working with Indian designers and artists to enable more people to experience our outstanding collections from India, marking the UK-India Year of Culture 2017.”

Tim Knox, Fitzwilliam Museum Director and Chair of the University of Cambridge Museums Steering Group, said: “National Portfolio status for both the UCM and Kettle’s Yard is a huge privilege and one we take very seriously. Arts Council England has fully supported the huge changes and development we have made in bringing the work of the museums and the Botanic Garden together, and with their continued support there is plenty of exciting work yet to be done.”

UCM Museums Officer Liz Hide said: “The eight University of Cambridge Museums and Botanic Garden represent the country’s highest concentration of internationally important collections outside London. Since 2012, Arts Council funding has transformed the way the museums can work together to open up their collections for everyone. We are delighted that Arts Council England recognises the positive impacts through their continuing support.”

Arts Council funding will enable the UCM to continue its work reaching out to public audiences and bringing the University’s exceptional collections and research to many more people, through popular programmes such as Summer at the Museums and a huge range of partnership work with schools, charities, community groups and other cultural organisations across Cambridgeshire and the wider region.

Through temporary and permanent exhibitions, visitors can explore paintings by Titian, Monet and Picasso, biological and geological specimens collected by Charles Darwin, Isaac Newton’s notebooks, Captain Scott’s last letters, early hominid tools discovered in East Africa by Louis Leakey, rare material from the now-extinct dodo, local archaeological and natural history collections, one of the world’s finest collections of casts of Greek and Roman sculpture, and meteorites and moon rock from beyond our planet.

Both Kettle’s Yard and the Museum of Zoology are currently undergoing major redevelopments and will reopen in the coming year with substantially improved exhibitions, visitor facilities and resources, demonstrating the University’s ongoing commitment to improve and increase access to its collections.

Recent innovations and successes for UCM have included Cam Lates – alternative, after-hours events at the museums featuring improv comedy, music and film screenings; Twilight at the Museums, where families explore the museums by torchlight after hours; and the launch of India Unboxed, a wide-ranging programme of exhibitions, events and a film series.

Cambridge’s reputation as a centre of excellence for museums and culture in the UK received a vital boost today when Arts Council England (ACE) awarded University of Cambridge Museums (UCM) more than £4.8m and National Portfolio Organisation status from 2018-2022.

The University of Cambridge Museums exist for all of us.
Eilis Ferran
Twilight at the Museums, The Fitzwilliam Museum


Milner Therapeutics Institute: a drug discovery ecosystem


Professor Tony Kouzarides is the founding Director of the Milner Therapeutics Institute, which is due to open in 2018 on the Cambridge Biomedical Campus. The ecosystem he sees thriving within its walls is one in which academic researchers (“experts in the biology of diseases”) work closely with pharmaceutical companies (“who know what’s needed to get the drug to clinic”) to find new medicines. Put simply, he says, the Institute will be “a pipeline for drug discovery within an academic setting.”

While the labs are being fitted out with robotics for customised drug screening, gene-editing facilities to rewrite DNA and bioinformatics support to help scientists deal with huge datasets, the partnerships between industry and academia are already under way.

In June 2015, a research agreement was signed between the University of Cambridge, the Wellcome Trust Sanger Institute and the Babraham Institute with three pharmaceutical companies – AstraZeneca (AZ), Astex and GSK. Since then, Pfizer, Shionogi and Elysium Pharmaceuticals have joined the Milner Therapeutics Consortium, the outreach programme of the Institute.

With this one agreement, doors opened. Dr Kathryn Chapman, Executive Manager of the Milner Therapeutics Institute, explains: “Forming the Consortium means there’s now a free exchange of potential drug molecules between pharma and academia. This sounds straightforward but, before the agreement, this could take a year because of confidentiality and material transfer contracts. Now it takes two to three weeks. It lowers barriers of engagement, it speeds up research and it can involve hundreds of molecules in one go.”

One consequence is that drugs already approved for use in certain diseases are now being tested for use in other diseases – a practice called repositioning or repurposing.

“An academic might have developed a brain disease model using an organoid – a mini organ in a Petri dish,” explains Kouzarides. “We can use this to test drugs that have been licensed for use in other diseases such as arthritis or cancer.”

It also means that novel therapeutic agents across the entire portfolio of drugs being developed by each of the companies can be screened at an early stage in biological assays, to see whether any are worth progressing along the drug development pipeline.

For example, one of the Consortium’s first collaborative projects is a partnership between AZ and Professor Carlos Caldas at the Cancer Research UK Cambridge Institute.

Breast cancer consists of several different genomic subtypes, which makes effective treatment challenging and prognosis variable. Some subtypes respond well to particular drugs or drug combinations whereas others are resistant. Caldas has pioneered the development of a biobank of patient-derived breast cancer cells and tissues that have greater predictive power for clinical outcome than other preclinical models (such as cancer cell lines). Caldas and AZ are now working together to test how different subtypes of breast cancer respond to different AZ compounds and compound combinations, as well as looking at potential drug-resistance mechanisms.

From 2018, the Consortium will form a major part of the Milner Therapeutics Institute, which has been made possible through a £5m donation from Dr Jonathan Milner, a former member of Kouzarides’ research group and entrepreneur. Milner and Kouzarides are two of the founders of leading Cambridge biotechnology company Abcam.

“One of the main aims of the Institute will be to develop multiple disease models to understand how drugs could work on the real disease,” explains Kouzarides. “We plan to focus on some of the most challenging diseases to start with – cancer, neurodegeneration and inflammation – but we are disease agnostic. If we have a method of testing for efficacy and a library of molecules to test, then we’ll test!”

Kouzarides’ enthusiasm for making sure the ‘Petri-dish-to-pill’ pipeline works comes from his own positive experience of a collaboration with GSK that has resulted in a leukaemia drug now being used in the clinic to treat patients.

It came about through serendipity. “GSK was developing a molecule called I-BET against an epigenetic protein. I was a consultant on the project and became aware that the molecule could be effective against mixed lineage leukaemia (MLL), the most common type of leukaemia in children under two years old. We had the cell assays and disease models in Cambridge, and we asked to test the drug. It worked and it’s now in the clinic.

“I started to wonder why this pharma–academia collaboration doesn’t happen more often. People have been talking about the translational gap between fundamental research and the clinic for years, and it’s still there. While serendipity is good – and many amazing medical innovations have come out of chance encounters – we can’t trust only to chance.

“The world needs new medicines to be developed. It’s time-consuming and costly, and that’s why we need an ecosystem that will nurture and speed up the success.”

The Milner Institute will be within the Capella building at the Cambridge Biomedical Campus, alongside the relocated Wellcome Trust/MRC Cambridge Stem Cell Institute, the Cambridge Institute of Therapeutic Immunology and Infectious Disease, and The Cambridge Centre for Haematopoiesis and Haematological Malignancies.

Tony Kouzarides is passionate about ecosystems: well-balanced communities that flourish on mutual and dynamic interactions. But the ecosystems that excite him are not made up of plants, animals and environments. They’re made up of experts.

The world needs new medicines to be developed. It’s time-consuming and costly, and that’s why we need an ecosystem that will nurture and speed up the success.
Tony Kouzarides


‘Bulges’ in volcanoes could be used to predict eruptions


Using a technique called ‘seismic noise interferometry’ combined with geophysical measurements, the researchers measured the energy moving through a volcano. They found that there is a good correlation between the speed at which the energy travelled and the amount of bulging and shrinking observed in the rock. The technique could be used to predict more accurately when a volcano will erupt. Their results are reported in the journal Science Advances.

Data was collected by the US Geological Survey across Kīlauea in Hawaii, a very active volcano with a lake of bubbling lava just beneath its summit. During a four-year period, the researchers used sensors to measure relative changes in the velocity of seismic waves moving through the volcano over time. They then compared their results with a second set of data which measured tiny changes in the angle of the volcano over the same time period.

As Kīlauea is such an active volcano, it is constantly bulging and shrinking as pressure in the magma chamber beneath the summit increases and decreases. Kīlauea’s current eruption started in 1983, and it spews and sputters lava almost constantly. Earlier this year, a large part of the volcano fell away, opening up a huge ‘waterfall’ of lava into the ocean below. Due to this high volume of activity, Kīlauea is also one of the most-studied volcanoes on Earth.

The Cambridge researchers used seismic noise to detect what was controlling Kīlauea’s movement. Seismic noise is a persistent low-level vibration in the Earth, caused by everything from earthquakes to waves in the ocean, and can often be read on a single sensor as random noise. But by pairing sensors together, the researchers were able to observe energy passing between the two, therefore allowing them to isolate the seismic noise that was coming from the volcano.

“We were interested in how the energy travelling between the sensors changes, whether it’s getting faster or slower,” said Clare Donaldson, a PhD student in Cambridge’s Department of Earth Sciences, and the paper’s first author. “We want to know whether the seismic velocity changes reflect increasing pressure in the volcano, as volcanoes bulge out before an eruption. This is crucial for eruption forecasting.”

One to two kilometres below Kīlauea’s lava lake, there is a reservoir of magma. As the amount of magma changes in this underground reservoir, the whole summit of the volcano bulges and shrinks. At the same time, the seismic velocity changes. As the magma chamber fills up, it causes an increase in pressure, which leads to cracks closing in the surrounding rock and producing faster seismic waves – and vice versa.
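The workflow described above can be sketched in a few lines of NumPy. This is an illustrative toy only, not the study's actual processing (which involves filtering, windowing and much longer records): cross-correlate ambient noise from a pair of sensors to approximate the wave travelling between them, then estimate the relative velocity change dv/v with a simple stretching method.

```python
import numpy as np

def cross_correlate(a, b):
    # Normalise and cross-correlate two noise records; the result
    # approximates the energy travelling between the sensor pair.
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full") / len(a)

def stretching_dvv(reference, current, max_eps=0.01, n_trials=101):
    # Stretching method: resample the reference correlation by a factor
    # (1 + eps) and find the eps that best matches the current one.
    # A velocity increase compresses arrival times, so the best-fitting
    # eps is an estimate of dv/v.
    t = np.arange(len(reference), dtype=float)
    best_eps, best_cc = 0.0, -np.inf
    for eps in np.linspace(-max_eps, max_eps, n_trials):
        stretched = np.interp(t * (1 + eps), t, reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

# Synthetic check: build a reference correlation from random noise, then
# fake a +0.5% velocity increase by compressing the time axis.
rng = np.random.default_rng(0)
reference = cross_correlate(rng.standard_normal(2000),
                            rng.standard_normal(2000))
t = np.arange(len(reference), dtype=float)
current = np.interp(t * 1.005, t, reference)
dvv, cc = stretching_dvv(reference, current)
```

In the study itself, a time series of dv/v estimates like this was compared against geodetic measurements of the summit's bulging and shrinking over four years.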

“This is the first time that we’ve been able to compare seismic noise with deformation over such a long period, and the strong correlation between the two shows that this could be a new way of predicting volcanic eruptions,” said Donaldson.

Volcano seismology has traditionally measured small earthquakes at volcanoes. When magma moves underground, it often sets off tiny earthquakes, as it cracks its way through solid rock. Detecting these earthquakes is therefore very useful for eruption prediction. But sometimes magma can flow silently, through pre-existing pathways, and no earthquakes may occur. This new technique will still detect the changes caused by the magma flow.

Seismic noise occurs continuously, and is sensitive to changes that would otherwise have been missed. The researchers anticipate that this new research will allow the method to be used at the hundreds of active volcanoes around the world.

Reference
C. Donaldson et al. ‘Relative seismic velocity variations correlate with deformation at Kīlauea volcano’. Science Advances (2017) DOI: 10.1126/sciadv.1700219 

Inset image: Lava Waterfall, Kilauea Volcano, Hawaii. Credit: Dhilung Kirat

A team of researchers from the University of Cambridge have developed a new way of measuring the pressure inside volcanoes, and found that it can be a reliable indicator of future eruptions.

This could be a new way of predicting volcanic eruptions.
Clare Donaldson
Kīlauea


‘France’s Samuel Pepys’ is elevated from the footnotes of history


Pierre de L’Estoile has been described as France’s Samuel Pepys. Like Pepys, he lived in singularly interesting times. Like Pepys, he documented in his journals both the inner world of his household and the wider world of politics and gossip. He also compiled scrapbooks of the partisan and often scurrilous broadsheets that circulated in his Paris neighbourhood as rival factions in a religious upheaval sought to discredit each other as heretics, unbelievers or opportunists.

Born in 1546, L’Estoile lived through the eight civil wars that became known as France’s Wars of Religion, a succession of violent struggles between Catholics and Protestants that drew in all of the great powers of Reformation Europe. The troubles began with the massacre of Protestants worshipping at the village of Vassy on 1 March 1562 and the advance of the Protestant armies that followed. Ten years later, thousands of Protestants lay dead on the streets of Paris, slaughtered by Catholic militia in the infamous St Bartholomew’s Day massacre. Only with the Edict of Nantes in 1598 did rival parties unite behind the Catholic convert, King Henri IV, and establish a fragile peace.

Edited extracts of L’Estoile’s journals were published following his death. Ever since, historians have scoured his records for unparalleled first-hand accounts of events that range from political scandals to everyday criminality and wondrous portents in the sky. Meanwhile, the diarist himself has appeared solely in the footnotes of histories devoted to the period. In Pierre de L’Estoile and his World in the Wars of Religion, historian Tom Hamilton places this remarkable Parisian centre stage in a narrative that sheds new light on a fascinating and turbulent period of French history.

The Wars of Religion are conventionally framed as a conflict sharply divided between Catholics and Protestants, and driven by the political ambition of powerful noble families. Hamilton’s meticulous archival research into L’Estoile’s life as head of a large household (his two marriages produced 18 children, nine of whom survived to maturity), and holder of important positions in the royal bureaucracy, looks instead at the impact of the civil wars on everyday life. He reveals a society marked by ambiguous religious allegiances and conflicting political solidarities that could split families apart.

Hamilton’s book, the first in any language to concentrate on L’Estoile, examines a life that is both ordinary and extraordinary. He uses the diarist’s writing and collecting to rethink the complex flavours and textures of a world disrupted by religious and political conflict.

L’Estoile was born into a wealthy Catholic family of royal office-holders. Yet his father chose a Protestant scholar as tutor and protector for his young son. The diarist recalled that his father’s instructions, given on his deathbed, were that his son (“one of the most precious gifts that God has given me”) was to be raised a pious and god-fearing Catholic. Tellingly, however, the boy was not to be nourished in “the abuses and superstitions of the Church”.

Years later, L’Estoile was to pass on his father’s moderate religious legacy to his own children, only to see his eldest son rebel and join the armies of the zealous Catholic League, dying tragically less than a year after reconciling with his family.

The religious choice of the L’Estoile family, argues Hamilton, offers a new perspective on the history of Catholicism. It reveals how French (Gallican) Catholics could be familiar with Protestants, who were critical of Rome and what L’Estoile called “the rotten trunk of the papacy”, and at the same time give their allegiance to the “most Christian king” of France as the head of a national church, a distinctively French branch of Catholicism.

Social status gave L’Estoile privileged access to information and a means to sustain his family during the day-to-day struggle of life through the civil wars. Not only was L’Estoile a landowner (although he seldom visited his estates and mishandled his financial affairs), he also held positions as a royal secretary and officer in the Paris Chancery. During the final civil war, L’Estoile was privy to seditious correspondence between factions, and to add to his collection he “copied it at that very moment on one of the desks in the Chancery”. He also had a role in print licensing and got to know the printers of the rue Saint Jacques. Many became friends for life and one, at least, “printed nothing, however secret, about which he did not inform me”.

L’Estoile’s house in the neighbourhood of Saint-André-des-Arts, on the Left Bank of the River Seine, was just a few minutes’ walk from his workplace on the Île de la Cité. He lived surrounded by family and colleagues, a courtyard away from his mother. His house had been the scene of the murder of a previous occupant: he purchased it for a knockdown sum and never mentioned its grim history in his diaries. An inventory of L’Estoile’s worldly goods made on his death offers clues about the man he was. He owned few clothes, and those he did possess were shabby. Neither was there anything exceptional about the family’s accoutrements, although he differed from his neighbours in not displaying devotional paintings of the Virgin Mary or saints.

Located at the top of the house, L’Estoile’s study and cabinet of curiosities tell another story.  In a space out of bounds to the rest of his family, L’Estoile amassed one of the largest libraries and painting collections in Paris. Here he wrote and edited his journals, organised his collections, and met with scholarly friends. In this respect he engaged with men of his class across Europe caught up in the growing mania for collecting. Some – including English and German diplomats – went out of their way to visit him.

While other collectors gathered learned manuscripts or items of natural history to prove their erudition, L’Estoile collected printed ephemera which he had made into volumes he called his ‘drolleries’ (trifles), mocking their risible exaggerations. The scale of these drolleries was far from trifling. By 1589, L’Estoile had collected more than 500 different publications that documented the twists and turns of the civil wars. At his death in 1611, his collection had swelled to contain thousands of volumes.

Meanwhile, L’Estoile’s journals offer a glimpse of some of the terrible privations suffered by poor Parisians as food supplies were disrupted and prices escalated. During the Siege of Paris in 1590, when the French royal army surrounded a city taken over by the Catholic League, he describes taking a walk with two relatives and spotting a desperate woman eating the skin of a dog. So shocked are the men by this sight that L’Estoile’s brother-in-law undertakes to record it himself, lest L’Estoile’s account later be dismissed as fabricated.

Remaining in Paris during a period when many moderate Catholics sought safety elsewhere, L’Estoile put his own life on the line. In 1591 the Catholic League drew up lists of people known to oppose its views, marking their names with the letters P for pendu (hanged), D for dagué (knifed) or C for chassé (exiled). L’Estoile was shown this list and saw his name marked D and those of several of his relatives marked P.  Fortunately for L’Estoile and family, the League’s soldiers refused to carry out its orders.

L’Estoile’s political affiliations were avowedly royalist. He believed that only a strong king could bring peace. In his collecting, however, he amassed literature of every political hue, confessing that his collecting impulse overrode any sense of prudence. At a time of intermittent print censorship, people caught possessing or disseminating defamatory material were in danger of execution. L’Estoile himself wrote of anti-royalist prints in his collection that he “should have thrown them into the fire, as they deserved”.

Fortuitously for future historians, L’Estoile was stubbornly fixated on collecting. As a Chancery official charged with book licensing, and a cousin of the Parisian criminal lieutenant in charge of the book burnings, he claimed he was keeping exemplary copies for posterity, to preserve memories of the wickedness and confusion of his times.

Tom Hamilton is a Junior Research Fellow at Trinity College, Cambridge. Pierre de L’Estoile and his World in the Wars of Religion is published by Oxford University Press.

The journals and scrapbooks of Pierre de L’Estoile have for generations provided a vivid picture of France in a time of religious upheaval. Now Cambridge historian Tom Hamilton has written the first book devoted to the life of L’Estoile as a diarist, collector and man about town. 

Hamilton’s book examines a life both ordinary and extraordinary. He uses the diarist’s writing and collecting to rethink the complex flavours and textures of a world disrupted by religious and political conflict.
Painting of the St Bartholomew's Day Massacre


Tree rings pinpoint eruption of Icelandic volcano to half a century before human settlement


The team, which included volcanologists, climatologists, geographers and historians among others, used a combination of scientific and historical evidence to pinpoint the eruption date of the Katla volcano between late 822 CE and early 823 CE, decades before the earliest settlers arrived. Their results are reported in the journal Geology.

In a similar way to how fossils can be used to understand the development and evolution of life on Earth, different types of environmental evidence can be used to understand what the Earth’s climate was like in the past and why. The ‘fingerprints’ contained in tree rings and ice cores help scientists to estimate past climatic conditions and extend our understanding of the interaction between humans and the environment hundreds and thousands of years back in time.

“In our work, we’re trying to reconstruct past natural temperature and precipitation variability from tree rings – trying to reveal when it was cold and wet or warm and dry for instance,” said Professor Ulf Büntgen of Cambridge’s Department of Geography, the paper’s lead author. “We’re also interested in detecting and understanding key drivers of the Earth’s climate dynamics and their possible linkages with changes in human history.”

Currently, Iceland is for the most part treeless. However, before the first permanent settlers arrived in the late 9th century, it was most likely covered by extensive woodland. Early settlers harvested most of the trees they found on the island to establish an agriculture-based society, and the trees never recovered.

In 2003, a spring flood of the Thverá River exposed hundreds of birch trees which had been buried for centuries beneath layers of volcanic sediment. The so-called Drumbabót forest is the best-preserved prehistoric forest in Iceland, and had been buried by an eruption of the nearby Katla volcano, Iceland’s most active volcanic system.

Volcanic eruptions are often responsible for an abrupt period of cooling, but only with a precise date of eruption can researchers definitively account for the variability in climate. Büntgen, who uses the information locked within tree rings to reconstruct past climate conditions, used the trees exposed by the 2003 flood to pinpoint when this particular eruption took place.

The team behind the current work have previously confirmed that in 775 CE, a large solar flare caused a spike in radiocarbon levels in the Earth’s atmosphere, which would be stored in the wood of trees that were alive at the time. By measuring the radiocarbon levels in one of the Drumbabót trees, Büntgen and his colleagues were able to pinpoint the year 775 in the tree rings, and measure outward to the bark to count the number of years to the Katla eruption, when the tree died. The outermost tree ring had completely formed and a new one had not yet started, meaning that the eruption occurred after autumn 822 and before spring 823, before the next year’s growth had begun. Iceland was not settled until around 870, so this particular forest was destroyed almost half a century before humans arrived.
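The ring-counting arithmetic is simple enough to sketch. The ring indices below are hypothetical, invented for illustration; only the 775 CE radiocarbon anchor and the 822 CE result come from the study.

```python
SPIKE_YEAR = 775  # CE: the solar-flare radiocarbon spike anchors this ring

def date_outer_ring(spike_ring_index, n_rings_total):
    # Rings are indexed 0 (pith) to n_rings_total - 1 (bark); the
    # outermost ring formed in the year the tree died.
    rings_after_spike = (n_rings_total - 1) - spike_ring_index
    return SPIKE_YEAR + rings_after_spike

# Hypothetical counts: if the spike sat in ring 30 of a 78-ring
# sequence, the final ring would have formed 47 years later, in 822 CE.
death_year = date_outer_ring(30, 78)
```

Because the outermost ring was complete and no new ring had begun, the death year of 822 brackets the eruption between that autumn and the following spring.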

The unique tree ring results were then linked with those of co-authors Professors Christine Lane and Clive Oppenheimer, also from Cambridge’s Department of Geography. Lane and Oppenheimer used independent lines of ash (tephra) and ice core evidence to detect fingerprints of the Katla eruption.

In addition to the scientific results, the team also involved historians who analysed written documentary evidence from Europe and Asia, and found that there was a severe cold spell consistent with the timing of the reconstructed Katla eruption.

“It was a happy coincidence that we were able to use all these different archives and techniques to date this eruption,” said Büntgen. “Data and methods we are using are constantly getting better, and by building more links with the humanities, we can see the real effects volcanoes have on human society.”

Reference
Büntgen et al. ‘Multi-proxy dating of Iceland’s major pre-settlement Katla eruption to 822-823 CE.’ Geology (2017). DOI: 10.1130/G39269.1

 

An international group of researchers has dated a large volcanic eruption in Iceland to within a few months. The eruption, which is the oldest volcanic eruption to be precisely dated at high northern latitudes, occurred shortly before the first permanent human settlements were established, when parts of the now mostly treeless island were still covered with forest. 

It was a happy coincidence that we were able to use all these different archives and techniques to date this eruption.
Ulf Büntgen
Drumbabót forest in Iceland


Young Leaders meet the Queen


The Queen’s Young Leaders Programme celebrates the achievements of inspiring young people from across the Commonwealth who are dedicated to driving change in their communities.

This year’s winners are working in support of a range of issues and join former graduates of the Programme which was established by the Queen Elizabeth Diamond Jubilee Trust in 2014.

The Trust is run in partnership with Comic Relief, the Royal Commonwealth Society and ICE, with last night’s ceremony at Buckingham Palace hosted by the Queen, The Duke of York and Prince Harry and broadcast live on BBC’s The One Show. The Chairman of The Queen Elizabeth Diamond Jubilee Trust, Sir John Major, Dame Tanni Grey-Thompson, Sir Mo Farah and Caspar Lee were also in attendance.

ICE provides tailored support to the award winners to develop their skills as leaders, in the form of a one-year online leadership course. The centrepiece of this is a week-long residential placement in the UK. The first three days of the residential were based at Madingley Hall, where the award winners enjoyed:

  • Lectures and seminars with global academics and industry experts - including Oli Barrett MBE, founder of StartUp UK; business creativity guru Fredrik Härén; and Dr Karen Salt, co-director of Nottingham University's Centre for Research in Race and Rights
  • Team-building activities focused on strengthening the network of QYLS and alumni from 53 Commonwealth countries
  • One-to-one mentoring with coaches who have studied on ICE’s coaching programme
  • Mindfulness, meditation and yoga sessions during their downtime
  • A social programme, including punting on the Cam and open-air cinema at Madingley Hall

The winners then moved on to London, where they visited Number 10, the BBC and organisations such as Facebook, Google DeepMind, Jamie’s Fifteen, AMV BBDO and Oxfam to take part in development activities to help drive their work forward.  For some, a highlight was meeting David Beckham, an enthusiastic supporter of the Queen’s Young Leaders Programme.

Frances Brown, Mentoring and Course Director for the programme, said: "We at ICE are always overwhelmed by the talent and potential of the Queen’s Young Leaders and thrilled to play a vital part in developing their skills, building confidence and strengthening their global network. We are proud to support and encourage these exceptional young leaders who are a true reflection of the potential of youth in The Commonwealth."

Vladyslava Kravchenko, a Queen's Young Leader from Malta, said: "It's been absolutely great. ICE is a great setting and we're learning really interesting and useful things. The group of Queen's Young Leaders this year is incredible and I was inspired to meet so many motivated and impassioned young people."

The search is on for 2018 winners: applications are now open, and sixty inspiring young leaders will be selected by peers later in the year.

Young people working to preserve the environment, tackle bullying in schools and promote gender equality were recognised by the Queen at Buckingham Palace last night – following a year of leadership studies provided by the University of Cambridge Institute of Continuing Education (ICE).

We are always overwhelmed by the talent and potential of the Queen’s Young Leaders.
Frances Brown
Some of this year’s Young Leaders pictured during their residency at University of Cambridge Institute of Continuing Education


Opinion: Parliament and Brexit


Demands to reclaim sovereignty were an important part of the UK’s decision to leave the European Union.  “Take back control of our laws”, the Leave campaigners exhorted the British electorate in 2016.

The expectation was that Parliament would be a major beneficiary. After all, the so-called democratic deficit in EU politics has long been synonymous with a diminution of the powers of national legislatures. Parliament, it was supposed, would play a key role in the process of the UK’s withdrawal from the EU, and in the legislative arrangements required to ensure it would take place with minimal disruption to the UK.

It quickly became clear that the Prime Minister’s view was rather different. Parliament appeared to be an irritant to the executive’s attempts to pursue its version of Brexit. Having lost the initial attempt to trigger Article 50 without Parliamentary engagement thanks to the legal case brought by Gina Miller and others, Theresa May appeared to think that any attempt to amend or improve the draft legislation was an assault on her intention of making a success of Brexit. Calls for unilateral offers on the rights of EU nationals, or to stay in the single market, were given short shrift.

Not content with securing the EU Withdrawal Act without amendments, the Prime Minister called a snap general election in the hope of strengthening her parliamentary majority and undermining what she seemed to perceive as twin dangers: the nine Liberal Democrat MPs, and the “unelected” House of Lords, where pro-European voices remained rather louder than in the Commons.

Paradoxically, her catastrophic gamble resulted in a hung parliament that has weakened the Prime Minister’s hand. It has also created the conditions for greater cross-party working, for a less clear-cut withdrawal, and for increased leverage for Parliament.

Prior to the election, the PM could rely on the Salisbury Convention to ensure that Labour would ultimately not defy the will of the Commons or the Government’s 2015 manifesto pledge to hold a referendum and be bound by its result. The outcome of the 2017 General Election means that the Opposition can reasonably claim that the Government does not have a majority and, hence, needs to adopt a more consensual approach to withdrawal.

What role is there, then, for Parliament?

The Queen’s Speech was dominated by Brexit, in a way that the General Election was not. Eight pieces of legislation were flagged up. Among them was the all-encompassing “Repeal Bill” (now demoted from the “Great Repeal Bill” originally proposed) required to repeal the 1972 European Communities Act, to enshrine EU law into UK law, and to ensure there are no gaps in the Statute Book on the day the UK leaves the EU. Alongside it were other bills on trade, customs, immigration, agriculture, fisheries and nuclear safeguards. For each of these pieces of legislation the Government will need to secure a majority in both Houses. What are its chances of doing so? 

The confidence and supply deal with the DUP includes Brexit-related matters. The results of the first vote on the Queen’s Speech, with a Government majority of 14, show that the Government can get business through the Commons. Whether it will do so well on more contentious matters where just a handful of Tory rebels could alter the outcome is an open question – fascinating for academics, a nightmare for Government whips.

The first post-election vote in the Lords saw a clear government majority to reject an amendment on remaining in the single market and customs union. That vote, however, is not a good indicator of what may follow. The amendment was not supported by the Labour frontbench, so although it was proposed by Labour peer Lord Adonis, and secured some rebel Labour support, it fell far short of the numbers that would come about if Labour put a whip on.

The experience of the 2015-17 Parliament was very clear: where Labour and the Liberal Democrats work together, with some crossbench support, they can defeat the Government. Votes in the Commons will be tight throughout the coming session. Votes in the Lords, meanwhile, may swing wildly according to whether the largest opposition party wishes to let the Government set the Brexit agenda, or prefers to cooperate with other parties (and rebel Tories) to shape Brexit.

The general election result increases leverage for Parliament when it comes to Brexit. Here, Baroness Smith of Newnham, a lecturer in the Department of Politics and International Studies (POLIS), reflects on recent turmoil and the tightening of Commons votes as Brexit edges closer.

For each of these pieces of legislation the Government will need to secure a majority in both Houses. What are its chances of doing so?
Julie Smith
Theresa May at PMQs


‘Brain training’ app found to improve memory in people with mild cognitive impairment


Amnestic mild cognitive impairment (aMCI) has been described as the transitional stage between ‘healthy ageing’ and dementia. It is characterised by day-to-day memory difficulties and problems of motivation. At present, there are no approved drug treatments for the cognitive impairments of patients affected by the condition.

Cognitive training has shown some benefits, such as speed of attentional processing, for patients with aMCI, but training packages are typically repetitive and boring, affecting patients’ motivation. To overcome this problem, researchers from the Departments of Psychiatry and Clinical Neurosciences and the Behavioural and Clinical Neuroscience Institute at the University of Cambridge developed ‘Game Show’, a memory game app, in collaboration with patients with aMCI, and tested its effects on cognition and motivation.

The researchers randomly assigned forty-two patients with amnestic MCI to either the cognitive training or control group. Participants in the cognitive training group played the memory game for a total of eight one-hour sessions over a four-week period; participants in the control group continued their clinic visits as usual.

In the game, which participants played on an iPad, the player takes part in a game show to win gold coins. In each round, they are challenged to associate different geometric patterns with different locations. Each correct answer allows the player to earn more coins. Rounds continue until completion or after six incorrect attempts are made. The better the player gets, the higher the number of geometric patterns presented – this helps tailor the difficulty of the game to the individual’s performance to keep them motivated and engaged. A game show host encourages the player to maintain and progress beyond their last played level.
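The performance-tailored difficulty described above can be sketched as a simple staircase rule. This is a hypothetical illustration, not the published game's actual logic: step the number of geometric patterns up after a clean round, and down when the player hits the six-error limit.

```python
class AdaptiveDifficulty:
    """One possible staircase rule for tailoring difficulty to the
    player: more geometric patterns after success, fewer after repeated
    errors, keeping play challenging but achievable."""

    def __init__(self, n_patterns=2, min_n=2, max_n=8, max_errors=6):
        self.n_patterns = n_patterns
        self.min_n = min_n
        self.max_n = max_n
        self.max_errors = max_errors

    def round_result(self, errors):
        # Clean round: step the pattern count up (harder).
        if errors == 0 and self.n_patterns < self.max_n:
            self.n_patterns += 1
        # Round ended at the error limit: step down (easier).
        elif errors >= self.max_errors and self.n_patterns > self.min_n:
            self.n_patterns -= 1
        return self.n_patterns
```

Bounding the pattern count between a floor and a ceiling is what keeps the task in the motivating middle ground: never trivially easy, never discouragingly hard.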

Screenshot from Game Show. Credit: Sahakian Lab

The results showed that patients who played the game made around a third fewer errors, needed fewer trials and improved their memory score by around 40%, showing that they had correctly remembered the locations of more information at the first attempt on a test of episodic memory. Episodic memory is important for day-to-day activities and is used, for example, when remembering where we left our keys in the house or where we parked our car in a multi-storey car park. Compared to the control group, the cognitive training group also retained more complex visual information after training.

In addition, participants in the cognitive training group indicated that they enjoyed playing the game and were motivated to continue playing across the eight hours of cognitive training. Their confidence and subjective memory also increased with gameplay. The researchers say that this demonstrates that games can help maximise engagement with cognitive training.

“Good brain health is as important as good physical health. There's increasing evidence that brain training can be beneficial for boosting cognition and brain health, but it needs to be based on sound research and developed with patients,” says Professor Barbara Sahakian, co-inventor of the game. “It also needs to be enjoyable enough to motivate users to keep to their programmes. Our game allowed us to individualise a patient’s cognitive training programme and make it fun and enjoyable for them to use.”

Dr George Savulich, the lead scientist on the study, adds: “Patients found the game interesting and engaging and felt motivated to keep training throughout the eight hours. We hope to extend these findings in future studies of healthy ageing and mild Alzheimer’s disease.”

The researchers hope to follow up this study with a large-scale trial, and to determine how long the cognitive improvements persist.

The design of ‘Game Show’ was based on published research from the Sahakian Laboratory at the University of Cambridge. The study was funded by Janssen Pharmaceuticals/J&J and Wellcome.

In 2015, Professor Sahakian and colleagues showed that another iPad game developed by her team was effective at improving the memory of patients with schizophrenia, helping them in their daily lives at work and living independently. The Wizard memory game is available through PEAK via the App Store and Google Play.

Reference
George Savulich, Thomas Piercy, Chris Fox, John Suckling, James Rowe, John O'Brien, Barbara Sahakian. Cognitive training using a novel memory game on an iPad in patients with amnestic mild cognitive impairment (aMCI). The International Journal of Neuropsychopharmacology; 3 July 2017; DOI: 10.1093/ijnp/pyx040

A ‘brain training’ game developed by researchers at the University of Cambridge could help improve the memory of patients in the very earliest stages of dementia, suggests a study published today in The International Journal of Neuropsychopharmacology.

There's increasing evidence that brain training can be beneficial for boosting cognition and brain health, but it needs to be based on sound research
Barbara Sahakian
Screenshot of Game Show


Cambridge celebrates Stephen Hawking’s 75th birthday


The event, on the theme of Gravity and Black Holes and organised by the Centre for Theoretical Cosmology where Professor Hawking is based, featured public lectures from Professors Brian Cox, Gabriela González, Martin Rees and Professor Hawking himself. Many of Professor Hawking’s current and former students and colleagues were in attendance to celebrate his life and career in science.

Professor Cox, from CERN and the University of Manchester, is well-known for his popular science series on the BBC, including Wonders of the Universe. He spoke of Professor Hawking’s great contributions to the public understanding of physics, and how he was inspired by Hawking’s 1988 bestseller A Brief History of Time as a teenager.

“Physics challenges us to think very carefully about our place in the Universe,” he said during his lecture on Our Place in the Universe. And echoing another great science communicator, the physicist Richard Feynman, he said, “The most valuable thing about science is not what it teaches us about nature, or the spin-off technologies, it’s a state of mind. The state of mind is that not knowing is a powerful thing.”

Professor González from Louisiana State University is a former spokesperson for LIGO, and updated the conference on the latest research at LIGO, which announced the first detection of gravitational waves – ripples in spacetime – in early 2016. Since then, there have been two more detections. She outlined a planned network of gravitational wave detectors, which would allow scientists to detect gravitational waves with ever-greater precision.

In his lecture From Mars to the Multiverse, Professor Martin Rees, Emeritus Professor in Cambridge’s Institute of Astronomy, discussed the likelihood of finding life on other planets, and the eternal allure of outer space. “It’s a dangerous illusion to think that space offers a solution to Earth’s problems – we’ve got to solve them right here,” he said. “We are stewards at an incredibly important time in the Earth’s history.”


In his lecture, Professor Hawking reflected on his life and career, and discussed his current research. He recalled being diagnosed with motor neurone disease as a PhD student at Cambridge, when he was given just two years to live.

“At first I became depressed. There didn’t seem any point in finishing my PhD, because I didn’t know if I’d be alive long enough to finish it,” he said. “But after my expectations had been reduced to zero, every new day became a bonus, and I began to appreciate everything I did have. Where there is life, there is hope.”

Professor Hawking also discussed his academic work, which broke new ground on the basic laws which govern the universe, including the revelation that black holes have a temperature and produce radiation, now known as Hawking radiation. At the same time, he also sought to explain many of these complex scientific ideas to a wider audience through popular books, most notably his bestseller A Brief History of Time.

“I thought I might make a modest amount, to help support my children at school, and help with the rising costs of my care,” he said. “But the main reason is I enjoyed it. I think it’s important for scientists to explain their work, especially in cosmology. I never expected A Brief History of Time to do as well as it did. Not everyone may have finished it, or understood everything they read. But at least they would have gotten the idea that we live in a Universe governed by rational laws that we can discover and understand.

“It has been a glorious time to be alive and doing research into theoretical physics. Our picture of the Universe has changed a great deal in the last 50 years, and I’m happy if I’ve made a small contribution. Remember to look up at the stars and not down at your feet. Be curious. And however difficult life may seem, there is always something you can do and succeed at. It matters that you just don’t give up.”

All of the lectures are available to watch online

Some of the biggest names in science took part in a special public event yesterday to celebrate the life and work of Stephen Hawking, on the occasion of his 75th birthday. 

It has been a glorious time to be alive and doing research into theoretical physics.
Stephen Hawking
Stephen Hawking


Artificial bile ducts grown in lab and transplanted into mice could help treat liver disease in children


In research published in the journal Nature Medicine, the researchers grew 3D cellular structures which, once transplanted into mice, developed into normal, functioning bile ducts.

Bile ducts are long, tube-like structures that carry bile, which is secreted by the liver and is essential for helping us digest food. If the ducts do not work correctly, for example in the childhood disease biliary atresia, this can lead to a damaging build-up of bile in the liver.

The study suggests that it will be feasible to generate and transplant artificial human bile ducts using a combination of cell transplantation and tissue engineering technology. This approach provides hope for the future treatment of diseases of the bile duct; at present, the only option is a liver transplant.

The University of Cambridge research team, led by Professor Ludovic Vallier and Dr Fotios Sampaziotis from the Wellcome-MRC Cambridge Stem Cell Institute and Dr Kourosh Saeb-Parsy from the Department of Surgery, extracted healthy cells (cholangiocytes) from bile ducts and grew these into functioning 3D duct structures known as biliary organoids.  When transplanted into mice, the biliary organoids assembled into intricate tubular structures, resembling bile ducts.

The researchers, in collaboration with Mr Alex Justin and Dr Athina Markaki from the Department of Engineering, then investigated whether the biliary organoids could be grown on a ‘biodegradable collagen scaffold’, which could be shaped into a tube and used to repair damaged bile ducts in the body.  After four weeks, the cells had fully covered the miniature scaffolding resulting in artificial tubes which exhibited key features of a normal, functioning bile duct.  These artificial ducts were then used to replace damaged bile ducts in mice.  The artificial duct transplants were successful, with the animals surviving without further complications. 

“Our work has the potential to transform the treatment of bile duct disorders,” explains Professor Vallier. “At the moment, our only option is liver transplantation, so we are limited by the availability of healthy organs for transplantation. In future, we believe it will be possible to generate large quantities of bioengineered tissue that could replace diseased bile ducts and provide a powerful new therapeutic option without this reliance on organ transplants.”

“This demonstrates the power of tissue engineering and regenerative medicine,” adds Dr Sampaziotis. “These artificial bile ducts will not only be useful for transplanting, but could also be used to model other diseases of the bile duct and potentially develop and test new drug treatments.”

Professor Vallier is part of the Department of Surgery at the University of Cambridge and his team are jointly based at the Wellcome Trust-MRC Cambridge Stem Cell Institute and the Wellcome Trust Sanger Institute. 

The work was supported by the Medical Research Council, Sparks children’s medical research charity and the European Research Council.

Reference
Sampaziotis, F et al. Reconstruction of the murine extrahepatic biliary tree using primary extrahepatic cholangiocyte organoids. Nature Medicine; 3 July 2017; DOI: 10.1038/nm.4360

Cambridge scientists have developed a new method for growing and transplanting artificial bile ducts that could in future be used to help treat liver disease in children, reducing the need for liver transplantation. 

Our work has the potential to transform the treatment of bile duct disorders
Ludovic Vallier
Image of a mouse gallbladder following repair with a bioengineered patch of tissue incorporating human 'bile duct' cells, shown in green. The human bile duct cells have fully repaired and replaced the damaged mouse epithelium.


The capital of drinking: did 19th-century Liverpool deserve its reputation?


“Liverpool had a problem with drink.” With this opening sentence, David Beckingham’s new book, The Licensed City: Regulating Drink in Liverpool, 1830-1920, gets straight to the heart of his subject matter.

Beckingham is by no means the only scholar to look at the history of alcohol consumption in an iconic city. But he takes a distinctive geographical approach. He probes how problem drinking was defined largely in relation to public drinking, and links the increase in recorded drunkenness in police statistics to an emerging negative national reputation.

The Licensed City charts a period in which Liverpool saw phenomenal growth as a booming port with global reach across Britain’s empire. Its population grew tenfold – soaring from just 77,000 in 1801 to 700,000 at the end of the century – as the city absorbed workers seeking employment on ships, in its docks and allied industries.

There were two very different Liverpools in evidence: one the projection of the aspirations of the city’s grandees, their wealth set in stone in some of the country’s finest mid-Victorian buildings; the other the grim reality of insanitary, cheaply constructed housing, with streets that in the popular imagination quickly became “haunts of crime”. The gap between the two, Beckingham argues, became a source of municipal anxiety.

Outlets serving and supplying drink expanded in parallel with poverty, pubs and bottle shops (off licences) clustering in areas that were home to low-paid dock workers and casual labourers. Of these, many were Irish migrants who had fled famine at home, only, in the words of a contemporary commentator, “to find death abroad.” They became easy scapegoats in this city of sanitary disaster.

In the second half of the 19th century, the reputation of the city hit a new low with mortality rates substantially higher than the national average. On 28 August 1866, The Times newspaper (citing a local Liverpool paper) reported that: “Liverpool has been pronounced the most drunken, the most criminal, the most pauper-oppressed, and the most death-stricken town in England.”

To unpack this record, Beckingham’s primary sources are police reports, and minute books of the city council and licensing magistrates, all held in Liverpool Record Office, as well as local newspapers. They reveal how the anxieties of residents and reform-minded politicians – keen to arrest an emerging national sentiment that the city’s leaders were failing in their civic responsibility – were made manifest in public battles against booze.

Many of those calling for reform belonged to different wings of the temperance movement, and were adamant that the city’s problems had their roots in the over-supply of drink. In some streets it was estimated that there was one public house for every 13 people. As one reformer declared: “Public houses lead to drunkenness and drunkenness to crime.” The pub thus became a focal point in the fight to redeem Liverpool’s municipal reputation. 

More specifically, so too did the system that gave them their licences. The licensing of businesses serving and supplying alcohol can be dated to the 16th century. Beckingham profiles how this ancient system, legislated for nationally and dispensed by local magistrates, was adapted to tackle Liverpool’s distinctive urban challenges.

One such problem for regulators was the use of pubs by prostitutes, keen to solicit the attentions of sailors enjoying leave on shore.  Parliament resisted banning women from pubs, though in 1872 it did legislate that publicans should restrict the time spent in them by reputed prostitutes.  Prosecutions proved difficult, however, because it was not clear when “reasonable refreshment” ended and more dubious behaviour began. 

The magistrates had another tool at their disposal, however, because licences came up for renewal on an annual basis. This gave Liverpool’s magistrates scope to address problem premises because they could review the management of pubs and threaten to cancel licences. In 1909, for example, a licence was renewed on condition that the publican limit the drinking time of known prostitutes to just four minutes – a measure that now seems laughably impractical.

The licensing committee also enforced opening hours and age rules for customers, which were successively tightened across the period, and clamped down on so-called ‘child messengers’ sent to pubs to fetch beer. 

They also used their discretion to manipulate the layout of pubs in order to target problem behaviours. Many of these were associated with women, from prostitutes in pubs to anxieties about effects on families from mothers who drank, even to the degrading effects of bar work on women.

Responses included restricting the number of doors to pubs, and even the times at which they could be used, so that women drinkers could be spotted by bar staff, police inspectors and even other customers.  Specific features in pubs also came under scrutiny. The cosy corners known as snugs (much loved by heritage fans today) were notorious for allowing immoral activities to go on undetected, with the connivance if not involvement of barmaids.

Design also became a way that brewers could fight back, with their own attempts to reform and reclaim the reputation of the public house.  Likewise, battles over licensing were later lauded as a central factor in changing Liverpool’s record and reputation for drunkenness.

Liverpool provides a striking case study for debates surrounding alcohol consumption and the limits of government intervention in the social lives of citizens. Many of the themes in Beckingham’s book have much wider relevance – both to other cities and other periods, including our own.

His contribution is to draw attention to perceptions of problem behaviours: Victorian drinking ‘problems’ were often working-class public order offences identifiable because they were easily visible.  Middle-class drinking, behind closed doors, largely escaped official scrutiny. That distinction is both our inheritance and modern licensing’s challenge, he concludes, if it is to respond to today’s definitions of problem drinking.

The Licensed City: Regulating Drink in Liverpool, 1830-1920 by David Beckingham is published by Liverpool University Press.

In his new book, geographer David Beckingham looks at the rigorous licensing regime that Liverpool’s authorities put in place to tighten their grip on problem drinking in the pubs that proliferated across the city.  Similar attitudes frame today’s perceptions of public and private alcohol consumption.

Public houses lead to drunkenness and drunkenness to crime.
19th-century reformer
The Parrot Hotel, Liverpool, 1908


Tracking inequality in India: the story of a pioneer


The widening gap between India’s rich and poor is captured by the National Sample Survey (NSS), an organisation founded in 1950, which gathers data from roughly 14,000 Indian villages and localities to provide a snapshot of how the population at large is faring. The NSS and its pioneering role in the measurement of poverty and inequality are some of the important subjects addressed by a conference that starts tomorrow, 5 July, in Cambridge to explore how different modern societies have gauged social and economic disparity.

Since Indian Independence in 1947, the NSS has conducted more than 70 rounds of surveys, providing much-needed data about household consumption, social inequality, educational attainment and healthcare outcomes. NSS data serves as a backbone to Indian economic planning, public welfare provision and academic research.

The story behind the NSS goes back to 1913, when a brilliant young man called Prasanta Chandra Mahalanobis arrived at King’s College, Cambridge, to study mathematics.

It is said that Mahalanobis had intended to become a student in London but applied to King’s after visiting its world-famous chapel and missing the last train back to the capital. He graduated with a BA in natural science, receiving top marks in his physics final exam.

During his time at Cambridge, he interacted with another outstanding Indian mathematician, Srinivasa Ramanujan. Influenced by the British journal Biometrika, Mahalanobis began experimenting with new statistical methods for studying and measuring large-scale phenomena – occurrences so widespread and diverse by nature that they are difficult to gauge.

A man of diverse scientific interests, Mahalanobis combined statistics with other emerging disciplines, including anthropology, physics and economics, to develop novel approaches for estimating population distribution, crop yields and household consumption.

Mahalanobis is known for his pioneering work in descriptive statistics – and his name is remembered in the ‘Mahalanobis distance’, a measurement used in studies of population. For many years he taught at Presidency College (Kolkata) where, in 1931, he was responsible for founding the Indian Statistical Institute (ISI).
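The Mahalanobis distance measures how far a point lies from the centre of a distribution, in units that account for the scale of and correlation between variables. A minimal sketch in Python (the data here are made up purely for illustration):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Distance of point x from a distribution with given mean and covariance."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Two correlated measurements, e.g. height (cm) and weight (kg) in a sample
rng = np.random.default_rng(0)
data = rng.multivariate_normal([170.0, 70.0], [[40.0, 18.0], [18.0, 25.0]], size=500)

mu = data.mean(axis=0)
cov = np.cov(data, rowvar=False)   # estimated covariance of the sample
d = mahalanobis(np.array([180.0, 75.0]), mu, cov)
```

Unlike plain Euclidean distance, a point that deviates along a direction in which the data naturally vary gets a smaller Mahalanobis distance than one deviating against the correlation structure.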

Today the ISI employs a staff of more than 1,000 people and is a leading international centre for research in applied mathematics, data science and computing.

With funding from the Philomathia Foundation, Dr Poornima Paidipaty (Faculty of History) has embarked on a study focusing on Mahalanobis’s most important contribution at the ISI: his visionary work on the development of large-scale surveys of India’s rural population in response to the country’s drive to realign itself as an industrial force with global reach.

Her research is part of a much larger project on ‘Historicising the Measurement of Inequality’, which is directed by Dr Pedro Ramos-Pinto and examines global histories of quantifying and framing socio-economic disparity.

Starting in the late 1930s, the ISI undertook a series of pioneering pilot surveys to gauge Indian household incomes at a time of huge social and historical upheaval. Sampling offered Indian scientists new tools for generating data on phenomena that had never been comprehensively or accurately measured before. In its early years as a research and training centre, the ISI used sampling to study everything from changing patterns in tea consumption to estimating crop acreage.

This research became more urgent after Independence, when government planners needed more reliable economic data to frame programmes aimed at rapid industrialisation, poverty alleviation and development. Without a strong household income tax regime, Indian bureaucrats lacked the fine-grained statistical information used by economists in developed countries to accurately estimate GDP.

Mahalanobis and his colleagues at the ISI offered a unique solution to these problems and designed a pioneering large-scale sampling exercise to estimate the size, composition and condition of the Indian economy. As an approach to measurement, it was an original (and at the time, highly risky) endeavour. Many doubted that random sampling could accurately represent the totality of Indian social and economic life.

In 1950, Mahalanobis launched the National Sample Survey (NSS) to undertake the ambitious task of providing a comprehensive picture of India’s domestic economy. In the first rounds of research, 1,833 villages and residential areas were surveyed. This limited sample was used to represent the nation as a whole, which totalled roughly 360 million people at the time.
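The core logic of inferring a national aggregate from a random sample can be sketched with toy numbers (everything below is hypothetical, not NSS data): survey a random subset of villages, then scale the sample total up by the inverse of the sampling fraction.

```python
import random

random.seed(42)

# Toy population: per-village household consumption for 5,000 villages
N = 5000
population = [random.gauss(200.0, 50.0) for _ in range(N)]
true_total = sum(population)

# Simple random sample of a small fraction of villages
n = 183
sample = random.sample(population, n)

# Scale the sample total by N/n to estimate the population total
estimated_total = N * sum(sample) / n
```

The estimate's accuracy depends on the sample size and the variability of the population, not on surveying everyone, which is what made sampling so attractive (and so contested) as a substitute for a full census.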

During this early period, critics complained that urban areas were over-represented and that surveyors were unfamiliar with the struggles and transformations facing remote regions and rural villages. It took many years for Mahalanobis and colleagues to design a survey that would capture, with an acceptable level of accuracy, the data that the government sought.

Due in part to his widespread academic interests, and his interactions with intellectuals from fields other than mathematics, Mahalanobis’s work incorporated cutting-edge research in the social and computational sciences of the postwar era. He collaborated with top economists and mathematicians from around the world, and brought leading scientists to Kolkata for extended periods of time.

Ronald Fisher, JBS Haldane, Norbert Wiener, Andrey Kolmogorov, Jerzy Neyman, Joan Robinson and Simon Kuznets were among the many researchers sponsored by the ISI to collaborate on the Institute’s teaching and ongoing survey efforts in the 1950s and 1960s.

During its first decade, NSS researchers had to address numerous and complicated issues. What size and distribution of survey sites would best represent the nation in its entirety? How should surveyors account for India’s significant informal sector and for labour that was paid in kind, rather than cash?

Measuring national productivity required that researchers account for all productive labour – not just monetised transactions. Similarly, how should surveyors include women’s labour? Survey teams had to build rapport with their subjects, and in many cases, even teach them how to estimate monthly consumption and expenditure. The accuracy of data relied on social ties and mutual education – not just rote completion of questionnaires.

Over time, the NSS not only became a valued and relied upon institution, it influenced researchers and policymakers around the globe. Chinese officials sent their statisticians to Kolkata to learn from Mahalanobis’s staff in the 1950s, and the ISI served as a model for the American statistician Gertrude Cox, for the organisation of statistical training in the USA.

With her background in science studies and South Asian history, Paidipaty is well-equipped to understand the technical as well as the social relationships that allowed Indian planners and scientists to define and steer the national economy. Her research draws on the extensive archives of the ISI, which offer unique insights into how Indian household life was measured in the early decades after Independence and Partition, and how policymakers framed and understood shifting standards of living.

Paidipaty’s work demonstrates that sampling, as a technique of economic measurement, was intimately tied to mid-century economic planning. Under Nehru’s leadership, the Indian state focused its developmental efforts on rapid industrialisation and growth, but achieving these objectives required new tools for defining and measuring the national economy. What were the different, discrete parts of an economy and how did they relate to one another?

Pinning down such abstractions, and offering concrete, tangible data, was indispensable to the work of managing India’s planned economy. The early history of sampling roughly overlapped with early experiments in economic planning. Mahalanobis was a member of India’s Planning Commission from 1953 until 1967, and directed the nation’s Second Five Year Plan.

In 2014, India’s government dissolved the Planning Commission, arguing that pro-growth policies ought to be achieved through unfettered markets rather than planned policy interventions. Yet, even without a formal planning apparatus, the significance of large-scale sampling has only grown over the last 70 years.

Since the 1980s, economists around the world, including those at the World Bank and the IMF, have embraced and underscored the importance of household sampling. Not only do such surveys provide large-scale aggregate statistics, they are a crucial source of fine-grained and qualitatively rich data.

The NSS has been an ongoing subject of debate among economists, but it is also a crucial source of information. Angus Deaton, recipient of the 2015 Nobel Prize in economics, used NSS data in some of his most influential work to help the Indian government recalibrate how it defined and measured poverty. Within the current Indian context, in which economic growth and rising inequality are once again at the centre of public debate, it has become all the more important to understand the history of data, how it is produced and what numbers really represent.

As a nation, India is undergoing profound transformation, but rapid growth has come hand in hand with rising inequality as well as growing disparity between rural and urban areas. NSS data remains one of the best resources for understanding and tracking these changes. As more of this information circulates in the public domain, it becomes all the more crucial to appreciate how such data is produced. Paidipaty’s work on the history of the NSS offers a fascinating glimpse into one of the most significant and early mid-century precursors to contemporary developments in big data.

'Measuring Matters: Histories of Assessing Inequality' takes place on 5-7 July 2017 at the Alison Richard Building, 7 West Road, Cambridge. The conference is sponsored by Cambridge's Centre for Research in the Arts, Social Sciences and Humanities (CRASSH). 

India’s booming business centres and gleaming shopping malls mask a grimmer reality. While one section of the population gets richer, another section gets poorer. In the countryside, farmers and others ‘left behind’ by the economic surge find themselves in increasingly desperate circumstances. In many cases their plight, exacerbated by crippling debt, has led to suicide.

Within the current Indian context, in which economic growth and rising inequality are once again at the centre of public debate, it has become all the more important to understand the history of data, how it is produced and what numbers really represent.
Women working in the rice paddy fields in Odisha, one of the poorest regions of India
