Channel: University of Cambridge - Latest news

Richard III – case closed after 529 years


An international research team has provided overwhelming evidence that the skeleton discovered under a car park in Leicester indeed represents the remains of King Richard III - closing what is probably the oldest forensic case solved to date.

Analysis of all the available evidence confirms the identity of the skeleton as that of King Richard III with a probability of at least 99.999% (at its most conservative).

The team of researchers, including geneticist Dr Peter Forster from Murray Edwards College and the McDonald Institute for Archaeological Research, and led by Cambridge graduate Dr Turi King, has published its findings online today in the journal Nature Communications.

The researchers collected DNA from living relatives of Richard III and analysed several genetic markers, including the complete mitochondrial genomes, inherited through the maternal line, and Y-chromosomal markers, inherited through the paternal line, from both the skeletal remains and the living relatives.

While the Y-chromosomal markers differ, the mitochondrial genome shows a genetic match between the skeleton and the maternal-line relatives. The former result is not surprising, as the chance of a false-paternity event occurring somewhere in the line is fairly high after so many generations.

Forster said: “Although the false paternity means we cannot look forward in time, we can trace King Richard’s Y lineage back into prehistory. Historically, the male line of the Plantagenets is recorded back until AD1028 in N France. Using King Richard’s genetic profile, we can go back much further: Richard’s G2a type traces back to the first farmers who migrated from the Near East and Anatolia (modern Turkey) to Europe about 8000 years ago, quickly spreading along the Mediterranean and into Central Europe and France by 5500BC.

"These pioneer farmers carried predominantly G2a types, which today are quite rare, around 1 percent in Europe (see map). And one of these Anatolian farmers was King Richard’s immigrant male ancestor. Incidentally, the descendants of the Plantagenets not only became Kings of England but also of Jerusalem, bringing the migration of this Y chromosome type full circle.”


Map shows locations of 14 living men who are close genetic matches to King Richard – their G2a type is quite rare, around 1 percent in Europe today.

Analysis of the mitochondrial DNA shows a match between Richard III and modern female-line relatives Michael Ibsen and Wendy Duldig. The male line of descent is broken at one or more points in the line between Richard III and living male-line relatives descended from Henry Somerset, 5th Duke of Beaufort.

This paper is also the first to carry out a statistical analysis of all the evidence together to prove beyond reasonable doubt that Skeleton 1 from the Greyfriars site in Leicester is indeed the remains of King Richard III.
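
To show how several independent strands of evidence can be combined into a single figure of this kind, the sketch below multiplies assumed likelihood ratios and converts the result into a posterior probability using Bayes’ theorem. The numbers are illustrative placeholders, not the likelihood ratios reported in the paper.

```python
# Illustrative Bayesian combination of independent evidence.
# All numbers below are assumptions for demonstration only,
# not the figures reported in the Nature Communications paper.

def posterior_probability(prior_odds, likelihood_ratios):
    """Combine independent likelihood ratios with prior odds (Bayes' theorem)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical inputs: a sceptical prior and per-evidence likelihood ratios
prior_odds = 1 / 40          # assumed prior odds that Skeleton 1 is Richard III
likelihood_ratios = [
    500,    # mitochondrial DNA match (assumed LR)
    10,     # radiocarbon date consistent with 1485 (assumed LR)
    5,      # adult male of the right age at death (assumed LR)
    50,     # scoliosis and battle injuries matching the historical record (assumed LR)
]

p = posterior_probability(prior_odds, likelihood_ratios)
print(f"Posterior probability: {p:.5f}")   # ~0.99997 with these assumed numbers
```

The point of the sketch is only that multiplying independent likelihood ratios against even a sceptical prior is how a combined figure of this kind is reached.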

The researchers also used genetic markers to determine the hair and eye colour of Richard III, finding that – with probably blond hair, at least during childhood, and almost certainly blue eyes – he most closely resembled his depiction in one of the earliest surviving portraits of him, held by the Society of Antiquaries in London.

“Our paper covers all the genetic and genealogical analysis involved in the identification of the remains of Skeleton 1 from the Greyfriars site in Leicester and is the first to draw together all the strands of evidence to come to a conclusion about the identity of those remains,” said Dr Turi King from the University of Leicester, who led the research.

“Even with our highly conservative analysis, the evidence is overwhelming that these are indeed the remains of King Richard III, thereby closing an over 500-year-old missing person’s case.”

Historically, the male line of the Plantagenets is recorded back to Hugues, Count of Perche (documented AD 1028 in northern France).
Prehistorically, Richard’s male ancestor, carrying a G2a type, arrived in Europe with the first farmers who migrated from the Near East and Anatolia (modern Turkey) about 8,000 years ago, quickly spreading along the Mediterranean and into Central Europe and France by 5500 BC.

The research team now plans to sequence the complete genome of Richard III to learn more about the last English king to die in battle.

Adapted from a University of Leicester press release.

DNA and genealogical study confirms identity of remains found in Leicester and uncovers new truths about his appearance and Plantagenet lineage.

Although the false paternity means we cannot look forward in time, we can trace King Richard’s Y lineage back into prehistory
Peter Forster
Skull and bones of Richard III

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Kettle's Yard new exhibition: Beauty and Revolution


Beauty and Revolution traces Finlay’s artistic development from the 1960s to the 1980s; from the poems that made him Britain’s most internationally acclaimed concrete poet, to the images and texts that marked his engagement with the ideas of the French Revolution. It also presents his famous garden, Little Sparta, through photographs and film.

Visitors to the exhibition will encounter Finlay’s work through prints and paper sculptures, photographs and film; its title Beauty and Revolution reflects the visual wonder of the works on show, which cannot be divorced from Finlay’s engagement with revolutionary ideas.

Finlay’s experiments with language developed first of all through his poetry. In 1961 he co-founded the Wild Hawthorn Press to bring a new international perspective to the Scottish literary scene. His fascination with the visual effect of words in space soon resulted in the creation of a distinctive visual language that is embodied first in his standing and folding poems, and later in poem prints, emblems, medallions and inscriptions. These works range in their themes from the world of the fishing-boat to the heroes of the French Revolution, often incorporating the artist’s characteristically humorous puns on words.

The last room of the exhibition will show original prints by some of the major photographers who have recorded his work, together with the colour film from 1973 that provides the first visual documentation of Little Sparta – his classically inspired garden in the Pentland Hills near Edinburgh.

Finlay first met and began to correspond with Jim Ede, the founder of Kettle’s Yard, in the autumn of 1964. In the same year, a group of Cambridge students had started to exhibit, publish and write about his concrete poetry; one of them was the art historian Stephen Bann, who subsequently built up an extensive collection of Finlay’s works. An internationally recognised authority on Finlay’s art, Professor Bann is the curator of Beauty and Revolution.

This new exhibition at Kettle’s Yard offers a unique opportunity to view Bann’s collection of poems, prints and sculptures alongside Kettle’s Yard’s permanent collection, which Finlay much admired though he himself took a different path. An inscribed stone that was later acquired for the collection is entitled KETTLE’S YARD / CAMBRIDGE / ENGLAND IS THE / LOUVRE OF THE PEBBLE (1995).

Beauty and Revolution: The Poetry and Art of Ian Hamilton Finlay runs from Saturday 6 December 2014 to Sunday 1 March 2015.

 

A new exhibition at Kettle’s Yard celebrating the work of the Scottish poet and artist Ian Hamilton Finlay (1925–2006) opens this Saturday (6 December).

Finlay's fascination with the visual effect of words in space resulted in the creation of a distinctive visual language.
Catameringue by Ian Hamilton Finlay


First comprehensive characterisation of genetic diversity in Sub-Saharan Africa

School girls in the Central African Republic

“Although many studies have focused on studying genetic risk factors for disease in European populations, this is an understudied area in Africa,” says Dr Deepti Gurdasani, lead author on the study and a Postdoctoral Fellow at the Wellcome Trust Sanger Institute, UK. “Infectious and non-infectious diseases are highly prevalent in Africa, and the risk factors for these diseases may be very different from those in European populations.”

“Given the evolutionary history of many African populations, we expect them to be genetically more diverse than Europeans and other populations. However, we know little about the nature and extent of this diversity, and we need to understand this to identify genetic risk factors for disease.”

Dr Manjinder Sandhu (Department of Medicine, and lead senior author from the Wellcome Trust Sanger Institute) and colleagues collected genetic data from more than 1800 people – including 320 whole genome sequences from seven populations – to create a detailed characterisation of 18 ethnolinguistic groups in Sub-Saharan Africa. Genetic samples were collected through partnerships with doctors and researchers in Ethiopia, the Gambia, Ghana, Kenya, Nigeria, South Africa and Uganda.

The AGVP investigators, who are funded by the Wellcome Trust, the Bill and Melinda Gates Foundation and the UK Medical Research Council, found 30 million genetic variants in the seven sequenced populations, a fourth of which have not previously been identified in any population group. The authors show that in spite of this genetic diversity, it is possible to design new methods and tools to help understand this genetic variation and identify genetic risk factors for disease in Africa.

“The primary aim of the AGVP is to facilitate medical genetic research in Africa. We envisage that data from this project will provide a global resource for researchers, as well as facilitate genetic work in Africa, including those participating in the recently established pan-African Human Heredity and Health in Africa (H3Africa) genomic initiative," says Dr Charles Rotimi, senior author from the Centre for Research on Genomics and Global Health, National Human Genome Research Institute, National Institutes of Health, USA.

The authors also found evidence of extensive European or Middle Eastern genetic ancestry among several populations across Africa. In West Africa this ancestry dates back as far as 9,000 years ago, supporting the hypothesis that Europeans may have migrated back to Africa during this period. Several of the populations studied are descended from the Bantu, a population of agriculturists and pastoralists thought to have expanded across large parts of Africa around 5,000 years ago.

The authors found that several hunter-gatherer lines joined the Bantu populations at different points in time in different parts of the continent. This provides important insights into hunter-gatherer populations that may have existed in Africa prior to the Bantu expansion. It also means that future genetic research may require a better understanding of this hunter-gatherer ancestry.

“The AGVP has provided interesting clues about ancient populations in Africa that pre-dated the Bantu expansion,” says Dr Manjinder Sandhu. “To better understand the genetic landscape of ancient Africa we will need to study modern genetic sequences from previously under-studied African populations, along with ancient DNA from archaeological sources.”

The study also provides interesting clues about possible genetic loci associated with increased susceptibility to high blood pressure and various infectious diseases, including malaria, Lassa fever and trypanosomiasis, all highly prevalent in some regions of Africa. These genetic variants seem to occur with different frequencies in disease endemic and non-endemic regions, suggesting that this may have occurred in response to the different environments these populations have been exposed to over time.

“The AGVP has substantially expanded on our understanding of African genome variation. It provides the first practical framework for genetic research in Africa and will be an invaluable resource for researchers across the world. In collaboration with research groups across Africa, we hope to extend this resource with large-scale sequencing studies in more of these diverse populations,” says Dr Sandhu.

Researchers from the African Genome Variation Project (AGVP) have published the first attempt to comprehensively characterise genetic diversity across Sub-Saharan Africa. The study of the world’s most genetically diverse region will provide an invaluable resource for medical researchers and provides insights into population movements over thousands of years of African history. These findings appear in the journal Nature.

The AGVP has substantially expanded on our understanding of African genome variation. It provides the first practical framework for genetic research in Africa and will be an invaluable resource for researchers across the world.
Dr Manjinder Sandhu
School girls in the Central African Republic (cropped)


Where there’s muck there’s aluminium (if not brass)


It started with a bacon roll and a microwave oven, and now it’s poised to transform the recycling of a packaging material that has been as unrecyclable as it is useful.

The bacon roll, as the story goes, was microwaved for so long it turned into a charred mass of carbon that began to glow red-hot. What was happening was an intense heating process called microwave-induced pyrolysis.

On hearing about the ‘over-microwaved’ bacon roll from an acquaintance, chemical engineers Professor Howard Chase and Dr Carlos Ludlow-Palafox (a PhD student at the time) at the University of Cambridge wondered whether the process could be exploited to recover useful materials from packaging wastes.

Particulate carbon is an efficient absorber of microwaves and can transfer this thermal energy to adjacent materials. If the adjacent material is organic, such as plastic or paper, it breaks apart (or pyrolyses) into smaller pieces; if the material is a metal attached to the plastic or paper, the metal can be recovered in a clean form after the attached organics are pyrolysed.

Fifteen years later, and the technology they developed is now being used in a commercial-scale plant designed, built and operated by Cambridge spin-out Enval Limited. Founded by Ludlow-Palafox, with Chase as R&D Director, Enval is using the plant to demonstrate the capabilities and economics of the process to investors and waste handlers.

Enval has focused on plastic–aluminium laminate packaging. Prized by manufacturers for its lightness, cheapness and ability to protect the contents from light and air, the packaging is commonly used for food, drink, toothpaste, pet food and cosmetic products.

However, the combination of plastic and aluminium in the packaging presents a technical recycling challenge that until now has been unsolved; instead, items packaged like this contribute to the millions of tonnes of rubbish disposed of in landfill each year. For the brands that package their consumer goods this way, the ‘recyclable’ logo on the packaging, and the sustainability credentials that go with it, have remained elusive.

“We have carried out a life-cycle assessment of the packaging and it’s still environmentally better to use these laminates even though they are not recyclable, simply because so little material and energy goes into making and transporting them compared with alternatives like glassware and cans,” said Ludlow-Palafox.

“There is no real drive to replace them and their market use is increasing by about 10–15% every year. In the UK, roughly 160,000 tonnes of laminates are used per year for packaging, which means at least 16,000 tonnes of aluminium is going into the ground. Just imagine if we could routinely recycle this.”

The solution he and Chase developed with funding from the Engineering and Physical Sciences Research Council started in a relatively simple way: they placed a pile of particulate carbon and some shredded laminated packaging inside a conventional 1.2 kW kitchen microwave, replaced the air inside the oven with nitrogen and turned the microwave up to full power until the temperature increased to about 600°C.

When they opened the door two minutes later, the laminated material had been separated into clean aluminium flakes and hydrocarbon gases and oil.

The basic chemistry is still the same in the commercial-scale plant, but the oven is now 150 kW and large enough to be housed in a 100 m² industrial unit. It takes just three minutes to convert waste into aluminium for smelting and hydrocarbons for fuel, with no toxic emissions.
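
As a rough plausibility check on the scale-up from kitchen experiment to commercial plant – using assumed batch masses and heat capacities rather than any figures published by Enval – the sketch below compares the microwave energy delivered per batch with the heat needed to bring a carbon bed up to pyrolysis temperature:

```python
# Back-of-envelope energy check for microwave-induced pyrolysis.
# Batch masses and heat capacities are assumptions for illustration only.

def microwave_energy_kj(power_kw, minutes):
    """Energy delivered by the magnetron over one batch, in kJ."""
    return power_kw * minutes * 60

def heating_demand_kj(mass_g, specific_heat_j_per_g_k, delta_t_k):
    """Sensible heat needed to raise the carbon bed by delta_t_k, in kJ."""
    return mass_g * specific_heat_j_per_g_k * delta_t_k / 1000

# Lab experiment: 1.2 kW kitchen microwave, ~2 minutes, bed heated to ~600 °C
lab_supply = microwave_energy_kj(1.2, 2)        # ≈ 144 kJ delivered
lab_demand = heating_demand_kj(100, 0.71, 580)  # ≈ 41 kJ for an assumed 100 g carbon bed

# Commercial plant: 150 kW oven, ~3-minute cycle
plant_supply = microwave_energy_kj(150, 3)      # ≈ 27,000 kJ per cycle

print(f"Lab: {lab_supply:.0f} kJ supplied vs ~{lab_demand:.0f} kJ to heat the bed")
print(f"Plant: {plant_supply:.0f} kJ supplied per 3-minute cycle")
```

A full balance would also have to account for the endothermic decomposition of the plastic and the losses of a domestic oven; the sketch is only intended to show the scale of the energies involved.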

Now fully commissioned, the plant can recycle up to 2,000 tonnes of packaging a year – which, say the researchers, is roughly the amount handled by regional waste handlers – and it generates enough energy to run itself. Enval now has an arrangement with manufacturers of plastic–aluminium laminates to recycle their industrial scrap at less than what they would have spent on sending it to landfill.

The researchers have, in effect, turned into commercial waste handlers – something they would never have imagined back in the 1990s. “While we were getting into the world of laminates it didn’t cross our minds to start a company… we just wanted the process to become a reality,” said Ludlow-Palafox. “In the end, the investors [Cambridge Capital Group and Cambridge Angels] said there is no one else who knows the process as well as you, you might as well do it!”

“We knew that the patented technology offered a genuine recycling route for this type of packaging but that the waste industry can be slow to take on new technology – the margins in environmental services are small, and we needed a working, full commercial-scale plant to convince them that the process was viable,” said Chase, who estimates that a plant like theirs would pay for itself within three years. “In parallel, we were being contacted by the brands who use the packaging, asking how they could help.”

The commercial-scale plant is part-funded by Nestlé and Kraft Foods/Mondelez International.

“It was a chicken and egg situation,” said Ludlow-Palafox. “No one is going to buy this technology unless this type of waste is separated for recycling, but the waste wasn’t going to be separated because there has been no process to recycle it. We had to break that negative loop somehow. Now we have the commercial-scale plant, we can show waste handlers the benefits and encourage local authorities to implement a selective collecting system.”

Meanwhile, the scientists are keeping an eye on future recycling prospects. Research into the microwave pyrolysis of different types of wastes continues in Chase’s group in the Department of Chemical Engineering and Biotechnology. “It’s crucial that we continue to look for new opportunities for recycling valuable materials while simultaneously eliminating the need to send wastes to landfill or incineration.

“We’ve demonstrated that a lot of troublesome waste materials can be pyrolysed using our microwave technology but it’s not always economically sensible to do it; the challenge now is to identify which processes are likely to be commercially viable, and which of those will attract the necessary investment funding to bring them into commercial reality. This is a business sector that is comparatively unfamiliar to most investors who regularly commit to innovation in other areas. By demonstrating the societal and economic benefits of green technologies, we hope to secure the necessary investment to transform innovation into successful commercial practice.”

Technology developed at the University of Cambridge lies at the heart of a commercial process that can turn toothpaste tubes and drinks pouches into both aluminium and fuel in just three minutes.

In the UK, roughly 160,000 tonnes of laminates are used per year for packaging, which means at least 16,000 tonnes of aluminium is going into the ground. Just imagine if we could routinely recycle this
Carlos Ludlow-Palafox


Pembroke College elects new Master


He will be formally admitted to the Mastership at the beginning of October 2015 following the retirement in the summer of Sir Richard Dearlove, who has been Master since 2004.

Chris Smith, an Honorary Fellow of Pembroke since 2004, took a double first in English at Pembroke and was President of the Union Society.

His PhD was devoted to the idea of solitude in Romantic poetry, with reference to Wordsworth and Coleridge.

After a year at Harvard as a Kennedy Scholar he worked in housing in the London Borough of Islington, and was Member of Parliament for Islington South and Finsbury from 1983 to 2005. 

As Secretary of State for Culture, Media and Sport from 1997 to 2001 Chris Smith was responsible for the reintroduction of free admission to Britain’s museums and galleries.

In 2005 he was created a life peer and admitted to the House of Lords, where he sits as an independent peer.

He was founding Director of the Clore Leadership Programme from 2003 to 2008, served as Chairman of the Environment Agency between July 2008 and September 2014, and has been the Chairman of the Advertising Standards Authority since July 2007.

Among his other contributions to public life he is Chairman of the Wordsworth Trust and Chairman of the Art Fund. He is also a keen mountaineer.

Commenting on his election, Lord Smith said: "I love Pembroke. I have held it a privilege to have maintained my connection with the College through the years, and I am honoured to have been asked to lead the College community as Master.

"I look forward to helping our Fellows, students and staff build on the successes of recent years at a period of real opportunity and significant challenges for Pembroke and Cambridge."


Pembroke College is delighted to announce the election of Lord Smith of Finsbury as its next Master.

I look forward to helping our Fellows, students and staff build on the successes of recent years at a period of real opportunity and significant challenges for Pembroke and Cambridge.
Lord Smith of Finsbury


‘Satiety hormone’ leptin links obesity to high blood pressure


Being obese or overweight is a major risk factor for the development of high blood pressure and cardiovascular disease. Whilst a number of factors may be involved, the precise explanation for the link between these two conditions has been unclear.

In a study published today in the journal Cell, a research team led by Professor Michael Cowley, Monash University, Australia, in collaboration with Professor Sadaf Farooqi, from the University of Cambridge, UK, studied mice and humans who have problems producing or processing the hormone leptin and compared them with ‘healthy’ individuals to see whether this hormone could provide the link. Leptin is made by fat and circulates in the bloodstream to reach the brain, where it acts as a signal for energy reserves, adjusting both energy expenditure and the sensation of hunger – hence it is sometimes referred to as the ‘satiety hormone’.

The group showed that some obese people who were lacking the hormone leptin because of a genetic disorder had low blood pressure despite being very heavy. This was also the case for people lacking the gene for the leptin receptor in the brain, meaning that the brain was unable to respond to the hormone.

Modelling the human condition, Professor Cowley’s team in Australia showed that mice with normal leptin signalling developed an increase in blood pressure when they became obese on a high fat diet. These effects were not seen in mice that lacked leptin or where leptin was unable to work because of a defect or block on the leptin receptor.

These experiments demonstrate that leptin signalling is necessary for obesity-induced increases in blood pressure, and the clinical studies in severely obese individuals showed that these observations are relevant to humans.

Professor Cowley said: “High blood pressure is a well-known consequence of obesity. Our study explains the mechanism behind this link, showing that leptin, a hormone secreted by fat, increases blood pressure.”

The researchers are now investigating the precise pathways in the brain by which leptin acts to regulate blood pressure.

Professor Farooqi, from the Wellcome Trust-Medical Research Council Institute of Metabolic Science added: “We now know that leptin regulates both our weight and our blood pressure through its action on the brain. Targeting this action could offer a useful way of helping people fight obesity and associated problems such as high blood pressure and heart disease.”

This work was supported in Australia by the Heart Foundation of Australia, The National Health and Medical Research Council of Australia, Monash University and Pfizer Australia; and in the UK by the Wellcome Trust, the Medical Research Council, NIHR Cambridge Biomedical Research Centre and the Bernard Wolfe Health Neuroscience Fund.

Reference
Simonds, SE et al. Leptin Mediates the Increase in Blood Pressure Associated with Obesity. Cell; 4 December 2014

Leptin, a hormone that regulates the amount of fat stored in the body, also drives the increase in blood pressure that occurs with weight gain, according to researchers from Monash University and the University of Cambridge.

Targeting this action could offer a useful way of helping people fight obesity and associated problems such as high blood pressure and heart disease
Sadaf Farooqi
Under Pressure (crop, colour-corrected)


Visions of plague


We are in the midst of the worst Ebola outbreak known in human history. Our screens are filled with nightmarish yet strangely familiar imagery. Men in space-age protective suits, lugging wrapped-up bodies over to hastily dug pits. Clinical tents in poor yet exotic locations, gleaming incongruously. Bodies in the streets.

As Ebola continues its trail of death and terror, many will be unaware that we also continue to live with another killer – plague. The most recent pandemic (the third) started in rural China in 1855 but exploded when it reached Hong Kong in 1894, sweeping the world and killing over 12 million people. Although it has not been considered an active threat since 1959, recent cases of plague have occurred in Bolivia, China, Madagascar and the USA.

Dr Christos Lynteris is a social anthropologist based at the Centre for Research in the Arts, Social Sciences and Humanities (CRASSH). During work on plague among marmots on the Chinese–Russian border, Lynteris started to consider how plague is represented, how knowledge about plague is captured and how we interact with what we see when we see the traits of plague.    

“The third pandemic was born around the same time as modern photographic techniques, and the ability to capture and transmit images of the third plague pandemic transformed public consciousness. It opened up an era where the meaning of health emergencies is publicly negotiated, rather than predetermined by any single scientific or governmental authority.”
     
Last year, he was awarded a European Research Council grant to find, collate and analyse the largest database of plague imagery in history; as the only exhaustive visual record of any infectious disease epidemic and its impact on social life and thought in the modern era, it will be an invaluable resource to historians, anthropologists and epidemiologists alike.

Tracking the images down takes painstaking investigative work for Lynteris and his team (Lukas Engelmann, Nick Evans and Branwyn Poleykett), sifting through photographic remnants of the old colonial powers to pick out the diseased, the dying, the depictions of human Yersinia pestis infection. “Many of the images are not held in the places where outbreaks occurred. An archive in Alabama might hold a hundred images of the plague in North China because that’s where the missionaries were from. Foreign doctors, missionaries, reporters from across the world go to other corners of the planet to work with plague epidemics; it can be a tricky web to untangle.”

He describes the imagery as a “strange combination of journalistic war reporting, crime scene photography and medical imagery”. Some of it is so graphic and distressing that part of the grant stipulates the digital archive must be kept in a locked room at all times – the ‘plague room’, as Lynteris cheerfully refers to it. Entering anywhere with such a moniker is slightly unnerving.

“OK, this next one really isn’t very pretty,” said Lynteris, showing one of the thousands of images they have already collected. Lynteris probably says this a lot these days. The photo, taken in Madagascar in 1899, feels familiar. The tents. The pits. The suited spacemen. If not for the sepia, this could be West Africa in 2014.

“There is a clear visual paradigm of plague inherited from imperial and colonial history that is emerging as we gather more and more images, an expression of diseased environments we still live with,” explained Lynteris.

“The visual paradigm in the Madagascar photo is replicated throughout the third pandemic and in other outbreaks since, even over a century later, despite the fact that the medical paradigm has completely shifted – we know far more about infectious diseases now than in 1899, so why are we seeing the same imagery? By taking the aesthetic regime from a hundred years ago and replicating it today you are inadvertently replicating a long surpassed medical model.”

Asked whether governments and media are propagating these portrayals because this is what people expect, even need to see, Lynteris said: “I’m not sure, but something is not right here. It’s the components and rationale behind these visual paradigms that we will explore.”

Not all the imagery is gruesome. Some resemble forensic architectural photos. “When the plague hit the USA, investigators would meticulously photograph every house in the infected area – cellars, floors, beams – looking for clues as to the conditions that facilitate plague.”
 
In another set of images, from an outbreak in Manchuria in 1911 that killed 60,000 people, Lynteris highlights an imperialist propaganda war being fought out in the plague depictions. Russia and China were both trying to claim sovereignty over the area, each determined to prove that it was they who were the most scientific and could tame the plague.

“The Chinese were trying to present an image of high science and hygienic modernity, full of medical teams with microscopes and charts. They depicted plague as an urban planning problem that can be scoured by fire.” There are many pictures of burning houses, but not a single human body.

The Russians went a different way (“it’s a horror show”). The images are entirely militaristic, as if an army invaded a land where everyone was already dead. The aim was to show that the Chinese had no control, that death was rife and unstoppable without Russian force: “it was intended to scare, show oriental barbarity with dogs eating corpses and exposed plague pits.” Images like these are why the ‘plague room’ is kept locked.

The team has had to create a language of plague to make sure the database is fully searchable, and aim to have it live and open access by the time the project finishes in 2018. They are working not just with other anthropologists and medical historians but with epidemiologists. There are fundamental assumptions about plague that life scientists are starting to question, and these archives may hold clues as to what led to mistaken assumptions in the first place.

Most importantly the team is focusing on the relation between the ethics and aesthetics of plague photography. “The implications of this in the age of social media are immense. How do we capture an outbreak like Ebola with our cameras? How does this reflect our responsibility towards the victims, but also in terms of global health?” It’s alarming, Lynteris says, that there seems to be no difference between how we depict outbreaks today and how we did 100 years ago. “In the post-colonial world epidemic photography is still stubbornly colonial.”

Many of the images mentioned in this article are too gruesome to be displayed here. All inset images credit: Wellcome Trust.

A new research project is compiling the largest database of plague imagery ever amassed, focusing on a pandemic that peaked in the early 20th century and continues to this day.

By taking the aesthetic regime from a hundred years ago and replicating it today you are inadvertently replicating a long surpassed medical model
Christos Lynteris
Encoffining body, Changchun, 1911


Cambridge part of winning bid to integrate driverless cars into everyday life


The University of Cambridge is a partner in the ‘UK Autodrive’ consortium, which was announced as the winner of the UK Government’s £10 million ‘Introducing Driverless Cars’ competition in the 2014 Autumn Statement.

The aim of the project, which involves local authorities, technology and automotive businesses and other academic institutions, is to establish the UK as a global hub for the development of autonomous vehicle technologies, and to integrate driverless vehicles into existing urban environments by trialling them in two UK cities.

Not only will the programme help develop the new protocols and connected infrastructure required to deliver future autonomous mobility, it will allow the UK Autodrive team to test public reaction to both driverless cars and self-driving pods.

The funding, provided by Innovate UK, will be matched by the 12 consortium members to create a £19.2 million, three-year project led by design and engineering consultants Arup. The feasibility studies and practical demonstrations will take place in Milton Keynes and Coventry, where the city councils are taking the lead in developing the urban infrastructure technologies required to support driverless mobility.

The University’s role involves looking at the feasibility of driverless public transport (L-SATS, or Low-Speed Autonomous Transport System), assessing the public’s reactions to and perceptions of autonomous vehicles, and assessing their possible impact on congestion. The studies will provide insights for vehicle manufacturers, cities, commercial operators, legislators and insurers to develop the legal framework for the roll-out of autonomous mobility.

On-road testing will include the real-world evaluation of passenger cars with increasing levels of autonomy, as well as the development and evaluation of lightweight fully autonomous self-driving pods designed for pedestrianised spaces.

“Many cars which are available today already have some degree of autonomy, through technologies such as automatic parking,” said Professor John Miles of the Department of Engineering, who designed the programme and established the consortium. “People are starting to accept many of these features as commonplace, and we will be testing some of the more advanced ‘driver assist’ technologies in the earlier part of the programme.”

“As well as developing and testing the in-car, car-to-car and car-to-infrastructure technologies that will be required to drive cars autonomously on our roads in the future, the project will also place great emphasis on the role and perceptions of drivers, pedestrians and other road users,” said Tim Armitage, the UK Autodrive Project Director at Arup.

The consortium’s plan for the practical demonstration phases is to start testing with single vehicles on closed roads, and to build up to a point where all road users, as well as legislators, the police and insurance companies, are confident about how driverless pods and fully and partially autonomous cars can operate safely on UK roads.

“Cars that drive themselves would represent the most significant transformation in road travel since the introduction of the internal combustion engine,” said Nick Jones, lead technologist at Innovate UK. “There are so many new and exciting technologies that can come together to make driverless cars a reality, but it’s vital that trials are carried out safely, that the public have confidence in that technology and we learn everything we can through the trials so that legal, regulation and protection issues don’t get in the way in the future.”

Business Secretary Vince Cable said: “The UK is a world-leader in the development of driverless technology, and today’s announcement will see driverless cars take to city streets from 1 January. This not only puts us at the forefront of this transformational technology but it also opens up new opportunities for our economy and society.”

“Through the government’s industrial strategy we are backing the automotive sector as it goes from strength to strength. We are providing the right environment to give businesses the confidence to invest and create high skilled jobs.”

The partners in the consortium are Arup, Milton Keynes Council, Coventry Council, Jaguar Land Rover, Ford Motor Company, Tata Motors European Technical Centre, RDM Group, MIRA, Oxbotica, AXA, international law firm Wragge Lawrence Graham & Co, the Transport Systems Catapult, the University of Oxford, University of Cambridge, and the Open University.

The University is a partner in a three-year, multi-million pound project which will test public reaction to driverless cars, and conduct real-world testing on public roads around Milton Keynes and Coventry.

People are starting to accept many of these features as commonplace, and we will be testing some of the more advanced ‘driver assist’ technologies in the earlier part of the programme
John Miles
Google driverless cars


Using genome sequencing to track MRSA in under-resourced hospitals


Researchers from the University of Cambridge have used genome sequencing to monitor how the spread of methicillin-resistant Staphylococcus aureus (MRSA) occurs in under-resourced hospitals. By pinpointing how and when MRSA was transmitted over a three-month period at a hospital in northeast Thailand, the researchers are hoping their results will support evidence-based policies around infection control.

MRSA is a common cause of hospital-acquired infections, with the largest burden of infections occurring in under-resourced hospitals in the developing world. Whereas genome sequencing has previously been applied in well-resourced clinical settings to track the spread of MRSA, how transmission occurs in resource-limited settings is unknown. In a new study published today (9 December) in the journal Genome Research, researchers used genome sequencing to understand the spread of MRSA in a hospital with high transmission rates.

“In under-resourced hospitals and clinics, formal screening procedures for MRSA are not in place,” said Professor Sharon Peacock of the University of Cambridge and the Wellcome Trust Sanger Institute, who led the research. “Filling gaps in our understanding of how MRSA spreads in such settings is important, since this not only highlights the problem but also provides direction to interventions that tackle this and other hospital-based pathogens.”

The team of researchers from the UK, Thailand and Australia monitored all patients on two intensive care units (ICUs) at a hospital in northeast Thailand over a three-month period in order to track how and when MRSA was transmitted. During this time, five staff members and 46 patients tested positive at least once, which represented 16% of adult and 34% of paediatric patients. 

Conventional bacterial genotyping approaches do not provide enough discrimination between closely-related MRSA strains to be able to pinpoint transmission from one person to another, but whole genome sequencing addresses this problem. A total of 76 MRSA populations, or isolates, were sequenced, including up to two repeat isolates from patients who tested positive for MRSA in the first screen. None of the patients or staff members who tested positive for MRSA were asymptomatic carriers. 
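
The extra resolution that whole genome sequencing provides can be illustrated with a toy example – not the study’s actual analysis pipeline – in which pairwise SNP distances between isolates are used to flag pairs close enough to suggest recent transmission. The isolate names, variant positions and threshold below are invented for demonstration:

```python
# Toy illustration of using pairwise SNP distances to flag possible transmission.
# Isolate names, variant positions and the threshold are invented for demonstration;
# they are not data from the Genome Research study.

from itertools import combinations

# Each isolate is represented by the set of core-genome positions at which it
# differs from a common reference sequence.
isolates = {
    "patient_A": {1021, 50433, 77012, 90210},
    "patient_B": {1021, 50433, 77012},           # 1 SNP from patient_A
    "patient_C": {1021, 200345, 310998, 420001, 512345, 600123},
}

def snp_distance(a, b):
    """Number of positions at which two isolates differ (symmetric difference)."""
    return len(isolates[a] ^ isolates[b])

TRANSMISSION_THRESHOLD = 5   # assumed: very few SNPs => plausible recent transmission

for a, b in combinations(isolates, 2):
    d = snp_distance(a, b)
    verdict = "possible transmission" if d <= TRANSMISSION_THRESHOLD else "unrelated"
    print(f"{a} vs {b}: {d} SNPs -> {verdict}")
```

Conventional typing methods, by contrast, assign all such isolates to the same broad type and cannot make these fine distinctions.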

By conventional typing, all of the MRSA identified belonged to sequence type 239, the dominant MRSA lineage worldwide. But, based on sequence data, there was considerable genetic diversity – including the presence or absence of clinically important genes such as those coding for antiseptic resistance and antibiotic resistance.

“A striking result from sequence data was the presence of multiple distinct clades, which suggests that several different variants of MRSA were circulating through the hospital at the same time,” said Peacock. “We also confirmed numerous transmission events between patients after admission to the ICU, and identified a ‘super-spreader’ in each unit.”

“Studies such as this provide information to help inform policy,” said Peacock, who is a member of the Department of Medicine and the Department of Pathology, and a Fellow of St John’s College. “It also highlights – in a concrete way – the importance of infection control including effective implementation of hand-washing, which is the most effective way to control MRSA.”

Following the results of the study, the hospital has implemented a comprehensive hand-washing policy, a project which is being overseen by Ben Cooper, one of the paper’s co-authors.

The research was funded by the Medical Research Council.

Whole genome sequencing of MRSA from a hospital in Asia has demonstrated patterns of transmission in a resource-limited setting, where formal screening procedures are not feasible.

This study highlights – in a concrete way – the importance of infection control including effective implementation of hand-washing, which is the most effective way to control MRSA
Sharon Peacock
Micrograph of Methicillin-Resistant Staphylococcus aureus (MRSA)


Delays in cancer diagnosis unlikely to be due to poor medical practice


In an opinion piece published today, a team of researchers including Dr Georgios Lyratzopoulos from the Department of Public Health and Primary Care at the University of Cambridge question government plans to rank general practices according to how promptly patients later found to have cancer were referred to specialist services for suspected cancer.

Most patients who have cancer diagnosed after the onset of symptoms are referred after one or two GP consultations (80%), but a substantial minority (20%) have three or more consultations before referral, explain Lyratzopoulos and colleagues. This number is often considered by policy makers and cancer charities to reflect an avoidable delay.

While multiple GP consultations prolong diagnostic intervals and may affect clinical outcomes and care experience, they largely reflect the diagnostic difficulty (or ‘symptom signature’) of different cancers and the need for initial investigations, argue the authors. For example, cancers with fairly specific signs and symptoms (such as a palpable breast lump or a visible lesion) are easier to suspect and are less associated with multiple consultations than those with non-specific symptoms (such as back or abdominal pain).

“Diagnosing cancer soon after the onset of symptoms can better the outcomes for patients,” says Dr Lyratzopoulos. “But in order to achieve progress we need to have a clear understanding of why delays may occur. The reasons are multi-faceted and reviewing the evidence we find that professional performance is an unlikely cause of delays. So we believe that government plans to rank general practices by referral times are unlikely to be effective.”

Instead, the researchers believe that strategies to improve the speed of diagnosis may include clinical decision support tools for doctors to use during the consultation, a greater degree of communication between general practitioners and specialists, and easier access to specialist tests such as scans and endoscopies. But they stress that novel diagnostic tests will need to be developed for cancers that are more difficult to detect.

The authors highlight that in some patients, delays in diagnosis can also occur before patients present to doctors or after GPs have referred the patient. They also advocate better information for the public, the media, and policy makers about the origins of prolonged intervals between presentation and diagnosis of cancer.

The research was funded by the National Institute for Health Research and Cancer Research UK.

Adapted from a BMJ press release

Reference
Georgios Lyratzopoulos, Jane Wardle, and Greg Rubin. Rethinking diagnostic delay in cancer: how difficult is the diagnosis? BMJ; 10 Dec 2014

Delays in referrals for suspected cancer are unlikely to be down to poor performance by GPs, argue a team of researchers today in the British Medical Journal. Instead, they say that such delays largely reflect limitations in scientific knowledge and in the organisation and delivery of healthcare.

Diagnosing cancer soon after the onset of symptoms can better the outcomes for patients. But in order to achieve progress we need to have a clear understanding of why delays may occur
Georgios Lyratzopoulos
Stethoscope and notes


Supplement could reduce heart disease risk in people of low birth weight


Researchers at the Institute of Metabolic Science fed low birth weight rats a supplement of the molecule co-enzyme Q (CoQ) and found that in those rats that grew quickly after birth, the supplement prevented cells in the aorta from ageing prematurely, which can lead to heart disease. Scientists have known for some years that babies with a low birth weight who grow quickly are more likely to develop heart disease than those with a normal birth weight. This new study, published in The FASEB Journal, has identified a novel mechanism underlying this phenomenon and suggests a possible treatment.

Researchers funded by the British Heart Foundation and the Medical Research Council (MRC) fed pregnant rats either a control diet or a low protein, high carbohydrate diet. The mothers fed the low protein diet had pups with a low birth weight, but which grew quickly when suckled by a control-fed mother. When the researchers examined the aorta from these rats, they found that their cells had aged more quickly than those from the normal birth weight offspring and that this was associated with a deficit in CoQ in the aorta.

When the researchers gave the low birth weight rats extra CoQ in their diet after weaning, they found that this prevented the accelerated ageing of, and damage to, their aortas. CoQ is produced naturally in the body and is required to ensure that the mitochondria – the cells’ ‘batteries’ – work properly and to protect cells from oxidative stress caused by highly reactive molecules known as free radicals, which can damage proteins, membranes and genes.

The team also found that CoQ is reduced in white blood cells from low birth weight offspring, and hence that CoQ levels in blood cells could be used as an indicator of how much damage there is in the aorta.

“Our study has answered a question that has puzzled doctors for some time now – why children of low birth weight who grow quickly are prone to heart disease in later life,” explains Professor Susan Ozanne from the MRC Metabolic Diseases Unit, who led the study. “We believe it’s because they are deficient in co-enzyme Q. As this molecule is also deficient in the individual’s blood cells, it may be possible to develop a simple blood test capable of indicating the amount of damage to their aorta and therefore how likely they are to develop heart disease.”

Dr Jane Tarry-Adkins, first author on the study, adds: “Although our study is only in rodents, it may one day have major implications for both the prevention and early treatment of heart disease. It suggests that it may be possible to treat at-risk individuals with a safe and cost-effective supplement that has the potential to prevent heart disease before they display any symptoms of the disease.”

Globally, cardiovascular disease is responsible for more deaths than any other disease, claiming an estimated 17.3 million lives in 2008, a number which is predicted to grow to over 23.3 million by 2030. Reliable, early diagnostic tests for cardiovascular disease risk could help reduce this burden. The researchers plan to establish whether their findings can be confirmed in humans and therefore make their prognostic potential a realistic possibility.

Reference
Tarry-Adkins, JL et al. Nutritional programming of Coenzyme Q – potential for prevention and intervention? FASEB; 29 Aug 2014

A simple supplement could be a safe and cost-effective way of reducing heart disease in individuals born with a low birth weight, suggests research from the University of Cambridge. The study, carried out in rats, also raises the possibility of developing a blood test to indicate how much damage there is in the aortas of these individuals.

Our study has answered a question that has puzzled doctors for some time now – why children of low birth weight who grow quickly are prone to heart disease in later life
Susan Ozanne
Heart pulse


‘Crown jewels’ of English lute music go online


Cambridge Digital Library is launching a new Music Collection with the online release of the 'crown jewels' of English lute music. Dating from the late 16th and early 17th century, the manuscripts contain handwritten copies of scores by John Dowland, Francis Cutting and dozens of other early modern composers.

Digital versions of the manuscripts will go online today (Thursday, 11 December 2014) as the first items in a new digital Music Collection, which will grow to reflect Cambridge University Library's important holdings in this area. The Library’s holdings range from music scores and texts on music to ephemera and concert programmes to archival materials documenting the life and work of composers. Such items play a crucial role in the preservation of musical heritage on a national and international level.

The new online collection of lute music comprises high-resolution zooming images of around 650 pieces contained in eight manuscripts, allowing full access to these unique items to anyone with an internet connection. Pieces from the collection range from celebratory jigs and dances, to popular ballads and sorrowful music for funerals, giving an extraordinary insight into the role and uses of music in early modern England.

The digitised collection has been created in collaboration with the Lute Society. The images are accompanied by scholarly descriptions of the manuscripts’ physical properties and inventories of their contents. Four of the eight manuscripts were handwritten by Mathew Holmes, choirmaster at Westminster Abbey until 1621.

The hundreds of pieces, crammed into more than 600 pages, preserve a cross-section of the lute repertoire in common use in England during the period 1580–1615, the ‘Golden Age’ of English lute music. Together they form the leading source for the music of the best-known lute composers of the English Renaissance. The manuscripts shed light on the Tudor celebrities to whom some of the pieces are dedicated – from the courtier Sir Walter Raleigh to the Shakespearean actor Will Kemp.

John Robinson of the Lute Society commented: “A huge number of manuscripts have been lost over the centuries, which makes the survival of this collection all the more remarkable. Today it is an invaluable legacy for professional musicians and musicologists as well as amateur enthusiasts. Digitisation means that the original sources of lute music can be viewed, studied and played by people worldwide.”

Also digitised are personal items. Shown online for the first time are the lute book of an English government official, which remained in his family for over 350 years, and a manuscript painstakingly reassembled from fragments cut up and used in the bindings of later printed books.

The European lute was probably derived from the Arabic Ud and was one of the most important musical instruments during the Middle Ages and Renaissance. The first lute music to have been written down dates from the late 15th century. English Tudor court musicians included paid lutenists. Learning the lute was an important part of the education of royalty and nobility.

The 1580s onwards saw the emergence of composers who developed the characteristically English lute music that flourished into the second decade of the 17th century. Among them are John Johnson, Francis Cutting, Anthony Holborne, Daniel Bacheler, John Dowland and Robert Johnson. This crucial period in English lute music coincides with the copying of the Cambridge lute manuscripts.

Robinson said: “The manuscripts are written in French tablature with the notation providing a guide to where to put the fingers on the lute neck, rather like chord shapes in modern guitar tutors. Nearly all the music is for Renaissance lute which is tuned as a guitar except the third string is a semitone lower on the lute. The unfamiliarity with tablature notation led to lute sources being largely excluded from mainstream musicology. Now both amateurs and professionals are taught to read it and generally prefer to sight read tablature, especially from copies of original sources.”
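
As a much-simplified illustration of how French tablature maps finger positions to pitches – assuming the standard nominal six-course Renaissance tuning in G and ignoring rhythm flags, ornaments and bass diapasons – the sketch below converts fret letters on each course into note names:

```python
# Simplified French-tablature decoder.
# Tuning, letter conventions and the example chord are assumptions for illustration;
# real tablature also encodes rhythm flags, ornaments and bass diapasons.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Open-string pitches of an assumed six-course Renaissance lute in G,
# given as MIDI note numbers, course 1 (highest) to course 6 (lowest).
OPEN_COURSES = {1: 67, 2: 62, 3: 57, 4: 53, 5: 48, 6: 43}  # g' d' a f c G

# In French tablature, 'a' = open string, 'b' = first fret, 'c' = second, and so on.
FRET_LETTERS = "abcdefghikl"   # 'j' traditionally skipped

def decode(course: int, letter: str) -> str:
    """Return the note name sounded by a fret letter on a given course."""
    midi = OPEN_COURSES[course] + FRET_LETTERS.index(letter)
    return NOTE_NAMES[midi % 12]

# A hypothetical chord: fret letters on courses 1, 2, 3 and 6
chord = {1: "a", 2: "a", 3: "c", 6: "a"}
print([f"course {c}: {decode(c, l)}" for c, l in chord.items()])
# -> ['course 1: G', 'course 2: D', 'course 3: B', 'course 6: G'] (a G major chord)
```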

In putting this magnificent collection online, Cambridge University Library is not only making the manuscripts available to the existing international community of lute enthusiasts, but also reaching new audiences. Anne Jarvis, Cambridge University Librarian, said: “We are delighted to have had the opportunity to work with the Lute Society to put these unique manuscripts online as the first documents in our new Music Collection.”

Cambridge Digital Library’s new Music Collection is being launched with a private recital at the University of Cambridge by one of the world’s leading lutenists, Jakob Lindberg, who will be performing pieces found in these recently digitised manuscripts.

Handwritten copies of scores by composers of English lute music have been digitised in a programme to make a precious legacy available to professional and amateur musicians around the world. 

The collection is an invaluable legacy for professional musicians and musicologists as well as amateur enthusiasts. Digitisation means that the original sources of lute music can be viewed, studied and played.
John Robinson, Lute Society
Mr Knight’s galliard by John Dowland

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Gift will support financially disadvantaged students


A landmark donation from the Reuben Foundation will support the most financially disadvantaged undergraduate students in meeting the costs of their studies at Cambridge from 2014.

This is the largest Reuben Scholarship Programme established to date, and with match funding from the Cambridge Bursary Scheme, it will support at least 90 students over the next five years.

The Foundation’s generous gift is the largest made to the Cambridge Bursary Scheme for some time.

“I’m delighted the Reuben Foundation has enabled us to provide these Bursaries, which will make a significant contribution towards the accommodation and subsistence costs of studying at Cambridge,” said Professor Sir Leszek Borysiewicz, Cambridge’s Vice-Chancellor.

“A Cambridge education is a transformational experience, and their generosity will help ensure that basic living costs do not prevent talented students from benefiting from it.”

The Reubens said: “We are delighted to see the continued expansion of the Reuben Scholarship Programme, and look forward to a long partnership with Cambridge which will benefit many bright students over the coming years”.

The Reuben Scholarship Programme was created by the Reuben Foundation in 2012 in association with the University of Oxford, University College London and ARK Schools. The donation will be administered for the University by the Isaac Newton Trust.

 

A significant gift will make a major contribution to ensuring no student is deterred from studying at Cambridge due to their financial circumstances.

A Cambridge education is a transformational experience, and their generosity will help ensure that basic living costs do not prevent talented students from benefiting from it.
Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge
More information
  • The Reuben Foundation is a UK registered charity focused on making essential contributions to healthcare and education in the UK and worldwide. Amongst many substantial contributions, the Reuben Foundation has funded major paediatric cancer and virology units at Great Ormond Street Hospital in London, and launched the ‘Team London’ volunteering scheme in partnership with the Mayor of London, which has raised over 100,000 new volunteers for existing voluntary organisations in the capital, allowing them to work more effectively to improve quality of life, offer more opportunities to youth and help lower crime. The Reuben Foundation is also principal funder of the BFI Reuben Library at BFI Southbank, which has become the leading centre for film knowledge in the UK, holding the world’s largest collection of written materials on film and television. For further information, please visit www.reubenfoundation.com.

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Notes from Makeni: Fighting Ebola in West Africa


A Professor of Virology from the University of Cambridge’s Department of Pathology arrived in Sierra Leone on December 1 to help set up a new diagnostics centre, amid reports that the West African country is now the worst-hit by the Ebola crisis.

Ian Goodfellow, a norovirus specialist in the Department of Pathology, is part of the team of volunteers coordinated by Public Health England (PHE) to assist in the construction and running of the laboratory in Makeni, 120 miles to the north-east of the capital, Freetown. The laboratory is one of three being built in the country under the supervision of the UK Royal Engineers, with support from the Department for International Development.

One of the main difficulties faced by Sierra Leone and its neighbours in responding to the Ebola outbreak has been their limited capacity to test for the virus. Test results can currently take more than five days to come back. The Makeni lab, once it is fully functional, will be able to turn samples around in 24 hours or less. Together with the labs recently set up in Kerry Town and Port Loko, this will quadruple the country’s testing and diagnostic capacity, allowing health workers to isolate patients and contain the spread of the disease.

“Without a doubt the greatest challenge for our lab team so far has been logistics,” said Professor Goodfellow, writing from Makeni. “As we are the first team into the site we are responsible for setting up the laboratory. All the equipment and various reagents we need in the lab have had to be shipped from the UK, but getting them to Makeni has proven a real challenge.”

One part of that challenge has been tracking down missing equipment and reagents. “In many cases this has involved people driving around the various logistics hubs in Sierra Leone, going from one tent to another and opening boxes.” Once found, the items still have to be sent to the Makeni site by various forms of transport and manually off-loaded due to the lack of heavy lifting equipment. “As you can imagine, as lab scientists we are not accustomed to moving over 10 tons of equipment and reagents by hand. Doing this in 35°C heat with over 90% humidity on a rather dangerous and very active building site has been testing.”

The lab is expected to open before the treatment centre, so that diagnostic capacity in the country as a whole can be increased.

Considering that the Makeni site has only been under construction for six weeks, he says: “It's really amazing to see how quickly the centre has been built. Everyone involved has done a fantastic job. The international community has now made real efforts to commit resources and expertise to control the outbreak but, as you can imagine, it is simply not possible to build these types of treatment centres overnight. Added to this is the need to recruit and train people effectively in the operation of such a centre. Safe working in these conditions is critical and is not something that can be done in an instant.”

The lab team, he adds, “is made up of 10 people from various parts of the UK and from various backgrounds. The majority are biomedical scientists with experience in clinical diagnostics. But we also have people from the PHE research institute at Porton Down. In addition there are two academics on the team. Neither one of us has worked with Ebola previously, but we are both experienced in containment and working with highly infectious materials."

Commenting on how his perception of the Ebola outbreak has evolved since his arrival, Professor Goodfellow says: “The most striking thing I've noticed is how ‘normal’ life appears here. The locals are going about their daily business as usual and the only evidence of the Ebola crisis on the ground is the lack of physical contact between people, the presence of bleach buckets everywhere and the various posters advising people how to avoid infection.”

Responding to the question of what else needs to be done, he said: “My main concern at present is that the media coverage suggests that the Ebola crisis is under control.” Indeed, on the same day as Professor Goodfellow’s arrival in Sierra Leone, the World Health Organisation’s Assistant Director General declared that “the prognosis for Sierra Leone is very good”. This despite the fact that, only days before, the New York Times reported that Sierra Leone was poised to overtake Liberia in the number of Ebola cases.

Professor Goodfellow shows caution in his assessment: “There is evidence of slowing of the number of new cases in Liberia, but new cases are still appearing on a daily basis in all three affected countries [Sierra Leone, Liberia, Guinea]. A slowing of new cases is an indication that the outbreak is controlled. But it is essential that a response is sustained until every single Ebola infection is prevented. Without a sustained response it is likely the outbreak will continue for a long time, with many more fatalities.”

Commenting on Professor Goodfellow’s contribution to the efforts to contain the Ebola outbreak, Professor David Dunne, Director of both the Wellcome Trust-Cambridge Centre for Global Health Research (WT-CCGHR) and the Cambridge-Africa Programme, said: “Ian became directly aware of the problem of slow Ebola diagnosis in August, when he was a member of a WT-CCGHR delegation to discuss Cambridge-Africa support for health research training in the Gambia. The day we arrived in the Gambia, the first Senegalese Ebola case was confirmed in Dakar, just over a hundred miles away. Ian saw that slow diagnosis was a major worry for MRC-Gambia hospital staff who were contingency planning for Ebola’s arrival there. Thankfully, Senegal had no secondary cases, so Ebola did not arrive. But Ian realised that his professional skills could directly help the people of West Africa in this crisis, and it became inevitable that he would make this happen. Our University is committed to supporting African researchers through the Cambridge-Africa programme, and Cambridge-Africa is proud of him.”

Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge and Chairman of the UK International Development and Research Advisory Committee (DfID) added: “Ian Goodfellow’s exemplary commitment makes us all proud.  His decision to use his knowledge and experience as a volunteer, setting up the Makeni lab in Sierra Leone, underscores the University's mission: 'to contribute to society through the pursuit of education, learning and research at the highest levels of international excellence'. In a time of Grand Challenges, he also reminds us that individual action is the essential component of our response to crises. We are grateful to him, and to the Department and colleagues that are supporting him, in carrying out this important work.”

A University of Cambridge scientist is helping the efforts to contain the Ebola outbreak in Sierra Leone.

It is essential that a response is sustained until every single Ebola infection is prevented.
Professor Ian Goodfellow

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


New understanding of how magma moves underground


Using the most extensive dataset ever gathered from a volcanic eruption, an international team of researchers have developed a model of how huge magma-filled cracks form underneath volcanic systems and how they spread. The researchers, including scientists from the University of Cambridge, will use the results to help predict how molten rock moves underground, and whether or not it erupts.

A volcanic eruption in the Holuhraun area of central Iceland has now lasted over 100 days, with no end in sight. The eruption has received widespread attention and scientists have followed the activity closely since its onset at the Bárðarbunga central volcano.

Using GPS geodetic measurements, interferometric analysis of satellite synthetic aperture radar images and earthquake observations, a team of scientists, including researchers from the University of Cambridge, has constructed a model for the formation of a huge magma-filled crack, or dyke, in late August 2014. Details are published today (15 December) in the journal Nature.

The dyke stretches for more than 45 kilometres, from the Bárðarbunga central volcano to the Holuhraun site, where magma has been erupting since the summer.

The dyke mostly formed in the two weeks before the main eruptive activity began. Its average opening is approximately 1.5 metres, extending from just beneath the surface to six kilometres down. The dyke volume grew to 0.5 cubic kilometres before the main eruption began. A model for the dyke also explains the unusual and varying direction of its segments, which relates to the interaction between topography and stresses in the ground caused by divergent plate movements in Iceland.
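
As a rough consistency check (our own back-of-envelope arithmetic, not a calculation from the paper), the quoted dimensions imply a volume of the same order as the 0.5 cubic kilometres reported:

# Approximate dyke volume from the figures quoted above (illustrative only).
length_m = 45_000    # dyke length: more than 45 kilometres
opening_m = 1.5      # average opening
depth_m = 6_000      # vertical extent, from just beneath the surface to about 6 km down
volume_km3 = length_m * opening_m * depth_m / 1e9
print(f"approximate volume: {volume_km3:.2f} cubic km")  # ~0.41, the same order as the 0.5 reported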

The rate of the dyke’s spread slowed as the magma reached natural barriers, leading to a build-up of pressure, which eventually broke through the barrier and created a new segment of crust. The results show how focused upwelling of magma under central volcanoes is effectively redistributed over long distances to create new upper crust at such divergent plate boundaries.

“This is probably the best-documented eruption ever,” said Professor Bob White of Cambridge’s Department of Earth Sciences, who used up to 70 broadband seismometers to monitor activity around Bárðarbunga and Holuhraun.

The team hopes that similar studies could be carried out in near real time to improve understanding of, and the ability to forecast, the evolution of lateral dykes in various tectonic settings.

The team of researchers, coordinated by the University of Iceland, also included researchers from the Icelandic Meteorological Office and eight universities in other countries. The research is part of the European research project FUTUREVOLC, funded by the European Union.

An international team of geoscientists have demonstrated how magma-filled cracks form and spread underneath volcanic systems, such as the one extending from Iceland’s Bárðarbunga volcano to an eruptive site which has now been active for more than 100 days.

This is probably the best-documented eruption ever
Bob White
Bárðarbunga and Holuhraun

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


The lost art of risk management


Industrial investment is getting riskier. Globalisation has brought with it a wealth of opportunities for multinationals, such as access to low-cost labour and new markets, but it also has its downsides. Companies find themselves navigating a dynamic and unpredictable business environment and exposure to continuous change increases risk – both in maintaining global operations and in managing industrial investment. At the Centre for International Manufacturing, we have studied some of the world’s largest manufacturing companies and concluded that while they recognise the need to manage their risk more effectively, they do not have rigorous processes in place to identify, assess, manage and monitor the risks associated with investing in new projects.

Since the banking crisis in 2008, the world has woken up to the consequences of unregulated risk. Managing risk is now a major issue for corporate finance and governance and real progress has been made in understanding and articulating financial risk. However, there seems to be little evidence that the risks associated with the globalisation of manufacturing are being systematically managed, even though an ill-advised internationalisation project can jeopardise a company’s future. What appears to be happening is that companies are applying some of these new instruments of financial risk analysis to their global manufacturing investments. But these are complex tools designed for a different job and they can distort the fundamentals of risk management by failing to take account of the risks that are particular to manufacturing.

One of the main causes for concern is that companies seem to manage risk through a variety of methods, none of which has been designed specifically for the task. And while some of these methods are explicit, many are implicit, and all tend to be embedded within the company’s regular strategic and financial planning and evaluation processes. There is, in fact, no generally recognised comprehensive and systematic approach to analysing and mitigating the risks associated with industrial investments. In addition, risk management is usually carried out at the corporate level which tends to categorise risks quite broadly and may fail to consider the full spectrum of risk to which the company’s global operations may be vulnerable. This means that too few risks are being identified and those that are, are not being evaluated objectively. Too great a reliance on non-scientific methods of risk quantification also means that their assessment of the magnitude and likelihood of risks is often based on assumptions and the consequences of accumulated risk are not taken into account.

Other weaknesses in risk management are associated with organisational structures and cultures. It is often unclear who within the organisation has responsibility for particular kinds of risk. And even when people or teams do take ownership, they face the challenge common to all highly complex multinational organisations – the difficulty of sharing knowledge across functions which may be scattered across different sites around the world.

How can we help companies do better? It is clear from our research that companies need a formal, systematic process for risk management that has been designed specifically for global manufacturers. But first, we need to think more clearly about how we classify risk. The top-level view of risk is based on three broad categories: organisational, operational and external risk. Organisational risk relates to corporate strategy; although strategy may have little apparent impact on individual investment projects, if something does go wrong at this level the consequences for a project are likely to be significant. Operational risk relates to the complete set of activities which takes place across the value chain – project management, R&D, procurement, production, distribution and sales and marketing. In other words, any activity that can generate value for the company is also subject to risk. External risk relates to those economic, political and environmental shocks which may be difficult to predict but should, with the right mitigation capabilities in place, be possible to withstand.

These three types of risk are all connected. A risk in one area is liable to precipitate risk in a number of others. External economic slowdown, for example, can trigger a risk for production and for sales and marketing. Similarly, problems with the quality of raw materials – another external risk – create risk in procurement which in turn affects production, sales and marketing and after-sales service. So we could, in theory, develop a list of potential risks to help with classification. However, because risk is not static this can produce misleading and incomplete results.

Instead, by using a typology of industrial investment risks, companies can focus on the sources of risk in order to identify the specific risks to their project.
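
To make these interdependencies concrete, the sketch below (our own illustration; the risk areas and links are hypothetical rather than drawn from the Centre's research) traces how a single triggered risk propagates through the value chain:

# Illustrative only: risk interdependencies modelled as a small directed graph.
RISK_LINKS = {
    "economic_slowdown": ["production", "sales_and_marketing"],
    "raw_material_quality": ["procurement"],
    "procurement": ["production"],
    "production": ["sales_and_marketing", "after_sales_service"],
}

def affected(trigger, links=RISK_LINKS):
    """Return every risk area reachable from the triggering risk."""
    seen, stack = set(), [trigger]
    while stack:
        node = stack.pop()
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return sorted(seen)

print(affected("raw_material_quality"))
# ['after_sales_service', 'procurement', 'production', 'sales_and_marketing']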

Having better understood how to classify risk we can develop a more rigorous process for risk management. And we have been doing just this, bringing together our research findings to develop a systematic approach to risk identification, assessment, administration and monitoring. For each of the following steps we have developed a set of structured approaches and analysis tools:

Identifying risk: first, you need to identify what changes to the business the project is likely to cause and categorise those changes. This then forms the basis of the risk analysis, looking first at how the project will affect generic risks and, from that, identifying specific risks. The specific risks should then be considered in the context of the investment objectives and potential changes to the network in order to build a qualitative rationale for prioritising risks. At the end of this first step, you should have a clear understanding of the key risks for your investment.

Assessing and managing risk: the second and third steps are connected. Once you have established the key risks, you can review them using probability and impact assessment tools. Only then can you develop appropriate risk mitigation capabilities at the operational network and project levels. When these are in place, you need to reassess your risk, taking your new risk mitigation capabilities into account. At this point, you should also go back to your financial investment valuation model to take account of the revised risk and develop scenarios based on your new risk mitigation capabilities.

Monitoring risk: Once under way, the project needs continual monitoring using both risk and risk mitigation capability indicators. By systematically reassessing risk, mitigation capabilities can be adjusted as the risk assessment changes. A structured approach of this kind also supports integration with other investment projects so that risk can be monitored at the project portfolio level.
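
A minimal sketch of the assessment and monitoring logic behind these steps is given below, assuming a simple probability-times-impact score and an illustrative mitigation factor; the risk names, scales and numbers are our own examples, not the Centre's actual tools:

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float              # likelihood of the risk materialising, 0.0-1.0
    impact: float                   # consequence if it does, e.g. expected cost in £m
    mitigation_factor: float = 1.0  # 1.0 = no mitigation capability in place yet

    def exposure(self):
        """Expected exposure after mitigation; used to prioritise and to re-monitor."""
        return self.probability * self.impact * self.mitigation_factor

risks = [
    Risk("supplier quality failure", probability=0.30, impact=12.0),
    Risk("local regulatory change", probability=0.10, impact=40.0),
]

# Assess and prioritise, then reassess once a mitigation capability is in place.
for r in sorted(risks, key=lambda r: r.exposure(), reverse=True):
    print(r.name, round(r.exposure(), 1))

risks[0].mitigation_factor = 0.4    # e.g. dual sourcing reduces the expected exposure
print(risks[0].name, round(risks[0].exposure(), 1))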

A systematic approach to risk management which has been designed specifically for industrial investment has several important benefits. Say you were to consider investing in a high-return-oriented plant to manufacture product X using technology Y at a particular geographic location. This approach could tell you if the risk will outweigh the expected return, taking account of the risk interdependencies and the cost of putting in place mitigation capabilities. None of the traditional approaches to risk management used by multinationals would be able to arrive securely at that conclusion. As well as helping companies improve their risk management, this approach also provides better risk reporting for regulators, investors and auditors.

Most global manufacturing process improvement programmes are limited to capability development and performance measurement, and the notions of ‘risk’ and ‘risk management’ tend to be mentioned in a somewhat cavalier manner. This approach puts risk and risk management at the heart of the global manufacturing process.

Dr Mukesh Kumar from the Centre for International Manufacturing suggests that multinational manufacturers are taking unnecessary risks with their industrial investments – and he offers a solution.

There seems to be little evidence that the risks associated with the globalisation of manufacturing are being systematically managed, even though an ill-advised internationalisation project can jeopardise a company's future
Mukesh Kumar
Two Backgammon Players, from the Book of Games, Chess, Dice and Boards, 1282 (vellum)

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Lighter planes are the future


The study, by the Universities of Sheffield, Cambridge and UCL (University College London), is the first to carry out a comprehensive life cycle assessment (LCA) of a composite plane, such as the Boeing 787 Dreamliner or Airbus A350, and extrapolate the results to the global fleet.

The LCA covers manufacture, use and disposal, using publicly available information on the Boeing 787 Dreamliner fuselage and from the supply chain – such as the energy usage of the robots that construct the planes. The study compares the results to those for traditional – and heavier – aluminium planes.

Emissions during the manufacture of composite planes are over double those of aluminium planes. But because the lighter aircraft use significantly less fuel, these increased emissions are offset after just a few international flights. Over its lifetime, a composite plane creates up to 20 per cent fewer CO2 emissions than its aluminium equivalent.
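
To make the offset logic concrete, here is a back-of-envelope sketch; the tonnage figures are hypothetical placeholders chosen only to show the shape of the calculation, not values from the published LCA:

# Hypothetical numbers, purely illustrative of the break-even reasoning above.
extra_manufacturing_co2_t = 120.0   # assumed extra CO2 (tonnes) to build a composite airframe
fuel_saving_per_flight_t = 30.0     # assumed CO2 saved per long international flight

break_even_flights = extra_manufacturing_co2_t / fuel_saving_per_flight_t
print(f"manufacturing penalty offset after about {break_even_flights:.0f} flights")

lifetime_flights = 20_000           # assumed number of flights over the aircraft's life
net_saving_t = lifetime_flights * fuel_saving_per_flight_t - extra_manufacturing_co2_t
print(f"net lifetime saving: {net_saving_t:,.0f} tonnes of CO2")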

Professor in Advanced Materials Technologies at the University of Sheffield, Alma Hodzic, says: “This study shows that the fuel consumption savings with composites far outweigh the increased environmental impact from their manufacture. Despite ongoing debates within the industry, the environmental and financial savings from composites mean that these materials offer a much better solution.”

The researchers fed the data from the LCA into a wider transport model to gauge the impact on CO2 emissions as composite planes are introduced into the global fleet over the next 25 years, taking into account other factors including population, economic prosperity, oil prices and speed of adoption of the new technology.

The study, published in the International Journal of Life Cycle Assessment, estimated that by 2050, composite planes could reduce emissions from the global fleet by 14-15 per cent relative to a fleet that maintains its existing aluminium-based configuration.

Professor in Energy and Transport at UCL, Andreas Schäfer, explains: “The overall emissions reduction for the global fleet is lower than the reduction for an individual plane, partly because, by 2050, not all of the fleet will be of composite construction. New planes entering the fleet before 2020 could still be in use by 2050, but the faster the uptake of this technology, the greater the environmental benefits will be.”
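
The arithmetic behind that point can be sketched very simply; the composite share below is our own assumption for illustration, not a figure from the study:

# Fleet-wide savings scale with how much of 2050 flying is done on composite aircraft.
per_aircraft_saving = 0.20        # up to 20 per cent lower lifetime CO2 per composite plane
assumed_composite_share = 0.75    # assumed fraction of 2050 traffic flown on composite types

fleet_saving = per_aircraft_saving * assumed_composite_share
print(f"fleet-level reduction: {fleet_saving:.0%}")  # 15%, the top of the 14-15% range reported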

Dr Lynette Dray from Cambridge's Department of Architecture agrees: “Given that global air traffic is projected to increase four-fold between now and 2050, changing the materials used could avoid 500 million tonnes of CO2 emissions in 2050 alone, a value that roughly corresponds to current emission levels.”

Professor Hodzic adds: “The industry target is to halve CO2 emissions for all aircraft by 2020 and while composites will contribute to this, it cannot be achieved by the introduction of lighter composite planes alone. However, our findings show that composites – alongside other technology and efficiency measures – should be part of the picture.”

A global fleet of composite planes could reduce carbon emissions by up to 15 per cent, but the lighter planes alone will not enable the aviation industry to meet its emissions targets, according to new research.

Given that global air traffic is projected to increase four-fold between now and 2050, changing the materials used could avoid 500 million tonnes of CO2 emissions in 2050 alone, a value that roughly corresponds to current emission levels
Lynette Dray
Airbus A350 XWB MSN001

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Earliest known piece of polyphonic music discovered


The earliest known practical example of polyphonic music - a piece of choral music written for more than one part - has been found in a British Library manuscript in London.

The inscription is believed to date back to the start of the 10th century and is the setting of a short chant dedicated to Boniface, patron Saint of Germany. It is the earliest practical example of a piece of polyphonic music – the term given to music that combines more than one independent melody – ever discovered.

Written using an early form of notation that predates the invention of the stave, it was inked into the space at the end of a manuscript of the Life of Bishop Maternianus of Reims.

The piece was discovered by Giovanni Varelli, a PhD student from St John’s College, University of Cambridge, while he was working on an internship at the British Library. He discovered the manuscript by chance, and was struck by the unusual form of the notation. Varelli specialises in early musical notation, and realised that it consisted of two vocal parts, each complementing the other.

Polyphony defined most European music up until the 20th century, but it is not clear exactly when it emerged. Treatises which lay out the theoretical basis for music with two independent vocal parts survive from the early Middle Ages, but until now the earliest known examples of a practical piece written specifically for more than one voice came from a collection known as The Winchester Troper, which dates back to the year 1000.

Varelli’s research suggests that the author of the newly-found piece – a short “antiphon” with a second voice providing a vocal accompaniment – was writing around the year 900.

As well as its age, the piece is also significant because it deviates from the convention laid out in treatises at the time. This suggests that even at this embryonic stage, composers were experimenting with form and breaking the rules of polyphony almost at the same time as they were being written.

“What’s interesting here is that we are looking at the birth of polyphonic music and we are not seeing what we expected,” Varelli said.

“Typically, polyphonic music is seen as having developed from a set of fixed rules and almost mechanical practice. This changes how we understand that development precisely because whoever wrote it was breaking those rules. It shows that music at this time was in a state of flux and development; the conventions were less rules to be followed than a starting point from which one might explore new compositional paths.”

The piece is technically known as an “organum”, an early type of polyphonic music based on plainsong, in which an accompaniment was sung above or below the melody.

The fact that it was an early example of music for two parts had probably gone unnoticed because the author used a very early form of musical notation for the polyphonic piece, which would have been indecipherable to most modern readers. “When I tried to work out the melody I realised that the music written above was the same as the one outlined by the notation used for the chant and that this sort of 'diagram' was therefore a two-voice piece based on the antiphon for St Boniface”, Varelli said. “The chant notation essentially gives the direction of the melody and when it goes up or down, the organum notation consistently agreed, giving us also the exact intervals for the chant.”
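
For readers unfamiliar with organum, the sketch below illustrates the general principle of deriving a second voice from a chant line, here at a parallel fourth below. The chant fragment is invented for illustration and is not a transcription of the newly discovered piece, which, as Varelli notes, departs from such strict conventions:

# Illustrative only: an accompanying voice a perfect fourth (5 semitones) below a chant.
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def name(midi):
    return NOTE_NAMES[midi % 12]

chant = [67, 69, 67, 65, 64, 65, 67]     # made-up chant melody as MIDI numbers
organal_voice = [n - 5 for n in chant]   # second voice below the melody

for upper, lower in zip(chant, organal_voice):
    print(f"{name(upper):>2} over {name(lower)}")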

Who wrote the music, and which monastic house it came from, remains a mystery, but through meticulous detective work Varelli has been able to pin its likely origins down to one of a number of ecclesiastical centres in what is now north-west Germany, somewhere around Paderborn or Düsseldorf.

This is partly because the type of plainchant notation – sometimes known as Eastern Palaeofrankish – was most used in Germany at that time. In addition, however, an unknown scribe had added a Latin inscription at the top of the page which, when translated, reads: “which is celebrated on December 1”.

This odd comment, a reference to the Saint’s Day for Maternianus, alludes to the fact that unlike most monastic houses, which celebrated Maternianus on April 30, a handful of communities in north-western Germany did so on December 1. Combined with the notation itself, this makes it likely that whoever wrote the music was based in that region.

“The music was added some time after the main saint’s life was written,” Varelli added. “The main text was written at the beginning of the 10th century, and on this basis, we can conservatively estimate that this addition was made some time in the very first decades of the same century”.

“The rules being applied here laid the foundations for those that developed and governed the majority of western music history for the next thousand years. This discovery shows how they were evolving, and how they existed in a constant state of transformation, around the year 900.”

Nicolas Bell, music curator at the British Library, said "This is an exciting discovery. When this manuscript was first catalogued in the eighteenth century, nobody was able to understand these unusual symbols. We are delighted that Giovanni Varelli has been able to decipher them and understand their importance to the history of music."

The video shows the piece being performed by Quintin Beer (left) and John Clapham (right), both music undergraduates at St John’s College, University of Cambridge.

New research has uncovered the earliest known practical piece of polyphonic music, an example of the principles that laid the foundations of European musical tradition.

Typically, polyphonic music is seen as having developed from a set of fixed rules and almost mechanical practice. This changes how we understand that development precisely because whoever wrote it was breaking those rules.
Giovanni Varelli
The music was written around the year 900, and represents the earliest example of polyphonic music intended for practical use.

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Festive tastes have changed but Christmas is still a cracker


Long-standing festive treats, such as sherry and brandy, are declining in popularity, according to the research by a joint team from Cambridge University Press and Lancaster University.

And the study also reveals that we’re less likely to pour custard over our Yuletide dessert – though we still enjoy a slice of Christmas cake.

Champagne, vodka and gin are now our favourite festive tipples, according to the research team. The study, which compares spoken British English today with recordings from the 1990s, allows researchers to analyse how language has changed.

The research team from Lancaster and Cambridge University Press have revealed how our tastes in food and drink have changed over time. Based on recent recordings, traditional British favourites and festive holiday dinner staples, such as Yorkshire puddings and custard, have been overtaken by takeaways, notably pizza and curry, which have risen in frequency by four and five times respectively.

Despite the rise in convenience food, talk of over-indulgence, especially around holiday seasons like Christmas, is also more prevalent today. Compared to the 1990s, we talk about calories twice as much as we used to.  But our words don’t necessarily translate to actions; our most talked about food item at Christmas – as well as the rest of the year – is cake.

The initial findings also suggest that alcohol is more important to British English speakers than coffee. While tea remains the nation’s favourite drink, mentioned an average of 255 times per million words, hot drinks are dwarfed when compared to booze.
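
The normalisation behind a figure such as ‘255 times per million words’ is straightforward; the sketch below uses invented counts and corpus sizes purely to show the arithmetic, including the kind of percentage change reported below for ‘I’ and ‘me’:

# Illustrative only: normalised frequencies make corpora of different sizes comparable.
def per_million(count, corpus_size_words):
    return count / corpus_size_words * 1_000_000

def percent_change(rate_then, rate_now):
    return (rate_now - rate_then) / rate_then * 100

# Hypothetical counts and corpus sizes.
tea_1990s = per_million(2_550, 10_000_000)  # 255.0 occurrences per million words
tea_2014 = per_million(1_275, 5_000_000)    # 255.0 per million, despite half the data
print(tea_1990s, tea_2014)

print(round(percent_change(300.0, 159.0)))  # -47, the sort of drop reported for 'I'/'me'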

When it comes to Christmas stalwarts, sherry and brandy appear to have fallen out of favour over the last 20 years, replaced by vodka, gin and even champagne, all of which are being talked about more.

While our tastes may have changed, our desire to spend time with our families at Christmas seems unrelenting and, when we aren’t with our families, we are more likely to talk about them.

We talk about family almost twice as much as we used to; the concept of family seems to be more salient in British English speech today than it was two decades ago.

Dr Claire Dembry, Senior Language Research Manager at Cambridge University Press, said: “This analysis presents an interesting insight into how our use of language has changed over time. Our tastes in food and drink have certainly changed, but our interest in the family seems to be ever increasing.

“Coupled with the finding that we are also significantly less self-centred in our speech than we used to be, with a reduction in frequency of words associated with self – we say ‘I’ or ‘me’ 47 per cent less than we used to – this suggests Christmas may be even happier this year.  But perhaps this can be attributed to all the talk of champagne!” 

These are only the initial findings from a small pilot of the project, named the ‘Spoken British National Corpus 2014’, which is now under way. The Corpus is a very large collection of recordings of real-life informal, spoken interactions between speakers of British English from across the United Kingdom.

Professor Tony McEnery, from the ESRC Centre for Corpus Approaches to Social Sciences (CASS) at Lancaster University, explained: “We need to gather hundreds, if not thousands, of conversations to create a full spoken corpus, so we can continue to analyse the way language has changed over the last 20 years.

“This is an ambitious project and we are calling for people to send us MP3 files of their everyday, informal conversations in exchange for a small payment to help our team to delve deeper into spoken language and to shed more light on the way our spoken language changes over time.”

People who wish to submit recordings to the research team should visit: http://www.cambridge.org/bnc. 

The aim of the Spoken British National Corpus 2014 is to compile a very large collection of recordings of real-life, informal, spoken interactions between speakers of British English from across the United Kingdom. The collaboration between Lancaster and Cambridge University Press brings together the best resources available for this task.

Cambridge University Press is greatly experienced at collecting very large English corpora, and it already has the infrastructure in place to undertake such a large compilation project. CASS at Lancaster University has the linguistic research expertise necessary to ensure that the Spoken BNC2014 will be as useful and accessible as possible for a wide range of purposes.

The academic community will benefit from access to a large corpus of British English speech that is balanced according to a selection of useful demographic criteria, including gender, age, region and social class.

The core team includes Professor Tony McEnery – Lancaster University, Dr Claire Dembry – Cambridge University Press, Dr Vaclav Brezina – Lancaster University and Mr Robbie Love – Lancaster University.

Some of Britain’s traditional Christmas favourites are losing their appeal, a new study of spoken English has revealed.

This analysis presents an interesting insight into how our use of language has changed over time
Claire Dembry
Christmas Cake by Eldriva via Flickr

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Research Excellence Framework confirms Cambridge’s global strength and depth in research


Cambridge returned some 2,200 academics to the REF. 47% of its submissions have been awarded the highest rating of 4* overall, meaning they are ‘world-leading’. This is an increase from 32% in 2008. A further 40% of submissions were rated 3* overall (internationally excellent).

Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge, says: “These results demonstrate Cambridge’s strength in depth across research, in particular confirming our global leadership in the pure and applied sciences, clinical medicine, and in subjects as diverse as the Classics and business and management studies.

“The significant increase we have seen both in our average score and in the proportion of our research rated world leading is a reflection of the phenomenal research underway at Cambridge.”

The REF assesses the quality and impact of research submitted by UK universities across all disciplines. The results will be used by the four UK higher education funding bodies to allocate block-grant research funding to universities from 2015-16. It was previously known as the Research Assessment Exercise (RAE), and was last conducted in 2008.

In this year’s REF, the University has also seen a significant increase in its average weighted score – the ‘grade point average’ – for its research, rising to 3.33 this year (from 2.98 in 2008).
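
For readers unfamiliar with how such a grade point average is formed, it is a weighted average of the star ratings in a quality profile. In the sketch below the 4* and 3* shares are those reported for Cambridge, while the 2*/1* split is our own assumption chosen purely to illustrate the arithmetic:

# Grade point average as a weighted average of a quality profile (2*/1* split assumed).
profile = {4: 0.47, 3: 0.40, 2: 0.12, 1: 0.01}  # share of research rated at each star level

gpa = sum(stars * share for stars, share in profile.items())
print(round(gpa, 2))  # 3.33 under this assumed split, matching the reported average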

For the purpose of the REF, each academic discipline was assigned to one of 32 units of assessment (out of a possible 36), such as Clinical Medicine, Chemistry, and Business and Management Studies. Each unit was judged by three criteria – Outputs, Environment and, for the first time, Impact (defined as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’).

Amongst the case studies submitted by the University of Cambridge for Impact was the research that led to a new drug to treat multiple sclerosis. The drug, marketed under the name Lemtrada, is based on a long-standing programme of research at the University of Cambridge and this year received approval by the National Institute for Health and Care Excellence (NICE) for use in people with relapsing-remitting multiple sclerosis. Clinical trials have shown that the drug reduces disease activity, limits the accumulation of further disability over time and may even allow some existing damage to recover.


The story of Lemtrada, which began life as Campath-1H, stretches back as far as 1975 and research to develop monoclonal antibodies – artificially-produced antibodies, a key component of our immune system which rids the body of invading organisms; this work was to win César Milstein and Georges Köhler the Nobel Prize for Physiology or Medicine in 1984.

Campath-1H was originally developed as an immunosuppressant to prevent the rejection of bone marrow transplants. It was identified as a potential treatment for multiple sclerosis by Professor Alastair Compston of the Department of Clinical Neurosciences, in the late 1980s. The first MS patient was treated with the drug in 1991 and evidence began to mount that the drug would be effective, if used to treat people before the disease process had progressed too far. Eventually, the results of phase III clinical studies, published in 2012, confirmed that the drug is effective both in MS patients who are previously untreated (‘first-line’ therapy) and those who have already failed another treatment.

Further examples of ‘Impact’ case studies

Almost nine out of ten (87%) University of Cambridge submissions for the UK’s Research Excellence Framework (REF) have been rated as ‘world leading’ or ‘internationally excellent’, demonstrating the institution’s strength in research, figures released today show.

These results demonstrate Cambridge’s strength in depth across research, in particular confirming our global leadership in the pure and applied sciences, clinical medicine, and in subjects as diverse as the Classics and business and management studies
Professor Sir Leszek Borysiewicz
Methicillin-Resistant Staphylococcus aureus

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.
