Channel: University of Cambridge - Latest news

Order matters: sequence of genetic mutations determines how cancer behaves

Red blood cells (illustration)

Most of the genetic mutations that cause cancer result from environmental ‘damage’ (for example, through smoking or as a result of over-exposure to sunlight) or from spontaneous errors as cells divide. In a study published today, researchers at the Department of Haematology, the Cambridge Institute for Medical Research and the Wellcome Trust/Medical Research Council Stem Cell Institute show for the first time that the order in which such mutations occur can have an impact on disease severity and response to therapy.

The researchers examined genetically distinct single stem cells taken from patients with myeloproliferative neoplasms (MPNs), a group of bone marrow disorders that are characterised by the over-production of mature blood cells together with an increased risk of both blood clots and leukaemia. These disorders are identified at a much earlier stage than most cancers because the increased number of blood cells is readily detectable in blood counts taken during routine clinical check-ups for completely different problems.

Approximately one in ten MPN patients carry mutations in both the JAK2 gene and the TET2 gene. By studying these individuals, the research team was able to determine which mutation came first and to study the effect of mutation order on the behaviour of single blood stem cells.

Using samples collected primarily from patients attending Addenbrooke’s Hospital, part of Cambridge University Hospitals, researchers showed that patients who acquire mutations in JAK2 before those in TET2 display aberrant blood counts over a decade earlier, are more likely to develop a more severe red blood cell disease subtype, are more likely to suffer a blood clot, and have cells that respond differently to drugs that inhibit JAK2.

Dr David Kent, one of the study’s lead authors, says: “This surprising finding could help us offer more accurate prognoses to MPN patients based on their mutation order and tailor potential therapies towards them. For example, our results predict that targeted JAK2 therapy would be more effective in patients with one mutation order but not the other.”

Professor Tony Green, who led the study, adds: “This is the first time that mutation order has been shown to affect any cancer, and it is likely that this phenomenon occurs in many types of malignancy. These results show how study of the MPNs provides unparalleled access to the earliest stages of tumour development (inaccessible in other cancers, which usually cannot be detected until many mutations have accumulated). This should give us powerful insights into the origins of cancer.”

Work in the Green Lab is supported in part by Leukaemia & Lymphoma Research and Cancer Research UK.

Dr Matt Kaiser, Head of Research at Leukaemia & Lymphoma Research, said: “We are becoming more and more aware that a cancer’s genetic signature can vary from patient to patient, and we are becoming better at personalising treatment to match this. The discovery that the order in which genetic errors occur can have such a big impact on cancer progression adds an important extra layer of complexity that will help tailor treatment for patients with MPNs. The technology to do this sort of study has been available only recently and it shows once again how pioneering research into blood cancers can reveal fundamental insights into cancer in general.”

Dr Áine McCarthy, Science Information Officer at Cancer Research UK, says: “The methods used in this pioneering research could help improve our understanding of how cancer cells develop mutations and when they do so. This interesting study suggests that the order in which genetic faults appear can affect how patients respond to different drugs – this insight could help doctors personalise treatment to make it more effective for each patient.”

Reference
Ortmann CA, Kent DG, et al. The Impact of Mutation Order on Myeloproliferative Neoplasms. NEJM; 11 Feb 2015.

Additional funding came from the Kay Kendall Leukaemia Fund; the NIHR Cambridge Biomedical Research Centre; the Cambridge Experimental Cancer Medicine Centre; the Leukemia & Lymphoma Society of America; the Canadian Institutes of Health Research; and the Lady Tata Memorial Trust.

The order in which genetic mutations are acquired determines how an individual cancer behaves, according to research from the University of Cambridge, published today in the New England Journal of Medicine.


The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Cambridge Honorary Degrees 2015


The nominees are:

Sir John Eliot Gardiner (Honorary Fellow of King's College), conductor: Doctor of Music

Mr Neil MacGregor, art historian and museum director: Doctor of Letters

Sir James Mirrlees (Fellow of Trinity College), economist: Doctor of Science

Julia, Baroness Neuberger (Newnham College), rabbi and medical ethicist: Doctor of Divinity

Judge Hisashi Owada (Honorary Fellow of Trinity College), judge and diplomat: Doctor of Law

Sir Michael Rawlins, physician and pharmacologist: Doctor of Science

Dame Paula Rego (Honorary Fellow of Murray Edwards College), artist: Doctor of Letters

Professor Judith Jarvis Thomson (Newnham College), philosopher: Doctor of Letters

The University Council has submitted to the Regent House, the University's Governing Body, the names of eight renowned individuals, seeking authority for their admission to Honorary Doctorates at a Congregation in the Senate House on Wednesday 17 June 2015, at which the Chancellor, Lord Sainsbury of Turville, will preside.



Wow - it's the Cambridge Science Festival


In recognition of International Women’s Day and Cambridge’s WOW - Women of the World Festival, which takes place on the eve of this year’s two-week science extravaganza, Cambridge Science Festival turns the spotlight onto women.

Celebrating past and present achievements of women, there will be a host of thought-provoking and inspiring talks and debates throughout the Festivals.

As part of WOW, a joint discussion between panels in Cambridge and at Southbank Centre, London, The Education Emergency, will examine the difference education makes to women and girls worldwide. Mariam Khalique, the former teacher of Nobel Prize winner Malala Yousafzai, who will take part in the discussion on 8 March, said: "Women have been a victim of injustice and discrimination for centuries in many parts of the world and they are being marginalised in the name of religion. In many cases, women are coerced in the name of culture and tradition. The only weapon women can fight with to have their freedom and dignity is education. My life experience tells me that a girl with a book is much stronger than one without.”

Another WOW event, Wonderwomen: Discoverers and Pioneers will investigate what difference women have made to the world. Challenging why female scientists have been denied their place in history are Professor Monica Grady of the Rosetta project, Deborah Jaffé, author of Ingenious Women, Aurelia Hibbert from Cambridge University’s Eco-Racing team and Dr Jenny Tillotson, Reader in Sensory Fashion at Central St Martin’s. This event will be hosted by Professor Tim Bussey from Science Grrl’s ‘She blinded me with science’ campaign.

One of the most topical events during the Science Festival will take place on Saturday 21 March. During the public debate Gender and conservation: does it matter?, a panel will tackle the underrepresentation of women in environmental-science faculty positions and in conservation practice, particularly in positions of conservation leadership. The debate will explore how biodiversity conservation could be improved if there were more gender equality among its leaders, and how this might be achieved.

Dr Rosie Trevelyan, a British biologist and director of the Cambridge office of the Tropical Biology Association, who will be speaking at this event said: “Why do more men than women reach senior leadership positions? There are fewer senior women than men in conservation science – a pattern that mirrors business and politics.

“I am interested in debating how we create more gender-balanced leadership. Where does change need to take place and when?  How can more female voices be effectively heard?   What would our world be like if we had equal numbers of women leaders in conservation science?  Will it happen? And if so, when?”

Also on the panel will be Pamela Abbott from Natural England, who commented: “Women are under-represented in the leadership of many conservation organisations and academic institutions. It is vitally important for the future success of conservation that we nurture and benefit from the talents of everyone who aspires to move the conservation agenda forward. Understanding, challenging and removing barriers to career progression for women will bring about diversity in conservation leadership that reflects society as a whole.”

On Monday 9 March, Rachel McKendry, Professor of Biomedical Nanotechnology at University College London, will be giving the Annual WiSETI lecture: The Mobile Revolution: From M-Health to M-Powering Women. Professor McKendry will be speaking about her life and work. Her research lies at the cutting edge of infectious diseases, nanotechnology, telecommunications and Big Data. One of the aims of this lecture is to highlight the issues that particularly affect women in science, technology, engineering, medicine and maths (STEMM) and that contribute to low retention rates in these subjects.

Also on Monday 9 March, Christine Bartram of the University of Cambridge herbarium will explore the role of women in 19th Century botany using historic sources from the herbarium and rare books from the Cory Library, during her talk Women in botany at the Botanic Garden.

Women working for the British Antarctic Survey will talk about their experiences on the ice at the Polar Museum on Tuesday 10 March. The panel discussion will end with a late-night opening of the Museum, during which visitors will be able to meet the women from the British Antarctic Survey who work at the Poles.

WOW Cambridge and the Cambridge Science Festival turn spotlight onto women, education and science.

More information

For further information concerning the Science Festival, please visit: www.sciencefestival.cam.ac.uk. For further information concerning the WOW Cambridge Festival, please visit: www.wowcambridge.cam.ac.uk.



Bejewelled backdrop to coronations did not cost a king’s ransom


Cambridge conservation scientist Spike Bucklow uncovered the knock-down cost of the 1260 AD ‘Westminster Retable’ while researching his latest book ‘Riddle of the Image’, which delves into the materials used in medieval works of art.

Commissioned by Henry III during the construction of Westminster Abbey, the altarpiece’s use of fake gemstones is already well documented. However, what has not been known until now is just how little the king would have paid for the Retable, the oldest known panel painting in England.

Using centuries-old records of accounts from Westminster Abbey, Bucklow was able to determine prices for the amount of wood used, the area of glass needed, each pigment of paint, and the wages the carpenters and painters were paid. This information was combined with practice-based research into the Retable whilst it was being restored at the Hamilton Kerr Institute.

“This is bargain basement stuff, it was all dirt cheap,” he said. “While some of the other objects in Riddle of the Image would have cost the same as a farm or country home, the Westminster Abbey altarpiece would have cost no more than eight cows or about £5 in 13th century money.

“Historians have often thought that a financially constrained Henry was cutting corners, but you don’t spend as much as he did on the rest of the Abbey and then cut corners on the most visual and most important area for the crowning of monarchs.”

Rather than penny-pinching to preserve pounds, crowns and shillings, Bucklow believes that Henry III deliberately chose cheap materials and fake gemstones to accentuate one of the key themes of the altarpiece – miraculous transformations.

“It is no coincidence that all three surviving painted scenes show Christ involved in a transformation. Transformation is key to the whole Retable. It was the backdrop for transformations in a very real sense. In front of it, once in a generation, someone was turned into a monarch, while much more often, bread and wine were transformed into the body and blood of Christ.

“To make a fake gem you take sand and ash and transform something ordinary into something beautiful. Henry is telling us that art is above gold. We know how engaged he was with artists of the day. I really believe that he was dedicating human ingenuity and skill to God. He’s making a statement.”

As well as determining the cost of the Westminster Retable, The Riddle of the Image is an attempt to look at medieval works of art through the eyes of those who commissioned and made them. Bucklow believes that our modern-day appreciation of cultural artefacts – such as mobile phones – is completely divorced from our understanding of the materials that go into their making.

In medieval times, however, there was a widespread knowledge of artists’ materials that contributed deeper meaning to objects such as the Metz Pontifical (c.1316) and the Macclesfield Psalter (c.1330), both beautiful illuminated manuscripts now in the Fitzwilliam Museum, as well as the Thornham Parva Retable, which was also restored at the Hamilton Kerr Institute, and the Wilton Diptych, Richard II’s iconic portable altarpiece.

Bucklow believes this is because many of the pigments and materials used in the pre-modern world for artistic purposes also had common, everyday uses: cochineal and lapis lazuli, for example, were used in make-up and medicine. (Red dyes were used in heart tonics, and the blue stone was used to 'dispel melancholy' and lower fevers.) As such, artists' materials were readily available from the apothecaries of the day.

By examining the science of the materials, as well as the techniques of medieval artists, Bucklow hopes to deepen readers' and the art world's understanding and appreciation of the paintings, and of medieval art in general.

Each chapter in the book is devoted to one of five objects, and each builds on the cultural relevance of materials, exploring the connections between artists' materials and everyday life and showing how materials could be used philosophically and playfully.

For example, in one of the book’s featured artworks, two blues, one of which cost ten times as much as the other, were used side by side, even though they could not be told apart with the naked eye. In another manuscript, the strange choice of materials matched the bizarre contorted hybrid figures seen swarming across the page margins.

The Riddle of the Image, published by Reaktion Books, is available now.

Research into England’s oldest medieval altarpiece – which for centuries provided the backdrop to Westminster Abbey coronations – has revealed that it cost no more than the rather unprincely equivalent of eight cows.

Detail from the Westminster Retable



Cambridge partners with India to fight multidrug resistant TB

This illustration depicts a three-dimensional (3D) computer-generated image of a cluster of rod-shaped drug-resistant Mycobacterium tuberculosis bacteria, the pathogen responsible for causing the disease tuberculosis (TB). The artistic recreation was based upon scanning electron micrographic imagery.

The Cambridge-Chennai Centre Partnership on Antimicrobial Resistant Tuberculosis will bring together a multidisciplinary team of international researchers, and will be led by Professor Sharon Peacock and Dr Soumya Swaminathan.  The team, including Professors Lalita Ramakrishnan, Ken Smith, Tom Blundell and Andres Floto, will focus on developing new diagnostic tools and treatments to address the sharp rise in cases of multidrug resistant tuberculosis (TB).

This will include research into:

  • the use of emerging sequence-based diagnostics to improve the accuracy of individual patient treatment for drug resistant TB
  • predicting the impact of genetic mutations on drug resistance based on modelling of bacterial genome data
  • the development of an in-depth understanding of bacterial genes associated with so-called ‘drug-tolerance’, where the drug’s ability to kill the bacteria gradually weakens
  • novel approaches to treatment of TB based on enhancing the body’s immune system to enable it to fight infection.

The partnership will generate a rich and lasting clinical and genomic dataset for studying TB, and the transfer of scientific training and technology will foster future international collaborative projects.

“I am delighted that Cambridge has been given the opportunity to work on a disease of global importance through the development of this partnership,” said Professor Sharon Peacock. “Chennai was the site for many of the early MRC-funded TB treatment trials, and the chance to explore new therapies and diagnostics to improve patient outcome through the use of state-of-the-art technologies represents an exciting opportunity.”

The funding is part of a landmark collaboration between the MRC and the Government of India DBT. Nearly £3.5 million will be invested by the UK, through the MRC and the Newton Fund, a new initiative intended to strengthen research and innovation partnerships between the UK and emerging knowledge economies, with matched funding provided by DBT.

Prof K. VijayRaghavan, Secretary, Department of Biotechnology, added: “The Department of Biotechnology, Government of India is delighted to partner with the MRC in creating research centres which will address vexing challenges in medicine through quality science and collaboration. India is committed to working with the best in the world, for India and for the world. We are acutely aware that the fruits of our partnership can mean better lives for the most needy everywhere and are committed to make the collaboration succeed.”

The University of Cambridge has been awarded £2 million from the UK Medical Research Council and the Government of India’s Department for Biotechnology to develop a partnership with the National Institute for Research in Tuberculosis (NIRT) in Chennai.




Cambridge Drug Discovery Institute to fast-track development of new treatments for dementia

Elderly hands

Dementia affects over 830,000 people in the UK and costs the UK economy £23 billion a year. Increasing political focus on improving the outlook for people with dementia in recent years has led to small increases in research funding, but there remains a desperate lack of effective treatments for those with the condition. It has been 12 years since the last treatment for dementia was licensed in the UK and while current treatments help with symptoms, they are only modestly effective and not suitable for all dementias.  At the G8 Dementia Summit one year ago, health leaders from across the world pledged a research ambition for a disease-modifying therapy for dementia by 2025.

Alzheimer’s Research UK’s Drug Discovery Alliance will make a major contribution to delivering this ambition – a network of Drug Discovery Institutes dedicated to early stage drug discovery. Each Institute will be led by a Chief Scientific Officer working in tandem with some of the UK’s leading academic researchers based at each of the three universities and Alzheimer’s Research UK’s own in-house research leaders. New ideas and breakthroughs from academic research teams in each university, and beyond, will be driven straight into the hands of dedicated biology and chemistry teams in each Institute, expert in designing and developing potential new medicines.

The Cambridge Drug Discovery Institute will be located on the Cambridge Biomedical Campus, the centrepiece of the largest biotech cluster outside the United States, and involves many members of Cambridge Neuroscience, a multidisciplinary network of researchers across the city. Its academic lead will be Professor David Rubinsztein, Wellcome Trust Principal Research Fellow and Deputy Director of the Cambridge Institute for Medical Research.

Professor Rubinsztein says: “The new institute will be a world class environment in which to conduct research aimed at transforming the lives of patients living with dementia. It will build on our strengths in basic research and its translation into new treatments for patients. Its location on the Cambridge Biomedical Campus will give it unparalleled access to scientists, clinical researchers and patient cohorts, as well as strong links with pharma and biotech companies in the region.”

With one dementia researcher for every six working on cancer, attracting new expertise to tackle this growing global health problem is crucial. Over the next five years, the Drug Discovery Institutes aim to attract around 90 world-class researchers into dementia drug discovery, equipping them with the latest technology and infrastructure through the host universities.

Dr Eric Karran, Director of Research at Alzheimer’s Research UK, said: “Academic research is a goldmine of knowledge about diseases such as Alzheimer’s, and by tapping into the innovation, creativity, ideas and flexibility of scientists in these universities, we can re-energise the search for new dementia treatments. Working in universities and hospitals alongside people affected by dementia and their families, academic researchers are best placed to take research breakthroughs and progress them into real world benefits for the people that so desperately need them.

“The Drug Discovery Alliance is one of the first of its kind for dementia research in the world. We’re providing the investment and infrastructure that is needed to maintain and grow a healthy pipeline of potential new treatments to take forward into clinical testing. It’s only by boosting the number of promising leads to follow-up, that we’ll have the best chance of developing pioneering medicines that can change the outlook of this devastating condition.”

The Drug Discovery Alliance builds on the experiences of similar initiatives driven by cancer charities over the last two decades, which are now starting to deliver effective new treatments to patients.

Alzheimer’s Research UK, the world’s largest dedicated dementia research charity, has announced a £30 million Drug Discovery Alliance, launching three flagship Drug Discovery Institutes at the Universities of Cambridge, Oxford and UCL (University College London). The Drug Discovery Institutes will see 90 new research scientists employed in state-of-the-art facilities to fast-track the development of new treatments for Alzheimer’s disease and other dementias.




Firing up the proton smasher

Large Hadron Collider

While it slept, we were allowed into the tunnels.

The Large Hadron Collider (LHC) had shut down for two years to upgrade following the discovery of the Higgs boson. In the main ring, 175 m underground, chunks had been cut out of the snaking tubes for essential maintenance. These tubes fire protons in opposite directions, whipping them ever faster until they almost reach the speed of light. Along the 27 km run are four ‘experiments’: vast machines envelop the points at which tubes intersect and particles collide to capture the results. The largest of these, ATLAS, is the size of a six-storey building.

Each collision, known as an ‘event’, produces a splurge of elementary particles such as quarks, gluons and – as we now know – Higgs bosons. On average, events occur 40 million times a second in the LHC. 

The precision required for these events is exquisite. Our guide tells us to imagine two people standing six miles apart and each simultaneously firing a gun so that the bullets meet exactly head-on. Except instead of bullets, imagine needles. Inside the tunnels, engineers zip past on bicycles – the best way to get around underground unless you’re a proton. Next to every lift shaft is a bike rack.

In the next few months, the LHC will be switched back on. The 2012 triumph of demonstrating the Higgs boson affirmed the Standard Model: the elegant solution to the building blocks of the Universe. Now, with an anticipated almost doubling of energy for the LHC’s second run, physicists are aiming to “go beyond” the Standard Model.

One of the central goals is to prove or disprove the theory of supersymmetry: the “prime candidate” theory for unlocking the mystery of the dark matter in our Universe.

“Observable matter only makes up 5% of the Universe; the rest is what we call dark matter. We know it’s there because we can see galaxies rotating at velocities which require surrounding matter for such gravitational pull – but, unlike the part of the galaxies that we can see, we cannot detect it optically,” said Professor Val Gibson, Head of the Cambridge High Energy Physics (HEP) group.     

Supersymmetry theory essentially predicts that every particle in the Standard Model has a matching particle waiting to be found. These partner particles (or ‘sparticles’) could be candidates for dark matter, but we haven’t yet seen them – perhaps because they are heavier and take more energy to generate, a problem LHC Run II could overcome.

“Supersymmetry theory predicts there is a sister particle of the electron called a ‘selectron’, which would have integer ‘spin’: its intrinsic angular momentum. For the quark, there would be a supersymmetric ‘squark’, and so on for every elementary particle we know,” said Gibson. If supersymmetry is correct, there would also be a further four Higgs bosons for us to discover.


“Proton collisions in the LHC might produce a heavy supersymmetric particle which decays into its lightest form, a light neutral particle, but different from those we know about in the Standard Model,” said Gibson.

“We have been looking for supersymmetry particles throughout the first run of the LHC, and the increase in power for Run II means we can look at higher energies, higher mass, and gradually blot out more areas of the map in which supersymmetrical particles could be hiding.”

Will supersymmetry be proved by the end of next year, or will the data show it’s a red herring? For HEP research associate Dr Jordi Garra Ticó, what is really fundamental is experimental evidence. “I just want to see what nature has prepared for us, whether that’s consistent with some current theory or whether it’s something else that no one has ever thought about yet, outside of current knowledge.”

The two experiments that Cambridge researchers work on are the mighty ATLAS and the more subtle LHCb – known as LHC ‘Beauty’ – which is Gibson and Garra Ticó’s focus. Beauty complements the power of ATLAS, allowing scientists to ‘creep up’ on new physics by capturing rare particle decays that happen only once in every 100 million events.

Garra Ticó spent six months in Cambridge before taking up residence at CERN, where he works on LHCb. Each of LHCb’s 10 million events a second creates 35 kilobytes of data, a figure that is expected to rise to 60 kilobytes during Run II – far too much to ever imagine storing. “There is no guidebook,” he explained. “These machines are prototypes of themselves.”
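To see why storing everything is unthinkable, a rough back-of-the-envelope sketch of the raw data rate implied by those figures (assuming the quoted 10 million events per second, and using decimal kilobytes and gigabytes) runs as follows:

```python
# Back-of-envelope estimate of LHCb's raw data rate from the figures
# quoted above: 10 million events/s at 35 kB each (Run I) or 60 kB (Run II).
EVENTS_PER_SECOND = 10_000_000

def raw_rate_gb_per_s(event_size_kb: float) -> float:
    """Raw data rate in GB/s, taking 1 GB = 10**6 kB (decimal units)."""
    return EVENTS_PER_SECOND * event_size_kb / 1e6

print(f"Run I:  {raw_rate_gb_per_s(35):.0f} GB/s")   # 350 GB/s
print(f"Run II: {raw_rate_gb_per_s(60):.0f} GB/s")   # 600 GB/s
```

At hundreds of gigabytes per second, recording every event is impossible, which is why trigger systems discard all but a tiny fraction of collisions in real time.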

ATLAS, the biggest experiment, feels like the lair of a colossal hibernating robot. Engineers perch in the crevices of the giant machine, tinkering away like tiny cleaner birds removing parasites. And sealed in the heart of this monster is layer upon layer of the most intricate electronics ever devised.     

Dr Dave Robinson arrived at CERN as a PhD student in 1985 and joined the Cambridge HEP group in 1993. He went back to CERN in 2004 – expecting a stint of “one to two years” – and has remained. He is now Project Leader for the most critical detector system within ATLAS, the Inner Detector, which includes the ‘semi-conductor tracker’ (SCT), partially built in Cambridge.

Each collision event inside ATLAS leaves an impression on the layers of silicon – nested like onion skins – that make up the SCT, enabling scientists to reconstruct the trajectories of particles in the events. “The sensitivity of the tracker is vital for making precise measurements of the thousands of particles generated by the head-on collisions between protons, including decay products from particles like b-quarks which only exist for picoseconds after the collision,” said Robinson.

He is currently working with Gibson and colleagues at the Cavendish Laboratory on the next generation of radiation-proof silicon technology in preparation for the LHC shutdown of 2020, the next time they will be able to get at the SCT, which is otherwise permanently locked in the core of ATLAS. The technology will have an impact on areas like satellite telecommunications, where cheaper, radiation-hardened electronics could have a huge effect.

This, for Gibson, is the way science works: solving technical problems to reveal nature’s hidden secrets, and then seeing the wider applications. She recalls being in CERN when she was a postdoc in the 1980s at the same time as Tim Berners-Lee, who was working on computer-sharing software to solve the anticipated data deluge from LHC-precursor UA1. He ended up calling it the World Wide Web.

Inset image – top: representation of the Higgs Boson particle; Credit: CERN.

Inset image – bottom: Professor Val Gibson.

 

The Large Hadron Collider is being brought back to life, ready for Run II of the “world’s greatest physics experiment”. Cambridge physicists are among the army who keep it alive.

I just want to see what nature has prepared for us, whether that’s consistent with some current theory or whether it’s something else that no one has ever thought about yet, outside of current knowledge
Jordi Garra Ticó
Large Hadron Collider

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Molecular inhibitor breaks cycle that leads to Alzheimer’s


A molecule that can block the progress of Alzheimer’s disease at a crucial stage in its development has been identified by researchers in a new study, raising the prospect that more such molecules may now be found.

The report shows that a molecular chaperone, a type of molecule that occurs naturally in humans, can play the role of an “inhibitor” part-way through the molecular process that is thought to cause Alzheimer’s, breaking the cycle of events that scientists believe leads to the disease.

Specifically, the molecule, called Brichos, sticks to threads made up of malfunctioning proteins, called amyloid fibrils, which are the hallmark of the disease. By doing so, it stops these threads from coming into contact with other proteins, thereby helping to avoid the formation of highly toxic clusters that enable the condition to proliferate in the brain.

This step – where fibrils made up of malfunctioning proteins assist in the formation of toxic clusters – is considered to be one of the most critical stages in the development of Alzheimer’s in sufferers. By finding a molecule that prevents it from occurring, scientists have moved closer to identifying a substance that could eventually be used to treat the disease. The discovery was made possible by an overall strategy that could now be applied to find other molecules with similar capabilities, extending the range of options for future drug development.

The research was carried out by an international team comprising academics from the Department of Chemistry at the University of Cambridge, the Karolinska Institute in Stockholm, Lund University, the Swedish University of Agricultural Sciences, and Tallinn University. Their findings are reported in the journal Nature Structural & Molecular Biology.

Dr Samuel Cohen, a Research Fellow at St John’s College, Cambridge, and a lead author of the report, said: “A great deal of work in this field has gone into understanding which microscopic processes are important in the development of Alzheimer’s disease; now we are starting to reap the rewards of this hard work. Our study shows, for the first time, one of these critical processes being specifically inhibited, and reveals that by doing so we can prevent the toxic effects of protein aggregation that are associated with this terrible condition.”

Alzheimer’s disease is one of a number of conditions caused by naturally occurring protein molecules folding into the wrong shape and then sticking together – or nucleating – with other proteins to create thin filamentous structures called amyloid fibrils. Proteins perform important functions in the body by folding into a particular shape, but sometimes they can misfold, potentially kick-starting this deadly process.

Recent research, much of it by the academics behind the latest study, has however suggested a second critical step in the disease’s development. After amyloid fibrils first form from misfolded proteins, they help other proteins which come into contact with them to misfold and form small clusters, called oligomers. These oligomers are highly toxic to nerve cells and are now thought to be responsible for the devastating effects of Alzheimer's disease.

This second stage, known as secondary nucleation, sets off a chain reaction which creates many more toxic oligomers, and ultimately amyloid fibrils, generating the toxic effects that eventually manifest themselves as Alzheimer’s. Without the secondary nucleation process, single molecules would have to misfold and form toxic clusters unaided, which is a much slower and far less devastating process.

By studying the molecular processes by which each of these steps takes effect, the research team assembled a wealth of data that enabled them to model not only what happens during the progression of Alzheimer’s disease, but also what might happen if one stage in the process was somehow switched off.
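
As a caricature of this kind of modelling, the widely used rate-equation picture of aggregation (primary nucleation, elongation, and fibril-catalysed secondary nucleation) can be integrated numerically with the secondary pathway switched on or off. The sketch below is illustrative only: the rate constants are placeholders, not the study's fitted values.

```python
# Toy rate-equation sketch of amyloid aggregation kinetics.
# All rate constants are illustrative placeholders, not fitted values.

def fibril_mass(k2, mtot=1.0, kn=1e-5, kplus=1e3, dt=1e-3, steps=1000):
    """Euler-integrate fibril number P and fibril mass M.

    dP/dt = kn*m^2 + k2*m^2*M   (primary + secondary nucleation)
    dM/dt = 2*kplus*m*P         (elongation at both fibril ends)
    with free monomer m = mtot - M.
    """
    P = M = 0.0
    for _ in range(steps):
        m = max(mtot - M, 0.0)                       # monomer left in solution
        P += dt * (kn * m**2 + k2 * m**2 * M)        # new fibrils nucleate
        M = min(M + dt * 2.0 * kplus * m * P, mtot)  # fibrils elongate
    return M

without_secondary = fibril_mass(k2=0.0)   # secondary nucleation blocked
with_secondary = fibril_mass(k2=1.0)      # secondary nucleation active
# With k2 > 0 the feedback loop drives aggregation to completion; with
# it switched off, fibril mass stays far lower over the same interval.
```

Even in this crude form, the model shows the qualitative signature the researchers looked for: blocking the secondary step leaves slow primary aggregation intact while suppressing the runaway chain reaction.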

“We had reached a stage where we knew what the data should look like if we inhibited any given step in the process, including secondary nucleation,” Cohen said. “Working closely with our collaborators in Sweden - who had developed groundbreaking experimental methods to monitor the process - we were able to identify a molecule that produced exactly the results that we were hoping to see in experiments.”

The results indicated that the molecule, Brichos, effectively inhibits secondary nucleation. Typically, Brichos functions as a “molecular chaperone” in humans; a term given to "housekeeping" molecules that help proteins to avoid misfolding and aggregation. Lab tests, however, revealed that when this molecular chaperone encounters an amyloid fibril, it binds itself to catalytic sites on its surface. This essentially forms a coating that prevents the fibrils from assisting other proteins in misfolding and nucleating into toxic oligomers.

The research team then carried out further tests in which living mouse brain tissue was exposed to amyloid-beta, the specific protein that forms the amyloid fibrils in Alzheimer’s disease. Allowing the amyloid-beta to misfold and form amyloids increased toxicity in the tissue significantly. When this happened in the presence of the molecular chaperone, however, amyloid fibrils still formed but the toxicity did not develop in the brain tissue, confirming that the molecule had suppressed the chain reaction from secondary nucleation that feeds the catastrophic production of oligomers leading to Alzheimer’s disease.

By modelling what might happen if secondary nucleation is switched off and then finding a molecule that performs that function, the research team suggest that they have discovered a strategy that may lead to the identification of other molecules that could have a similar effect.

“It may not actually be too difficult to find other molecules that do this, it’s just that it hasn't been clear what to look for until recently,” Cohen said. “It's striking that nature – through molecular chaperones – has evolved a similar approach to our own by focusing on very specifically inhibiting the key steps leading to Alzheimer's.  A good tactic now is to search for other molecules that have this same highly targeted effect and to see if these can be used as the starting point for developing a future therapy.”

The other members of the Cambridge team were Dr Tuomas Knowles, Dr Paolo Arosio, Professor Michele Vendruscolo and Professor Chris Dobson. All are members of the Centre for Misfolding Diseases, which is based in the University's Department of Chemistry.

A molecular chaperone has been found to inhibit a key stage in the development of Alzheimer’s disease and break the toxic chain reaction that leads to the death of brain cells, a new study shows. The research provides an effective basis for searching for candidate molecules that could be used to treat the condition.

It may not actually be too difficult to find other molecules that do this, it’s just that it hasn't been clear what to look for until recently. A good tactic now is to search for other molecules that have this same highly targeted effect and to see if these can be used as the starting point for developing a future therapy.
Sam Cohen
Transmission electron microscopy image showing a molecular chaperone (the black dots) binding to thread-like amyloid-beta (Aβ42)


“You’ve got a friend in me”: bringing designers and animators together


This year marks 20 years of the adventures of Woody and Buzz Lightyear charming children – and adults – worldwide. As well as a razor-sharp, hilarious script, Toy Story was the first full-length feature film made entirely using computer-generated imagery, marking the arrival of a new way of creating visual effects in three dimensions.

But the underlying mathematics that brought the toys to life, and continues to be used by a thriving visual effects industry, has actually been around since the 1960s. It’s embedded in how the automotive, aeronautical and other manufacturing industries design their products.

The two branches of design – called subdivision surfaces (used by animators) and Non-Uniform Rational B-Splines (NURBS, used by the manufacturing industry) – have the same mathematical roots, but they have evolved in different directions.
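
Those shared roots are easiest to see in curves, the one-dimensional analogue of subdivision surfaces. Chaikin's classic corner-cutting scheme, sketched below in Python (a standard textbook construction, not code from the Cambridge project), repeatedly refines a control polygon; its limit curve is exactly a uniform quadratic B-spline, the building block that NURBS generalise.

```python
# Chaikin's corner-cutting: the curve analogue of subdivision surfaces.
# Its limit curve is a uniform quadratic B-spline, one point of contact
# between subdivision methods and spline-based (NURBS) design.

def chaikin_step(points):
    """One round of corner cutting on an open polyline."""
    refined = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return refined

def subdivide(points, rounds=4):
    """Apply several rounds; the polyline converges to the smooth curve."""
    for _ in range(rounds):
        points = chaikin_step(points)
    return points

control = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
curve = subdivide(control)
# Every refined point is a convex combination of the control points, so
# the curve never leaves the region spanned by the control polygon.
```

Surface schemes such as Catmull-Clark apply the same refine-and-average idea to meshes, which is why a mathematical bridge back to NURBS is plausible at all.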

Recently, however, researchers at Cambridge’s Computer Laboratory have found a way to reconcile the two divergent paths, enabling product designers to access the easier and less-constraining tools used by the animation industry.

This all sounds like good news for the product designers. But, as lead researcher Neil Dodgson, Professor of Graphics and Imaging, explained, “there is understandable caution. Although the method used by designers gives greater freedom and increased usability, manufacturers have a back catalogue of existing models and around 45 years of experience. A move away from the method used by the manufacturing industry has to be sufficiently advantageous to warrant making.” Dodgson believes that current research is providing that advantage.

The NURBS method was developed at a time when computers were very limited in their capabilities; by the time the subdivision surface method was commercialised, in the late 1990s, computers had vastly increased memories and processing power.

Essentially, the two methods address different priorities. When animators model three-dimensional surfaces, they want ease of design and their ‘product’ lives only on the screen. An engineer, by contrast, needs a design tool that is mathematically able to handle a wide range of requirements, including specifying cutting paths, mould shapes and objects that can actually be manufactured.

Dodgson’s team’s first breakthrough was to demonstrate, in 2009, a mathematical framework that made NURBS compatible with subdivision methods. Like a ‘bolt-on’ application, NURBS-based design could be imported to subdivision methods at the press of a button.

“It had been thought that the two methods had diverged so much as to be incompatible. Suddenly, we had a method that theoretically offered the manufacturing industry the flexibility the artists enjoy in subdivision. But the ‘theoretically’ is important… in practice there were two stumbling blocks.”

With funding from the Engineering and Physical Sciences Research Council, his team has spent the past few years ironing out the problems. Ironically, one of the problems was to make creasing possible.

Take the Mercedes car. Part of its distinctive shape is the presence of two furrowed creases running the length of the hood. In fact, almost all cars have a crease somewhere. The NURBS method can accomplish this, and so can the animators’ subdivision method, but the researchers’ NURBS-compatible subdivision method had cases that just did not work. Now, however, the problem of creasing has been solved by Dr Jirí Kosinka.

The second challenge was to enable ‘trimming’. In NURBS design, holes and complicated joins are often made by mathematically trimming away part of the NURBS surface, which adds a further layer of mathematical complexity on top of the basic NURBS method. Subdivision does not need trimming, because it has the flexibility to allow holes and complex joins within its basic mathematical structure.

PhD student Jingjing Shen has tackled this problem by developing a method that will convert a trimmed NURBS surface to an ordinary, untrimmed, subdivision surface. Her current challenge is to extend this work from ordinary subdivision to NURBS-compatible subdivision.

“So will the industry take up our method? Well, a new piece of research might help persuade them,” said Dodgson. While the researchers in Cambridge were perfecting their conversion method, researchers in Europe and the USA spent a decade developing a computational approach called isogeometric analysis (IGA) that would allow manufacturers to carry out design and simulation using the same tools.

New designs of products such as cars, planes and ships have to be rigorously tested using simulation software to be sure they will work – and work safely – once manufactured. At the moment, it is necessary to convert data from NURBS into a different geometrical representation for the analysis and testing phase. The engineers carrying out the analysis have to take the NURBS designs and then spend weeks or months creating new models that can be fed into the simulation system.

“Although IGA would enormously speed this process up, it cannot be used by product designers because it hasn’t been able to handle trimming,” said Dodgson. “We think we offer a way to avoid this problem.” Dodgson’s Austrian collaborators have recently developed IGA for subdivision surfaces and Dodgson suggests that the trimming problem can be completely avoided with Shen’s method for converting trimmed NURBS to untrimmed subdivision for analysis.

Dodgson points towards the example of a leaky teapot as an indication of how important the link between design and analysis is. When a teapot is designed using NURBS, the cutting and trimming needed to fit a spout to the body of the teapot leaves a tiny gap at one edge of the join. The same would be true for fitting the nose of an aeroplane to the body.

“At the production stage, these gaps don’t matter because the gaps are truly tiny. At sub-micrometer in size, they are smaller than the machining tools can cope with, so they simply vanish in the actual product,” he explained.

“But before you get to the production stage, when the design is going through simulation testing, they do matter. Any gap in a teapot would cause it to leak, in theory, and so the software throws up errors.”

Dodgson believes that his conversion method can solve these difficulties: “When you convert from trimmed NURBS to subdivision, the gaps vanish: there is a true mathematical join between previously disjointed surfaces.”

He added: “This, combined with IGA, and subdivision’s increased flexibility and usability, all look very promising for being able to design and analyse automatically, and quickly feed the results back into re-design.” The researchers believe that the new process they are developing could make a vast difference to manufacturing design. Or, in the words of Buzz Lightyear, to infinity and beyond.

Inset images: credit: Jirí Kosinka

Aircraft designers and animators use different digital technologies to achieve the same goal: creating a three-dimensional image that can be manipulated. But a new method that links the two could vastly speed up how product designers create and simulate the performance of their products.

Suddenly, we had a method that theoretically offered the manufacturing industry the flexibility the artists enjoy in subdivision
Neil Dodgson
Reflection lines on a creased structure


Study finds increased DNA mutations in children of teenage fathers


A genetic study of over 24,000 parents and their children has shown that the children of teenage fathers have unexpectedly high levels of DNA mutations.

Mutations, the result of DNA copying errors during cell division, can occur in different cells of the body and at different times during life. Some, such as those that occur in ‘germ cells’ – which create sperm or eggs – cause changes affecting the individual’s offspring.

Previously, it was thought that germ cells in both boys and girls go through a similar number of cell divisions, and should have roughly the same rates of DNA mutation by the time an individual reaches puberty.  

Now, a new study shows that the number of cell divisions – and consequently DNA mutation rates – experienced by the germ cells of teenage boys is six times higher than for those of girls, and that DNA mutations passed down to the children of teenage fathers are higher as a result.

Researchers say the increased DNA mutations in the reproductive cells of adolescent boys could explain why the children of teenage fathers have a higher risk for disorders such as autism, schizophrenia and spina bifida.

Men produce germ cells throughout their lives, and it was previously assumed that DNA mutation in germ cells increases as men get older, because more cell divisions – and therefore more DNA copying errors – accumulate with age.

However, the latest results show that the germ cells of adolescent boys are an exception to this aging rule.

Researchers have shown that male germ cells go through around 150 cell divisions by puberty, compared to the 22 cell divisions experienced by female oocytes (immature egg cells). This correspondingly raises the rate of DNA mutation incurred through cell division in the germ cells of teenage boys – creating higher chances of hereditary disease in children conceived by adolescent fathers.
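
Under the simplifying assumption that copying errors accrue at a similar rate per division in both sexes (an illustration, not the study's own calculation), the division counts alone reproduce the reported sex difference:

```python
# Back-of-envelope check: if copying errors accrue at a similar rate per
# division in both sexes (an assumption, for illustration), the division
# counts alone predict the reported sex difference in mutation rates.
male_divisions_by_puberty = 150
female_divisions_by_puberty = 22

ratio = male_divisions_by_puberty / female_divisions_by_puberty
print(f"expected mutation ratio ~ {ratio:.1f}x")  # ~6.8x, i.e. "around six times"
```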

The researchers say that this could be the result of unknown cell divisions during male childhood or a spike in DNA error during puberty – although the reasons are currently unclear.  

Prior to the new findings, male germ cells were thought to undergo 30 divisions by puberty. The results overturn the previous notion that the younger the man, the fewer the cell divisions and the lower the risk of DNA mutations in his germ – and later sperm – cells.

In fact, researchers say that – while more work needs to be done – these initial findings also indicate that sperm cells in teenagers have approximately 30% higher rates of DNA mutation than those of young men in their twenties, and that teenage boys have similar levels of DNA mutation in their sperm cells to men in their late thirties and forties.

“It appears that the male germ cells accumulate DNA errors unnoticed during childhood, or commit DNA errors at an especially high level at the onset of puberty. However, the reason for this is not yet clear,” said geneticist Dr Peter Forster, a Fellow of Murray Edwards College and the McDonald Institute at the University of Cambridge, who conducted the study with colleagues from the Institute of Forensic Genetics in Münster, Germany.

“Possibly the DNA copying mechanism is particularly error-prone at the beginning of male puberty. Or, sperm production in boys may undergo dozens more cell cycles – and therefore DNA copying errors – than has previously been suspected,” he said.

Either way, the textbooks may well need to be rewritten as a result of the new findings, says Forster, which are published today in the journal Proceedings of the Royal Society B.

The research team used DNA from blood and saliva samples taken from 24,097 normal parents and their validated biological children from areas of Germany, Austria, the Middle East and West Africa.  

The researchers analysed a type of DNA known as ‘microsatellites’ – simple, repetitive sequences of DNA that only mutate as a result of cell replication, providing the team with a natural ‘cell-cycle counter’ which they used to track the number of times a cell divides, and consequently the rate of mutations through DNA copying error.
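
The counter logic itself is simple: because microsatellites mutate only on replication, dividing an observed mutation rate by an assumed per-division rate yields a division count. Both numbers in the sketch below are hypothetical, chosen purely to illustrate the arithmetic rather than taken from the study.

```python
# Sketch of the 'cell-cycle counter' logic: microsatellites mutate only
# on replication, so dividing an observed mutation rate by an assumed
# per-division rate estimates how many divisions have occurred.
# Both numbers below are hypothetical, purely for illustration.
per_division_rate = 1e-4   # assumed microsatellite mutations per locus per division
observed_rate = 1.5e-2     # assumed mutations per locus over the cell's history

estimated_divisions = observed_rate / per_division_rate  # ~150 divisions
```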

Through comparative analysis, the research team discovered the increased DNA mutations in children of teenage fathers, and found that mutation rates are six times higher in male germ cells at the onset of puberty than in female oocytes.

While this means that the children of teenage fathers have increased chance of abnormality, Forster points out that the risk is still very small: perhaps around 2% as opposed to a general average abnormality risk of 1.5%. 
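
Those figures are worth unpacking: the absolute increase is half a percentage point, while the relative increase is about a third. A quick check using the article's numbers:

```python
# Putting the reported risks in perspective, using the article's figures.
baseline = 0.015      # average abnormality risk (~1.5%)
teen_father = 0.02    # risk reported for children of teenage fathers (~2%)

absolute_increase = teen_father - baseline             # half a percentage point
relative_increase = (teen_father - baseline) / baseline
print(f"{absolute_increase:.3f} absolute, {relative_increase:.0%} relative")
# -> 0.005 absolute, 33% relative
```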

The team hope to develop the cell-cycle counter technique used in the study and apply it to cancers, in order to better estimate the age of such conditions in individuals, and the number of cell divisions between the initial cellular malfunction and tumour growth.

New research reveals that the sperm cells of adolescent boys have more than six times the rate of DNA mutations as the equivalent egg cells in adolescent girls, resulting in higher rates of DNA mutation being passed down to children of teenage fathers. The findings suggest that the risk of birth defects is higher in the children of teenage fathers as a consequence.

Sperm production in boys may undergo dozens more cell cycles – and therefore DNA copying errors – than has previously been suspected
Peter Forster
Section of normal testes of a young man


Planning for war: a guide for businesses

Still The Brave Tin Soldiers

The turmoil of 2014 was a timely reminder to businesses that they need to be prepared and have contingency plans for global conflict. The crisis in Ukraine brought Russia and the West to the brink of military confrontation; relations between Japan and China became more fraught; and the year ended with the US and North Korea in something approaching what headline writers like to call “cyber war”.

The period since the end of World War II has seen no armed conflict between major military powers. Historians have dubbed this period “The Long Peace” – but of course even this time span has not been conflict free. In fact, records suggest more than a million deaths in at least 700 conflicts in the past 25 years.

Last year’s tensions have alerted company leaders to this. They have reawakened to the potential for regional conflicts disrupting their international operations, playing havoc with a fragile global economy and causing chaos in the markets. When Malaysia Airlines flight MH17 was downed while flying over the conflict zone in Ukraine, causing the deaths of 298 people travelling from Amsterdam to Kuala Lumpur, it reminded the international community how easily seemingly remote crises can touch us all.

 

Malaysia Airlines CEO Ahmad Jauhari Yahya, right, and crew pray during a moment of silence for the victims of the MH17 plane crash. Azhar Rahim/EPA

 

Risk analysis

This is reflected in the World Economic Forum’s annual survey of threats of concern to world leaders and captains of industry. The WEF’s Global Risks 2015 report finds that interstate conflict is now the highest concern – with infectious diseases, water crises, extreme weather events, cyber attacks, unemployment and failures of national governments as other top worries. Only a year ago, interstate conflict didn’t make the WEF’s top 10 risks at all.

In fact, though, the rise of interstate conflict as an area of top concern is long overdue. As 2014 has shown, today’s business leaders need to be aware of the kinds of crises and conflicts that might be expected over the business cycle of the next five to 10 years.

It would certainly be remarkable – unprecedented even – if there were no conflicts during the next decade. Even the most peaceful decades in history have seen minor conflicts every few years. Most decades since the mid-19th century have seen an average of two conflicts involving at least medium-ranked powers. At least one decade in five could be expected to see a regional war involving a G20 nation, most typically a proxy war between superpowers. History shows that supposedly “unthinkable” wars have a nasty habit of surprising people – even the war leaders who instigate them.

Risk hotspots

Analysis for our risk studies centre by Cytora Ltd, a research partner specialising in geopolitical risk, identifies more than 100 potential scenarios for two nation states to spark off a military conflict in the next decade – gauged from recent antagonistic statements towards each other, antithetical values and historical enmity.

 

Map of Interstate Conflict Risk. Cambridge Centre for Risk Studies & Cytora Ltd.

 

The risk map above of future conflicts identifies a number of regional hot-spots, including the obvious Middle East, central and eastern Africa, the eastern European margins, the Indian subcontinent, parts of Latin America and the emerging South-East Asian powers. Wherever they can get the information from, businesses would do well to incorporate this kind of research into their strategic thinking. Forewarned is forearmed. Conflicts threaten employees in the theatre of war, they disrupt supply chains, reduce demand for goods, and unsettle investment markets.

The various stress tests we have developed – for interstate conflict, pandemics, social unrest and more – illustrate what a company might experience from a catastrophic event such as regional war.

Hypothetical stress test scenario

In the hypothetical “what-if” scenario to help businesses assess their resilience to interstate conflict, we imagine an escalating spiral of military tension between China and Japan, based on Pentagon war games. The ensuing outbreak of hostilities disrupts trade across a wide area of South-East Asia.

 

Map of hypothetical scenario of China-Japan conflict, showing conflict zones (A); sea exclusion zones (B); and air restriction zones (C). Military bases are represented by total population within 50km. Cambridge Centre for Risk Studies

 

Many US and European businesses have strong economic stakes in the region, including outsourced manufacturing to the region, providing services to growing markets there, and managing supply chains through and around the region.

China and Japan are the world’s second and third largest economies, and between them they create US$120 billion of exports, much of which is shipped through the South China Sea. The scenario envisions much of this being halted by a naval sea exclusion zone that freezes the activities of six of the world’s largest ports, and prevents almost half of the world’s container traffic passing through it.

Militarised restricted air space prevents commercial air traffic across the combatant countries, which stops flights from five of the world’s top 20 airports, handling 8% of world passenger traffic and 46% of air freight.

The scenario tests the ability of businesses to manage this type and scale of crisis. Supply chains, major markets and regional offices have to be reconfigured quickly. Businesses have to review their global exposures and identify concentration risks and choke points in their operations that could be impacted. Currency exchange rates will fluctuate wildly. Business counter-parties could find themselves in distress, so companies need to consider credit-risk tolerance and cash flows during a crisis of this kind.

Company boards might be tempted to skirt round the idea of such unthinkable catastrophes. But for thoughtful businesses, it is certainly something worth thinking about.


The risk of interstate conflict and its impact on business is presented in the proceedings of a Cambridge Centre for Risk Studies seminar, held in London on February 19, 2015.

The Conversation

This article was originally published on The Conversation. Read the original article.

Dr Andrew Coburn of the Cambridge Judge Business School writes on The Conversation website about how business leaders have reawakened to the risk of regional conflict, and discusses research carried out at the Centre for Risk Studies on bad-news scenarios ranging from cyber war to regional conflict to pandemic.

The World Economic Forum’s annual survey of threats of concern to world leaders and captains of industry... finds that interstate conflict is now the highest concern.
Andrew Coburn
Still The Brave Tin Soldiers


Graphene’s potential for energy conversion and storage


In a review article published recently in the journal Science, researchers led by Francesco Bonaccorso, a Royal Society Newton Fellow at the Cambridge Graphene Centre, note the substantial progress made in material preparation at the laboratory level. They also highlight the challenge of producing these materials on an industrial scale in a cost-effective manner.

Graphene – a two-dimensional material made up of sheets of carbon atoms – has many potential applications, among them energy conversion and storage. Graphene and related 2D crystals combine high electrical conductivity with physical flexibility and a huge surface-to-weight ratio. Such qualities make them suitable for storing electric charge in batteries and supercapacitors, and as catalysts in solar and fuel cell electrodes.

“The huge interest in 2D crystals for energy applications comes both from their physico-chemical properties, and the possibility of producing and processing them in large quantities, in a cost-effective manner,” said Bonaccorso. “In this context, the development of functional inks based on 2D crystals is the gateway for the realisation of new generation electrodes in energy storage and conversion devices.”

Bonaccorso added that the challenge ahead is to demonstrate a disruptive technology in which two-dimensional materials not only replace traditional electrodes, but more importantly enable whole new device concepts. 

“Graphene and related materials have great promise in these areas, and the Graphene Flagship has identified energy applications as a key area of investment,” said review co-author Andrea Ferrari, who chairs the Executive Board of the Graphene Flagship, and is director of the Cambridge Graphene Centre. “We hope that our critical overview will guide researchers in academia and industry in identifying optimal pathways toward applications and implementation, with an eventual benefit for society as a whole.”

The Graphene Flagship, a pan-European 10-year, €1 billion science and technology programme, was launched in 2013.

Adapted from Cambridge Graphene Centre news story.

Scientists working with Europe's Graphene Flagship and the Cambridge Graphene Centre have provided a detailed and wide-ranging review of the potential of graphene and related materials in energy conversion and storage.

Graphene and related materials have great promise in these areas
Andrea Ferrari
Model of graphene structure


Illuminating art’s history


Faced with the prospect of his rapidly approaching nuptials on 29 October 1442, and with no wedding gift purchased for his bride-to-be, Francis I of Brittany (1414–1450) did what many of us have done at some point: he ‘re-gifted’. He took something that was already in his possession and gave it to someone else.

But this was no ordinary gift: it was an illuminated manuscript, made for Francis’ first wife, Yolande of Anjou, who had died in 1440. Francis had it altered and presented it to his new bride, Isabella Stuart, daughter of James I. The portrait of his first wife was covered with that of Isabella and an image of St Catherine was added, using cheaper pigments. Then, when Francis was made a duke, the portrait was painted over yet again to give Isabella a coronet.

Art historians have written volumes on the Hours of Isabella Stuart over the last century, but a cross-disciplinary Cambridge project is using a variety of imaging techniques to uncover this story of re-gifting. The team’s work is challenging previous assumptions about this and many other manuscripts, helping them to see and understand medieval painting and illumination in new and unexpected ways.

Combining research in the arts, humanities, sciences and technology, MINIARE (Manuscript Illumination: Non-Invasive Analysis, Research and Expertise) currently focuses on uncovering the secrets of medieval art, but many of the imaging techniques the team is adapting could equally be used to study other types of art from a range of periods.

The project is led by Dr Stella Panayotova, Keeper of Manuscripts and Printed Books at the Fitzwilliam Museum, and Professor Stephen Elliott of the Department of Chemistry, who are working with colleagues from across the University and around the world.

“Working in a truly cross-disciplinary way can benefit art history, scientific research and visual culture in general, while pushing technology forward at the same time,” said Panayotova. “Thanks to the imaging techniques we’ve been using, we can see things in these manuscripts that we couldn’t see before.”

Much of what we know about illuminated manuscripts comes from art-historical analysis and circumstantial evidence. Since they are so delicate and the layers of pigment are so thin, manuscripts are seriously compromised by taking samples, which is common practice for the analysis of panel or fresco paintings. To gather hard evidence about how these manuscripts were made, while preserving them, non-invasive techniques are required.

“For our team, it was about finding new applications for existing techniques, and pushing them far beyond current boundaries in order to analyse the very thin layers of a manuscript,” said Elliott. “Part of our research is in the area of medical diagnostics and environmental sensing, where we analyse materials in very thin layers, which is not so different from analysing a painting. So we could certainly see what the problems were.”

Using a combination of imaging techniques, including photomicroscopy, visible and infrared imaging at multiple wavelengths, reflectance imaging spectroscopy and optical coherence tomography, the MINIARE team is able to peer through the layers of a painting to uncover its history, as in the case of the Hours of Isabella Stuart.

“We do have to adapt conventional analytical techniques to make them safe to use on something as fragile as an illuminated manuscript,” said conservation scientist Dr Paola Ricciardi. “For instance, Raman spectroscopy is a brilliant technique, but it’s a challenge to use it on a manuscript as we tend to use one-hundredth of the laser power that we would on a less fragile object.”

The technological challenge for the MINIARE team is making sure the imaging technology is non-invasive enough to keep the manuscript safe, but still sensitive enough to get an accurate result. Many of the imaging tools that the team use are in fact not cameras, but scanners that acquire a spectrum at each point as they scan an entire object. The resulting ‘spectral image cubes’ can then provide information about the types of materials that were used, as well as the ability to see different layers present in the manuscript.
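The 'spectral image cube' idea can be made concrete with a toy example. The sketch below is purely illustrative and is not the MINIARE team's software: it assigns each pixel's reflectance spectrum to the closest of a set of reference pigment spectra using the spectral angle, a standard similarity measure in hyperspectral imaging. All function names and spectra here are invented for the example.

```python
import math

def spectral_angle(spectrum, reference):
    """Angle (radians) between two spectra treated as vectors:
    smaller angle means a closer spectral match."""
    dot = sum(a * b for a, b in zip(spectrum, reference))
    norm = math.sqrt(sum(a * a for a in spectrum)) * math.sqrt(
        sum(b * b for b in reference))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def map_pigments(cube, references):
    """For each pixel spectrum in a (rows x cols x bands) cube,
    return the name of the closest reference pigment spectrum.

    cube: nested lists of per-pixel reflectance spectra.
    references: dict mapping pigment name -> reflectance spectrum.
    """
    return [
        [min(references, key=lambda n: spectral_angle(px, references[n]))
         for px in row]
        for row in cube
    ]
```

In practice each pixel would hold tens or hundreds of spectral bands rather than the three used here, and matching would be complicated by mixtures, binders and layering, but the principle — compare each pixel's spectrum against a library of known materials — is the same.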

Combining these non-invasive imaging techniques not only helps the researchers to distinguish between artists by analysing which materials they used and how they employed them, but also helps them to learn more about the technical know-how that these artists possessed.

“Many of the artists we’re looking at didn’t just work on manuscripts,” said Panayotova. “Some of them were panel painters or fresco painters, while others also worked in glass, textiles or metal. Identifying the ways in which they used the same materials in different media, or transferred materials and techniques across media, offers a whole new way of looking at art.”

For example, Ricciardi has found evidence for the use of smalt, a finely ground blue glass, as a pigment in an early 15th-century Venetian manuscript made in Murano. The use of a glass-based pigment is not unexpected given the proximity of the Murano glass factories, but this illuminator was working half a century before the earliest Venetian easel paintings known to contain smalt.

Another unexpected material that the MINIARE team has encountered is egg yolk, which was a common paint binder for panel paintings, but not recommended for manuscript illumination – instead, egg white or gum were normally used. By making a hyperspectral reflectance map of the manuscript, the researchers were able to gather information about the pigments and binders, and determine that some manuscript painters were most likely working across a variety of media.

The techniques that the team are developing and refining for manuscripts will also see application in other types of art. “All of the imaging techniques we’re using on the small scale of medieval manuscripts need to be scalable, in order that we can apply them to easel paintings and many other types of art,” said Dr Spike Bucklow of the Hamilton Kerr Institute. “It’s an opportunity to see how disciplines relate to each other.”

MINIARE (www.miniare.org) involves the Fitzwilliam Museum, Hamilton Kerr Institute, Departments of Chemistry, Physics, History of Art, History and Philosophy of Science, and Applied Mathematics and Theoretical Physics, as well as the Victoria & Albert Museum, Durham University, Nottingham Trent University, Antwerp University, Getty Conservation Institute, J Paul Getty Museum, National Gallery of Art in Washington DC and SmartDrive Ltd.

Inset image – top: Macroscopic X-ray fluorescence imaging has made it possible to confirm the presence of smalt, a cobalt-containing glass pigment, mixed with ultramarine blue in selected areas of this early 15th-century manuscript fragment painted by the Master of the Murano gradual; Left: Fitzwilliam Museum, Marlay Cutting It 18; Right: Cobalt distribution map; Credit: S. Legrand and K. Janssens, Department of Chemistry, University of Antwerp.

Inset image – bottom: Hyperspectral reflectance imaging in the visible and near-infrared range confirms evidence for the use of egg yolk as a paint binder only in figurative areas within the decorated initials in the Missal of Cardinal Angelo Acciaiuoli, painted in Florence ca. 1404; Left: Fitzwilliam Museum, MS 30, fol 1r (detail); Centre: RGB composite obtained from the hyperspectral image cube; Right: egg yolk distribution map, showing its use to paint the figure of Christ with the exclusion of his ultramarine blue robe; Credit: J. K. Delaney and K. Dooley, National Gallery of Art, Washington DC.

Scientific imaging techniques are uncovering secrets locked in medieval illuminated manuscripts – including those of a thrifty duke.

Identifying the ways in which they used the same materials in different media, or transferred materials and techniques across media, offers a whole new way of looking at art
Stella Panayotova
Francis I of Brittany 'regifted' the Book of Hours to his second wife Isabella after having his first wife painted over



Minimising ‘false positives’ key to vaccinating against bovine TB

Cows in a field

Using mathematical modelling, researchers at the University of Cambridge and Animal & Plant Health Agency, Surrey, show that it is the specificity of the test – the proportion of uninfected animals that test negative – rather than the efficacy of a vaccine, that is the dominant factor in determining whether vaccination can provide a protective economic benefit when used to supplement existing controls.

Bovine TB is a major economic disease of livestock worldwide. Despite an intensive, and costly, control programme in the United Kingdom, the disease continues to persist. Vaccination using the human vaccine Mycobacterium bovis bacillus Calmette-Guérin (BCG) offers some protection in cattle, but is currently illegal within the European Union (EU) due to its interference with the tuberculin skin test. This test is the cornerstone of surveillance and eradication strategies and is used to demonstrate progress towards national eradication and as the basis of international trade in cattle.

The current tuberculin skin test has a very high estimated specificity of over 99.97%, meaning that fewer than three uninfected animals in 10,000 will falsely test positive. As carried out in Great Britain, however, the test is thought to have at best 80% sensitivity – a measure of how many infected animals correctly test positive – missing around one in five bovine TB-infected cattle. It is used to determine whether animals, herds and countries are officially free of bovine TB.

Vaccinated animals that test positive have to be treated as infected animals. Under European law, if an animal tests positive, it must be slaughtered. The remaining herd is put under movement restrictions and tested repeatedly using both the skin test and post-mortem examinations until it can be shown to be officially clear of infection. The duration of movement restrictions is important because of the considerable economic burden they place on farms. The cost to the UK government alone, which depends on the number of visits to farms by veterinarians, tests carried out and compensation for the slaughter of infected animals, is estimated at up to £0.5 billion over the last ten years.

For vaccination to be feasible economically and useful within the context of European legislation, the benefits of vaccination must be great enough to outweigh any increase in testing. A new generation of diagnostic tests, known as ‘Differentiate Vaccinated from Infected Animals’ (DIVA) tests, opens up the opportunity for the use of BCG within current control programmes.

The EU has recently outlined the requirements for changes in legislation to allow cattle vaccination, and a recent report from the European Food Safety Authority emphasised the importance of demonstrating that BCG is efficacious and that DIVA tests can be shown to have a sensitivity comparable to tuberculin testing in large-scale field trials. However, this report overlooked a key factor: the currently viable DIVA tests have a lower specificity than tuberculin testing. This could leave vaccinated herds unable to escape restrictions once a single test-positive animal has been detected, because the more times a herd is tested, the more likely the test is to record a false positive.

In the study published today, the researchers from Cambridge and the Animal & Plant Health Agency used herd-level mathematical models to show that the burden of infection can be reduced in vaccinated herds even when DIVA sensitivity is lower than that of tuberculin skin testing – provided that the individual-level protection is great enough. However, to see this benefit of vaccination, the DIVA test will need to achieve a specificity of greater than 99.85% to avoid increasing the duration of breakdowns and the number of animals condemned during them. Data from BCG-vaccinated cattle, and from BCG-vaccinated cattle experimentally infected with M. bovis, suggest that this specificity could be achieved with a relative DIVA test sensitivity of 73.3%.
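A back-of-envelope calculation – not the authors' herd model – shows why specificity dominates at the herd level: a herd is declared clear only if every animal tests negative, so even a tiny per-animal false-positive rate compounds across the herd and across repeated test rounds. The herd size below is illustrative; the specificity values are those quoted in the text (assuming, for simplicity, independent tests).

```python
def p_false_alarm(n_animals, specificity):
    """Probability that at least one uninfected animal tests positive
    in one whole-herd test round, assuming independent tests."""
    return 1 - specificity ** n_animals

# Compare the skin test's specificity (99.97%) with the 99.85%
# threshold from the study and a hypothetical 99% DIVA test,
# for a fully uninfected herd of 100 cattle.
for spec in (0.9997, 0.9985, 0.99):
    print(f"specificity {spec:.2%}: "
          f"P(false alarm per round, 100 cattle) = "
          f"{p_false_alarm(100, spec):.1%}")
```

At 99.97% specificity a 100-animal herd has roughly a 3% chance of a false alarm per round; at 99% it is over 60%, which is why a modest-looking drop in specificity can trap herds in repeated rounds of restrictions.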

However, validating a test to such a high specificity will likely prove a challenge. Currently, there is no gold-standard test to diagnose TB in cattle. Cattle that test positive are slaughtered immediately and therefore have rarely developed any physical signs – in fact, only around half of animals examined post-mortem show physical signs of infection even when they are indeed infected.

Dr Andrew Conlan from the Department of Veterinary Medicine at the University of Cambridge says: “In order for vaccination to be viable, we will need a DIVA test that has extremely high specificity. If the specificity is not good enough, the test will find false positives, leading to restrictions being put in place and a significant financial burden for the farmer.

“But validating a test that has a very high specificity will in itself be an enormous challenge. We would potentially need to vaccinate, test and kill a large number of animals in order to be confident the test is accurate. This would be very expensive.”

The need for a better DIVA test was acknowledged by the Government at the end of last year. In a written statement to the House of Commons noting data from the University of Cambridge and Animal Health and Veterinary Laboratories Agency, the Rt Hon Elizabeth Truss, Secretary of State for Environment, Food and Rural Affairs, said: “An independent report on the design of field trials of cattle vaccine and a test to detect infected cattle among vaccinated cattle (DIVA) shows that before cattle vaccination field trials can be contemplated, we need to develop a better DIVA test.”

The study was funded by the Department for Environment, Food and Rural Affairs (Defra) in the UK.

Reference
Conlan, AJK et al. Potential benefits of cattle vaccination as a supplementary control for bovine tuberculosis. PLOS Computational Biology; 19 Feb 2015

New diagnostic tests are needed to make vaccination against bovine tuberculosis (bovine TB) viable and the number of false positives from these tests must be below 15 out of every 10,000 cattle tested, according to research published today in the journal PLOS Computational Biology.

Validating a test... will in itself be an enormous challenge. We would potentially need to vaccinate, test and kill a large number of animals in order to be confident the test is accurate
Andrew Conlan
Cows



Acting ‘out of character’ in the workplace


We are often typecast as introverts and extroverts. People do have biological propensities to behave in certain ways; some of us are naturally more talkative and sociable while others prefer more time alone. But, according to Canadian-born research psychologist Professor Brian Little, our traits are by no means fixed. Little is now collaborating with Cambridge University PhD candidate Sanna Balsari-Palsule on an in-depth study of 'free-traits'.

In his new book Me, Myself and Us: The Science of Personality and the Art of Well-Being, Little suggests that we are often able to override our biological make-up through the adoption of free-traits which allow us to act in different ways to our natural selves. We call on these free-traits to meet the demands of different situations and achieve projects and goals that are important to us.

Little recommends that we might usefully think of ourselves as amateur scientists. We are continually exploring and testing the world around us to discover what works and what doesn’t. We do things, say things, and then we observe the reactions and unconsciously store the results. We apply what we learn from our ‘experiments’ to the advancement of what Little describes as our ‘personal projects’ – a description he devised back in 1983 to describe the goals and pursuits that underlie people’s behaviour.

The personal projects in question might be big ones (such as career ambitions) or small ones (like cleaning the car) but they form the bedrock of our day-to-day behaviour and our relationships with our friends, family and workmates. Sometimes our personal project pursuit requires us to engage in free-traits; other times, we can just be ourselves. Little proposes that the successful pursuit of ‘core projects’ that are meaningful, manageable, supported by others and generate positive feelings can greatly impact our happiness and the quality of our lives.

Since 2010, Little has lectured in the Department of Psychology and at Cambridge Judge Business School. The course he teaches is based on his lifetime's research, covered in Me, Myself and Us, and offers undergraduate, graduate and executive MBA students the chance to reflect on their own personality. In 2011, Little taught a group of graduate students that included Sanna Balsari-Palsule.

“I loved the idea that acting is not something restricted to the stage, but that we are so often faced with the need to perform in daily life. With the amount of time spent in our jobs, our occupations hold such a prominent place in our lives. In an ideal world, one’s job would fit one’s traits perfectly, but that’s very rarely the case. As so much can hinge on how we behave with others in the workplace, I became fascinated with exploring what happens when people push the limits of their ability to act out of character. Do they experience detriments in their well-being or work performance and does this increase their chances of burnout?” said Balsari-Palsule.

In collaboration with Little, Balsari-Palsule has been conducting projects that explore the experiences of employees in organisations. Initial results from the first stage of research, in a large marketing company, are intriguing. The findings suggest that extroverts initially have an advantage over introverts in getting noticed and promoted more rapidly. However, when introvert employees higher up in the organisation act out of character and become extroverted ('pseudo-extroverts'), their performance ratings equal those of extroverts, and they do not report feeling drained.

Little and Balsari-Palsule offer an explanation: introvert employees make frequent use of ‘restorative resources’. These are spaces in the workplace designed to allow employees to read quietly or simply relax in order to recover their equilibrium after a strenuous session of acting out of character that would otherwise drain their energy. However, if the same employees were expected to act out of character for more prolonged periods, without the chance to recover, the benefits could quickly turn into costs.

In the same study, however, extroverts report strikingly different, and much less rewarding, experiences of acting out of character. It appears that more outwardly confident personality types find it extremely hard – and stressful – to rein back their personalities and act as if they were introverted (‘pseudo-introverts’).

“We found this difference was most common among younger employees. It may be that introverts are generally so accustomed to acting extrovertly in situations outside of the workplace that it becomes a relatively easy force of habit, particularly in Western cultures where extroversion is often highly valued. On the other hand, extrovert employees at the beginning of their careers are much less used to being isolated in an office for long periods of time, so may feel like caged animals, needing to feed off the energy of others in order to thrive,” said Balsari-Palsule.

In the second stage of research, Balsari-Palsule is looking into the idiosyncrasies of people's work projects and how the work environment plays a vital role in supporting or, in some cases, constraining them. For example, highly competitive work environments that place strict demands on employees to conform to certain types of behaviour may leave little time for employees to pursue their personally important and valuable core projects, which could eventually be detrimental to their well-being. She expects that a closer look at how different factors in the work environment interact with how people behave will shed more light on when the costs of acting out of character outweigh the benefits.

The practical implications of this research are numerous. Balsari-Palsule suggests that it would serve employers well not to disregard the costs of free-trait acting, as compromised psychological well-being and physical health can quickly translate into costly reductions in productivity and performance, and increases in absenteeism. Instead, organisations should adopt policies and build work environments that support free-trait expression while also providing spaces for people to be themselves.

She said: “Management should rely less on handing out personality questionnaires that pigeonhole employees into introvert and extrovert categories, but instead be aware of the powerful driving force of core projects on personality in the workplace.”

Look around your workplace – and ask yourself which colleagues you'd describe as extrovert and which as introvert. Perhaps your most talkative workmate is actually an introvert? Research by Sanna Balsari-Palsule, a PhD candidate in the Department of Psychology, investigates the ways in which people act 'out of character' – and how the consequences play out in the workplace.

In an ideal world, one’s job would fit one’s traits perfectly, but that’s very rarely the case.
Sanna Balsari-Palsule
Creative Company Conference 2011



New Technology Centre Announced For Cambridge Science Park


The Prime Minister, David Cameron, yesterday announced a £4.8 million partnership to build a new technology centre at the Cambridge Science Park.

The joint investment between Trinity College and the Department for Business, Innovation and Skills is part of plans to build the Sir John Bradfield Centre in the heart of the Science Park.

Trinity College, the college of Sir Isaac Newton and many other distinguished scientists, has long been at the centre of scientific innovation at the University of Cambridge.

The College was an early promoter of technology transfer to industry with the development of the Cambridge Science Park, which is now occupied by more than 90 companies with some 5,000 employees.

The College would like to do more to translate Cambridge research into companies and products, particularly by supporting very early-stage companies.

It is known that science incubators can help in these early stages, in particular by providing teams and start-up companies with flexible and affordable space, education, mentoring and finance. It is expected that these companies will thrive in the self-sustaining entrepreneurial culture of the new centre and the Science Park.

Sir Gregory Winter, Master of Trinity College said: “Trinity College is pleased to help on all these fronts by providing a highly flexible building at the heart of the Science Park, and working with other partners to help with education, mentoring and seed financing.

“We hope to promote a culture in which we not only help to develop technologies and companies, but also the entrepreneurs who will build the industries of the future.

“We are particularly pleased to associate this building with Sir John Bradfield, former Senior Bursar of the College, who was instrumental in the creation of the Cambridge Science Park.”

The Sir John Bradfield Centre will be built in the heart of the Science Park thanks to a £4.8 million investment.

We hope to promote a culture in which we not only help to develop technologies and companies, but also the entrepreneurs who will build the industries of the future.
Sir Gregory Winter, Master of Trinity College
More information:
  • Sir John Bradfield 1925–2014: One name alone is synonymous with the foundation of Cambridge Science Park: Sir John Bradfield, Senior Bursar of Trinity College from 1956 to 1992. Right from the start, Sir John saw that establishing and developing links between the University and hi-tech tenants was critical to the success of Cambridge Science Park. His fascination with science and technology was deep-rooted: he won a scholarship to study natural sciences at Trinity College in 1942 and became a research fellow in zoology in 1947. On 13 October 2014, Sir John passed away at Trinity College, on his way to a Cambridge Science Park Forum.
  • The proposed Technology Centre will be a new building on a 1.25 acre site providing a net internal floor area of 36,000 sq ft over three floors. Two thirds of the building will be innovation and lab space. In addition, about 6,000 sq ft will provide space for a café, retail unit, conference and meeting rooms and communal networking area.



Subterfuge, double agents and viruses

Smallpox pustules

This is a tale of subterfuge and double agents, of armed struggle against an invading force and of defensive weapons being turned against their makers. But these events are happening much closer to home than you could ever imagine – these are battles fought every day within our very own bodies.

The protagonists in this struggle are viruses, which seek to exploit the resource-rich environment of our cells, and every single cell in our body, whose survival depends on its ability to repel these viral invaders.

Viruses are masters of subterfuge, blagging their way into cells and once inside hijacking the cell’s resources in a relentless quest to generate more copies of themselves. Cells possess innate defences against viral colonisation: they can raise the alarm, warning nearby cells when they have succumbed to infection, and can sacrifice themselves, depriving the virus of a host. But recent work from the Department of Pathology in Cambridge has revealed that poxviruses have acquired genes that once formed part of the host cell’s immune defences and have turned them to serve the virus’s own ends, acting as double-agents to deceive our immune system.

When it comes to the subtle art of immune evasion poxviruses are undoubtedly masters. The most notorious poxvirus is variola virus, which caused smallpox until its successful eradication in 1979 following a heroic immunisation campaign by the World Health Organisation. The DNA genomes of poxviruses contain roughly 200 genes. While this is many more than Ebola virus, which gets by with just seven genes in total, it is dwarfed by the 20,000 or so genes that make up you and me.

The poxvirus genome encodes all the components necessary for poxviruses to invade cells, make abundant copies of themselves, and ensure these copies escape the infected cells so they can go off and colonise further cells. However, half of these 200 genes exist for the sole purpose of thwarting the body’s attempts to prevent infection or to limit the damage caused once infection is established.

Perhaps the most dramatic response of our cells to viral infection is to self-destruct, depriving the virus of a host and thus stopping the infection from spreading. This selfless act is called “apoptosis”. The triggering of apoptosis is regulated in our bodies by a finely tuned system of pro-death and pro-survival molecules. Unsurprisingly, viruses want to defuse the cell’s apoptosis bomb and make sure host cells survive long enough for the virus to replicate. When my colleagues determined the three-dimensional structure of the poxvirus protein called N1 they saw that it looked almost identical to the cellular pro-survival molecules that regulate apoptosis. Our studies showed that N1 is indeed a viral ‘apoptosis bomb disposal expert’ – it functions as a pro-survival gene, ensuring that cells refrain from self-destruction even when they know that they are infected.

But inhibiting apoptosis is not the only means by which poxviruses deter the host’s attempts to limit viral infection. When cells realise they have become infected by a virus they produce messenger molecules to warn nearby cells of the imminent danger. They do this by turning on transcription factors—regulators of gene activity. One transcription factor, NF-κB, is the Big Red Button of cellular signalling: when this is switched on the full armada of the body’s immune defences is recruited to the site of infection. It is roughly equivalent to the cell calling in air support.

Recently we solved the three-dimensional structure of a poxvirus protein called A49 that jams the NF-κB signal. Amazingly, we found that A49 also looks very similar to the cellular pro-survival molecules that regulate apoptosis, despite having a completely different function in cells. Furthermore, we also found that the protein we’d previously studied, N1, is able to jam the NF-κB signal in addition to its ability to block apoptosis.

Our findings with N1, A49 and other poxvirus proteins led us to wonder: is the similarity of these proteins to cellular pro-survival molecules more than just a coincidence? We know that the structures of proteins are conserved when those proteins descend from a common ancestor gene; it’s like the facial similarities you’d expect if looking through a family photo album.

To test whether the poxvirus proteins share a common ancestor, we systematically compared them with all the cellular pro-survival molecules that regulate apoptosis. We found that all poxvirus proteins do indeed lie on a single branch of this evolutionary tree, distinct from the branches containing host-cell proteins. This suggests that an ancestral poxvirus first acquired from its host a gene that had pro-survival activity against apoptosis, and that over the course of the evolutionary struggle between poxviruses and their unwelcoming hosts the virus has duplicated this gene and adapted it to have additional, useful functions.

Our work emphasises how evolutionary pressure can very finely tune a virus to the immune system of its host. Poxviruses have evolved multiple specialised genes to block the various mechanisms our cells use to respond to infection or warn nearby cells of the danger. But such specialisation brings restrictions: unlike generalist viruses such as rabies, which infects a whole slew of warm-blooded mammals, the host range of the smallpox virus was limited to humans – a fact that allowed its successful eradication.

Just as a master strategist will learn from his enemy, so medical researchers may learn from the evolved specialisation of pathogens like poxviruses. By investigating the manner in which poxviruses evade the host immune system, learned by trial and error over the course of thousands of years in their struggle against our bodies’ immune defences, we will gain unexpected insights into how our immune systems work, which may lead to the development of new drugs. What is more, we might also find viral genes that could be used to rein in hyperactive immune systems that cause autoimmune diseases like arthritis. Viruses are not the only ones who can turn genes into double-agents.

Dr Stephen Graham is a Sir Henry Dale Fellow, funded by the Wellcome Trust and the Royal Society, in the Department of Pathology at the University of Cambridge.

Reference
Vaccinia Virus Protein A49 is an Unexpected Member of the B-cell Lymphoma (Bcl)-2 Protein Family

Every moment of every day, our immune systems are battling to keep us healthy against an onslaught from invading organisms. But some of these invaders have evolved to use our very defences against us, writes Dr Stephen Graham, a Sir Henry Dale Fellow.

Viruses are masters of subterfuge, blagging their way into cells and once inside hijacking the cell’s resources in a relentless quest to generate more copies of themselves
Stephen Graham
A human hand with smallpox pustules. Coloured etching by W.T. Strutt.



European alternative finance market could top €7 billion in 2015


The European online alternative finance market grew by 144% last year, and could reach €7 billion in 2015, according to a new report produced by the Centre for Alternative Finance at University of Cambridge Judge Business School and professional services organisation EY (formerly Ernst & Young).

The report includes input from 14 key national or regional industry associations, as well as 255 leading platforms in Europe, covering an estimated 85%-90% of Europe’s online alternative finance market. While previous studies have charted alternative finance in the UK, this report is the first to cover other European countries in detail.

Seen until recently as a niche activity, online alternative finance, including equity-based crowdfunding and peer-to-peer business lending, has become an increasingly commonplace source of essential funding for many businesses throughout Europe, says the report.

Online alternative finance grew across Europe from €1.21 billion in 2013 to €2.96 billion in 2014. The overall European alternative finance industry is on track to grow beyond €7 billion in 2015 if market fundamentals remain sound and growth continues at current rates.
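The €7 billion projection follows from a simple compound-growth calculation. A minimal sketch using the report's rounded volumes (the report's own 144% growth figure presumably comes from unrounded data; the 2015 number is a projection, not a report result):

```python
# Reported European online alternative finance volumes (EUR billions)
vol_2013 = 1.21
vol_2014 = 2.96

# Year-on-year growth rate implied by the 2013 and 2014 volumes
growth_2014 = vol_2014 / vol_2013 - 1

# Projected 2015 volume, assuming 2014's growth rate simply repeats
vol_2015_projection = vol_2014 * (1 + growth_2014)

print(f"2014 growth: {growth_2014:.0%}")                    # roughly 145% on these rounded figures
print(f"2015 projection: EUR {vol_2015_projection:.1f}bn")  # just over EUR 7bn
```

On these rounded inputs the projection comes out at about €7.2 billion, consistent with the report's "could top €7 billion" claim.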

The volume of online alternative business funding has been growing steadily at around 75% year on year, and the estimated number of businesses funded in this way has been growing at an even faster average rate of 133% over the last three years, reaching 5,801 companies in 2014.

“These new forms of alternative finance are growing quickly, and this growth is beginning to attract institutional investors,” said Robert Wardrop, Executive Director of the Centre for Alternative Finance at Cambridge Judge, and co-author of the new report. “Alternative finance, at least in some European countries, is on the cusp of becoming mainstream.”

“To date there has been little hard data about the extent of the industry across Europe,” said Andy Baldwin, EY Managing Partner, Europe, Middle East, India and Africa Financial Services. “This report shows that, while it is still considerably smaller than the industry in the UK, alternative finance on the continent cannot be ignored. The whole financial services industry should be watching this space with growing interest and this study will provide a valuable benchmark against which to measure future developments.”

The new report is written by Robert Wardrop, Bryan Zhang, Operations and Policy Director of the Centre for Alternative Finance, Raghavendra Rau, Sir Evelyn de Rothschild Professor of Finance at Cambridge Judge and Research Director of the new Centre, and Mia Gray, Senior Lecturer at the Department of Geography at the University of Cambridge, who has focused on alternative finance and regional economies.

The new Centre provides a disciplined research framework to support the fast-growing structures and activities of alternative finance, in order to address the growing needs of academics, policymakers, regulators and industry. The Centre plans to launch a research programme, host a Global Alternative Finance Data Depository, and organise conferences, networking events and a fellowship programme.

Moving Mainstream: The European Alternative Finance Benchmarking Report is available to download for free at www.jbs.cam.ac.uk/ccaf/movingmainstream.

Originally published on the Cambridge Judge Business School website.

The European alternative finance market - which includes crowdfunding, peer-to-peer lending and invoice trading - reached €3 billion last year and could top €7 billion in 2015, as businesses increasingly seek more efficient ways to raise funding.

Alternative finance, at least in some European countries, is on the cusp of becoming mainstream
Robert Wardrop
Financiación colectiva (cropped)


The European roadmap for graphene science and technology


In October 2013, academia and industry came together to form the Graphene Flagship. Now with 142 partners in 23 countries, and a growing number of associate members, the Graphene Flagship was established following a call from the European Commission to address big science and technology challenges of the day through long-term, multidisciplinary R&D efforts.

In an open-access paper published in the Royal Society of Chemistry journal Nanoscale, more than 60 academics and industrialists lay out a science and technology roadmap for graphene, related two-dimensional crystals, other 2D materials, and hybrid systems based on a combination of different 2D crystals and other nanomaterials. The roadmap covers the next ten years and beyond, and its objective is to guide the research community and industry toward the development of products based on graphene and related materials.

Graphene - a two-dimensional material made up of sheets of carbon atoms - and related materials are expected to revolutionise the fields in which they are applied, and they have the potential to become the materials of the 21st century. They will supplement, and at times replace, existing substances in a range of applications. In some cases, two-dimensional materials will be integrated into existing platforms in order to enhance them. For example, graphene could be integrated into silicon photonics, exploiting established technology for constructing integrated circuits.

The roadmap highlights three broad areas of activity. The first task is to identify new layered materials, assess their potential, and develop reliable, reproducible and safe means of producing them on an industrial scale. Identification of new device concepts enabled by 2D materials is also called for, along with the development of component technologies. The ultimate goal is to integrate components and structures based on 2D materials into systems capable of providing new functionalities and application areas.

Eleven science and technology themes are identified in the roadmap. These are: fundamental science, health and environment, production, electronic devices, spintronics, photonics and optoelectronics, sensors, flexible electronics, energy conversion and storage, composite materials, and biomedical devices. The roadmap addresses each of these areas in turn, with timelines.

Research areas outlined in the roadmap correspond broadly with current flagship work packages, with the addition of a work package devoted to the growing area of biomedical applications, to be included in the next phase of the flagship. A recent independent assessment has confirmed that the Graphene Flagship is firmly on course, with hundreds of research papers, numerous patents and marketable products to its name.

Roadmap timelines predict that, before the end of the ten-year period of the flagship, products will be close to market in the areas of flexible electronics, composites, and energy, as well as advanced prototypes of silicon-integrated photonic devices, sensors, high-speed electronics, and biomedical devices.

"This publication concludes a four-year effort to collect and coordinate state-of-the-art science and technology of graphene and related materials," says Andrea Ferrari, director of the Cambridge Graphene Centre, and chairman of the Executive Board of the Graphene Flagship. "We hope that this open-access roadmap will serve as the starting point for academia and industry in their efforts to take layered materials and composites from laboratory to market." Ferrari led the roadmap effort with Italian Institute of Technology physicist Francesco Bonaccorso, who is a Royal Society Newton Fellow of the University of Cambridge, and a Fellow of Hughes Hall.

"We are very proud of the joint effort of the many authors who have produced this roadmap," says Jari Kinaret, director of the Graphene Flagship. "The roadmap forms a solid foundation for the graphene community in Europe to plan its activities for the coming years. It is not a static document, but will evolve to reflect progress in the field, and new applications identified and pursued by industry."

Europe's Graphene Flagship lays out a science and technology roadmap, targeting research areas designed to take graphene and related two-dimensional materials from academic laboratories into society.

This publication concludes a four-year effort to collect and coordinate state-of-the-art science and technology of graphene and related materials
Andrea Ferrari
Graphene flowers


Police use of force: White House told US must learn from UK


One of the world’s leading criminologists will today tell the Presidential task force on 21st century policing that the United States needs to look to the policing policies and practices of the United Kingdom in order to significantly reduce the levels of deadly force used by and against US police.

Cambridge Professor Lawrence Sherman will say that the professional policing structures of England and Wales, developed since 1856, provide the best possible model for reform of US policing at state level, where the legal powers of policing for most US crimes lie.

He believes that individual states should adopt such England-Wales institutions as a Chief Inspector of Constabulary, an independent police complaints commission, and much larger minimum staffing sizes for police forces, which – combined with national initiatives such as the England-Wales register of dismissed officers – would boost policing legitimacy and help to bring down the high annual rate of so-called ‘justifiable’ homicide committed by US police.

Sherman, Director of the Institute of Criminology at the University of Cambridge, an expert in evidence-based policing and a native of New York State, will tell the task force that it provides the American people with “the first opportunity in a generation” to rethink fundamental approaches to policing in the US, and that the English-Welsh system is the “best bet we have for the 21st century”.

“Few if any nations have achieved more public safety with less police use of force or deadly force than England and Wales,” Sherman will say. “American policing in the 21st century has achieved enough to look across the world and consider whether some other systems might yield better results. The English system has produced by far the best results.” 

In 2013, at least 461 people were killed by US police in ‘justifiable’ homicides according to official FBI reports, although Sherman will say that estimates from news media reports would suggest that number was over 1,000. In the same year, the number of people in England and Wales killed by police was zero.

Sherman says that the vast national difference in the use of deadly force is not due to a lack of confrontations in which police had legal powers to kill. In London alone in 2012, police sent authorised firearms officers to 2,451 incidents, including 634 direct threats to life, and seized 416 firearms.

“The reason London’s police killed no one in these events is the result of an infrastructure of institutions and policies that is completely lacking in US policing,” Sherman will tell the White House Task Force. “My recommendations are based on 45 years of working with US police agencies and 15 years of helping to redesign the English policing infrastructure,” he will say. 

Sherman will advocate sweeping changes to US policing systems on three tiers: federal, state and local – all based on UK policing models.

Recommendations for federal government include the setting up of a National College of Policing (NCP) and a National Registry of Dismissed Officers, both of which have equivalents established since 2012 at the UK’s College of Policing, of which Sherman is a non-executive director.

The NCP would run a three-month residential course for potential police executives, with a curriculum featuring the major research evidence recommended by the National Institute of Justice, “essential knowledge for preventing crime and maintaining the legitimacy of police institutions”, says Sherman. A registry of dismissed officers should be established by the Attorney General, accessible only to police agencies conducting background investigations prior to hiring officers. 

The President should issue an executive order requiring all federal law enforcement agencies to sign up to a British-style proportionality standard for the use of deadly force. “This standard would not replace statutory or case law, but hold in situations where there is a clear risk that deadly force might become necessary, but would be disproportionally severe in relation to the reason for engagement,” says Sherman.

On a state level, additional recommendations include the establishment in each US state of an office for an Inspector General of Police (IGP). Similar to Her Majesty’s Inspectorate of Constabulary, the IGP would be empowered to observe, review records and audit all state, county and municipal police agencies – issuing public reports on the level to which they meet each state’s standards and training boards’ recommendations. The IGP would be appointed by the state governor to serve a five-year term.    

Each state should also establish an independent police complaints commission (IPCC) to investigate complaints against officers or agencies. Going beyond the powers of the IPCC of England and Wales, Sherman says such commissions in the US should have the power to dismiss a police officer from the profession on grounds of an ethical breach, even without prosecution or conviction of a crime.  

On a local level, recommendations include the merging of local forces to create police agencies with an absolute minimum of 100 employees per force. Many problems of organisational quality control are exacerbated by the tiny size of most local police forces in the US, says Sherman.

“In 2008, 73% of all US police agencies employed fewer than 25 people, and less than 1% of all 17,985 US agencies currently meet the English minimum of 1,000 employees. All US agencies should at least aim for a minimum of 100 full-time employees.”

Sherman points out that no recommendations will completely eradicate the controversies over policing a free society, and that there are deep cultural reasons for differences in policing between the US and the UK. But events in the US over the past year, and the public outrage that has ensued, risk irreparably damaging public trust in US policing institutions.
  
“As both a criminologist and a US citizen, it is clear to me that fundamental changes in our structures of policing are needed – so the question is, what changes to try? In terms of policing with public safety, the English system is the best bet we have for the 21st century.”

Cambridge criminologist tells White House task force that translating UK models of policing to US is the best hope in a generation for tackling dangerous rates of ‘justifiable’ homicides committed by US police, and the resultant haemorrhaging of police legitimacy across the nation.

As both a criminologist and a US citizen, it is clear to me that fundamental changes in our structures of policing are needed
Lawrence Sherman
Image from #BlackLivesMatter protest in Berkeley
