Channel: University of Cambridge - Latest news

Opinion: Why medical technology often doesn’t make it from drawing board to hospital


If there’s something wrong with your brain, how do you spot that in an MRI? Of course, if it’s something obvious, such as a major aneurysm or a tumour, anyone can see it. But what if it’s something more subtle, such as a neural pathway that is more deteriorated than normal? This might be hard to spot by simply looking at an image. However, there is a range of medical image analysis software that can detect something like this.

You may take a diffusion-weighted MRI – a type of MRI that displays white matter extremely well (think of white matter as the neural roadways that connect areas of grey matter). Then, after processing that MRI, you can use tractography to view the white matter road system as a 3D computer model. You can then measure deterioration across these white matter pathways by looking at a measurement called fractional anisotropy. After someone uses a software tool to bend the image of your brain to a standard shape, its fractional anisotropy can be compared to a database of hundreds of other diffusion-weighted MRIs to find any abnormalities.
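For readers curious what that processing looks like in practice, here is a minimal, hedged sketch of this kind of research workflow using the open-source dipy library. The file names are placeholders, and this illustrates the general approach only – it is not the specific software the article goes on to discuss.

```python
# Minimal sketch: computing fractional anisotropy (FA) from a diffusion-weighted
# MRI with the open-source dipy library. File names are placeholders; this is an
# illustration of the research workflow described above, not a clinical tool.
from dipy.io.image import load_nifti
from dipy.io.gradients import read_bvals_bvecs
from dipy.core.gradients import gradient_table
from dipy.reconst.dti import TensorModel

# Load the diffusion-weighted volume and its gradient scheme
data, affine = load_nifti("dwi.nii.gz")
bvals, bvecs = read_bvals_bvecs("dwi.bval", "dwi.bvec")
gtab = gradient_table(bvals, bvecs)

# Fit a diffusion tensor in every voxel and extract FA, a 0-1 measure often
# used as a proxy for white-matter integrity
tensor_fit = TensorModel(gtab).fit(data)
fa = tensor_fit.fa
print("Mean FA across the volume:", float(fa.mean()))
```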

But that probably won’t happen in a hospital. All of the methods described above exist in the research world – but in the clinical world, a radiologist will likely just eyeball your MRI and make a diagnosis based on that.

Why? One major reason is that this software is only really usable by experts in the research world, not clinicians. The incentives to create medical imaging software are considerable, but the incentives to improve it to a final product – in the way that, say, Microsoft is organically inclined to improve its products after user feedback – are nonexistent. If your operating system crashes, Microsoft has the resources and infrastructure to debug it and put that into the next release. But science labs are barely inclined, funded, or skilled enough (from a software engineering perspective) to improve their software after an initial release.

Beta or worse

FSL, a software toolbox created by Oxford for analysing MRIs, is one of the best of its kind out there. Virtually every medical imaging researcher uses it – and yet, with a graphical user interface that resembles something from the 1990s and is plagued with hard-to-follow acronyms, it would be impossible or dangerous for the average clinician to use without six months of training. Most researchers don’t even bother with its graphical user interface, even when they’re just starting to learn it, instead opting to use it as a command-line tool (that is to say, entirely text-based – virtually all computers were like this before the early 1980s). This is not an anomaly. AFNI, another image analysis package, is worse; upon start-up, five windows pop up immediately and start flashing like an old GeoCities site.

FSL’s graphical user interface.

In some cases, problems with research software can go beyond the user’s learning curve – and those problems are less obvious and more dangerous when not caught. Both AFNI and FSL suffered from a bug that risked invalidating 40,000 fMRI studies from the past 15 years. Such bugs, unaccounted for, could further inhibit the potential use of research software in clinics.

No business model

Why do people still use this software despite all this? Well, in the vast majority of cases, these tools do work. But the incentives to improve them enough to make them easier to use – or simply to make better products – are not there. Apple makes a profit from its computers and so is inclined to constantly improve them and make them as easy to use as possible. If it doesn’t do this, people will just go to Windows, a product that essentially does the same thing.

But while you or I will pay money for a computer, FSL is open source and scientists pay nothing for it. The monetary incentive, rather, comes from the publications resulting from using this software – which can lead to grant funding for further research.

Neurovault, an open library for MRIs used in previous research studies, requests in its FAQs that researchers cite the original paper about Neurovault if they make any new discoveries with it, so that Neurovault can obtain more grant money to continue its work. Databases such as Neurovault are excellent and very necessary initiatives – but do you notice Google asking people to cite its original Pagerank algorithm paper if they make a discovery from Google?

The five windows that make up AFNI’s graphical user interface.

Even Kitware, a medical imaging software company, is totally open-source and makes its money from grants and donations, rather than by selling its software and actively seeking feedback from users. Kitware has better graphical user interfaces, but it’s still essentially a research company; most of its tools would still not be suitable to use without several months of specialised training.

Medical imaging is a brilliant field filled with brilliant minds, but the incentives to drive its proofs of concept into finished products, suitable for use by clinicians instead of just researchers, are not yet in place. While this remains the case, the road from the lab to the hospital will remain blocked.

Matthew Leming, PhD candidate in Psychiatry, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Medical imaging is a brilliant field filled with brilliant minds, writes Matthew Leming, a PhD candidate in Psychiatry, for The Conversation. So why don’t we see more new technologies making it into hospitals?

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Sharpening our knowledge of prehistory on East Africa’s bone harpoons


East Africa is the epicentre of human evolution and its archaeological remains offer the potential to fill gaps in our understanding of early modern humans from their earliest origins, around 200,000 years ago, through to the most ‘recent’ prehistory of the last 10,000 years.

The In Africa project, directed by Dr Marta Mirazón Lahr, co-founder of the Leverhulme Centre for Human Evolutionary Studies at the University of Cambridge, is seeking to do exactly that. The group believes that, in East Africa, key ecological and cultural conditions converged, which allowed modern humans to evolve new behaviours and technologies to better exploit the natural resources that they found around them.

For the past five years, the team has been working on the palaeoshores of Lake Turkana in Kenya, which has offered significant insights into how people there made use of aquatic resources such as fish and shellfish, something which is seen as a marker of human modernity.

Dr Alex Wilshaw, in Cambridge's Department of Biological Anthropology and a fellow of St John’s College, is a Research Associate on the project. “Looking at prehistoric tools and technology is a key way of exploring when and how the cultural and behavioural traits associated with modern humans were developed,” he explains.

“The area around Lake Turkana is extraordinarily rich not just in fossils, but also in artefacts used to exploit the ecology of the area. In the case of aquatic resources from the lake, these artefacts are often harpoons or points made from bone. While previous archaeological projects have led to pockets of harpoon discovery, the extent of this project has afforded us the opportunity to collect unprecedented numbers of bone harpoons – to date, we have over 500 from 20 different sites.”

Mirazón Lahr and Wilshaw are now preparing a monograph cataloguing and describing the harpoons to give a clearer picture of the diversity that exists within the collection.

“Together, the harpoons have the potential to offer a spatial and temporal cross-section of the activities of early modern humans in the area and tell us something about functional and stylistic changes in technology,” Wilshaw says. “The sites contain artefacts from groups who lived at different times and if we look at the harpoons in detail, their distinct styles show signs of variation among different populations and could offer clues about the appearance and disappearance of diverse groups as the lake levels rose and fell over time.”

The harpoons range in date from around 13,000 years ago – late in the geological epoch known as the Pleistocene – to around 6,000 years ago, the middle of the current geological epoch known as the Holocene. The researchers used radiocarbon and other dating techniques on samples of shell and sediment surrounding the harpoons to place them in time.

While some of the harpoons were sharpened into elongated spears or barbed points, others look more like hooks. Some have been decorated and polished. “There is some discussion over what the harpoons were used for, but we think it is likely to have been fishing, rather than hunting of land animals, as they were all discovered on the lake edge,” Wilshaw explains. “The harpoons would have been attached to a pole or haft and connected using twine or string which then enabled the hunter-fishers to spear their prey and then pull in their catch. There are some huge species of fish native to this area and some of the bigger and thicker harpoons may have been used to catch species like Nile Perch, which can grow up to two metres long. It is possible that the groups were using them to hunt hippo, which were also common in the area.”

The research team focused their efforts on recovering remains from across an extensive landscape exhibiting the remnants of the lake edge and its surrounding flood plain. Many animal and human remains were fossilised and preserved in mud and sediment on the shores of the lake, but as the lake shrank and the environment became increasingly dry, the wind and rain eroded the surface and exposed the fossils.

This phenomenon led the group to the discovery not just of the bone harpoons, but also of many other prehistoric human remains and artefacts. Such fossilised bones protruding from the earth led to the remarkable discovery, published last year in Nature, of the remains of a group of hunter-gatherers who were brutally massacred around 10,000 years ago at the site of Nataruk – the earliest record of inter-group violence among prehistoric nomadic people.

The researchers are hoping to win further funding to unlock more of the secrets of East Africa’s prehistoric harpoons.

“Some appear to have been carved from bone, some from ivory and others from horn, but we would like to do a more detailed analysis of what they were made out of and whether there was a preference for material,” adds Wilshaw. “Searching for patterns in functionality could reveal whether design and material varied for different prey and how creative the people were being with technology. Interestingly, some of the harpoons also look as if they have been polished, and residue analysis could tell us what people were using to care for their tools.”

The In Africa project, which was funded by the European Research Council, aims to use its fossils and archaeological discoveries to enhance international awareness of the role of Africa in the evolution of human diversity.

“The harpoons are the iconic remains of a people who have disappeared,” says Mirazón Lahr. “When they lived, Lake Turkana was much larger and the environment much richer. These discoveries allow us to track their lives, from when the lake rose as the ice age ended to the point where the lake shrank and desert conditions set in – bringing an end to the tradition that had lasted thousands of years and about which very little was previously known.”

Inset images from the In Africa project.

To keep up to date with the latest stories about Cambridge’s engagement with Africa, follow #CamAfrica on Twitter.

 

A project exploring the role of East Africa in the evolution of modern humans has amassed the largest and most diverse collection of prehistoric bone harpoons ever assembled from the area. The collection offers clues about the behaviour and technology of prehistoric hunter-gatherers. 

"There are some huge species of fish native to this area and some of the bigger and thicker harpoons may have been used to catch species like Nile Perch, which can grow up to two metres long."
Alex Wilshaw
Harpoons discovered by the In Africa project

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


King’s College Chapel, 21 pianos, one very special performance


Taking place in King’s College Chapel on Tuesday (February 21) at 10pm, the Nocturne for 21 Pianos is a collaboration between composer and King’s College Fellow in Music Richard Causton, the Peterborough and Saffron Walden Centres for Young Musicians, King’s College Musical Society and Millers Music.

The pianos, donated by Cambridge music shop Millers Music and worth more than £50,000, will then be gifted to local schools and institutions, who are being encouraged to apply to receive one of the instruments.

A reworking of Chopin’s original Nocturnes, the concert will see 21 local young musicians play 21 pianos simultaneously. With the pianos arranged in a large circle in the Chapel, it will be both a visual and aural spectacle, with a previous performance in 2010 described as “eerie, ethereal and enchanting” by The Times.

“This is a unique event for King's College Chapel and the sound and sight of 21 pianos in this wonderful space promises to be really memorable,” said Causton.

“As a child I studied at the Centre for Young Musicians, and I am very happy that pianists from the Peterborough and Saffron Walden branches of CYM will be joining forces with Cambridge University students for this very special performance. It’s a fantastic chance to play in such an awe-inspiring space.”

All 21 pianos have been provided by Millers Music, in celebration of its 160-year anniversary, and its Norwich-based sister store Cookes Pianos, for its 130-year anniversary.

After the event, Millers will gift the pianos to schools and institutions across East Anglia, based on applications received via its website www.millersmusic.co.uk/21pianos. Submissions are now open, and those who apply will need to state why they believe their institution would benefit from a piano.

Entries will be reviewed by a panel of judges, including Richard Causton and Millers managing director, Simon Pollard.

Pollard said: “We’re thrilled to be collaborating with such a prestigious university that celebrates music education. As the oldest music shop group in the UK, we are dedicated to encouraging more young people in the region to embrace music, and gifting these pianos to local institutions does just that.”

The 21 Piano Nocturne concert is part of the Chapel Lates concert series, which Causton also curates. Attendees must arrive at 9.45pm for a 10pm start, with an estimated finish time of 10.50pm. Tickets are priced at £10 (concessions £5 and King’s members £2) and are available to buy here and at the King’s College Visitors Centre.

Schools, community centres, churches and other education institutions in East Anglia are eligible to apply to receive a piano. The closing date for applications is Sunday, March 12.

One of the UK’s most iconic buildings will resonate to the sound of 21 pianos on Tuesday evening as part of a unique event involving a Cambridge composer, students, and young musicians from around Cambridgeshire.

This is a unique event for King's College Chapel and the sound and sight of 21 pianos in this wonderful space promises to be really memorable.
Richard Causton
King's College Chapel

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Mapping the family tree of stars


It was Charles Darwin who, in 1859, published his revolutionary theory that all life forms are descended from one common ancestor. This theory has informed evolutionary biology ever since, but it was a chance encounter between an astronomer and a biologist over dinner at King’s College in Cambridge that got the astronomer thinking about how it could be applied to stars in the Milky Way.

Writing in Monthly Notices of the Royal Astronomical Society, Dr Paula Jofré, of the University of Cambridge’s Institute of Astronomy, describes how she set about creating a phylogenetic “tree of life” that connects a number of stars in the galaxy.

“The use of algorithms to identify families of stars is a science that is constantly under development. Phylogenetic trees add an extra dimension to our endeavours, which is why this approach is so special. The branches of the tree serve to inform us about the stars’ shared history,” she says.

The team picked twenty-two stars, including the Sun, to study. Their chemical elements were carefully measured from ground-based high-resolution spectra taken with large telescopes in the north of Chile. Once the families had been identified using this chemical ‘DNA’, their evolution was studied with the help of ages and kinematical properties obtained from the space mission Hipparcos, the precursor of Gaia, the European Space Agency spacecraft orbiting Earth that is almost halfway through a five-year project to map the sky.

Stars are born from violent explosions in the gas clouds of the galaxy. Two stars with the same chemical composition are likely to have been born in the same molecular cloud. Some have lifespans longer than the current age of the Universe and so serve as fossil records of the composition of the gas at the time they were formed. The oldest star in the sample analysed by the team is estimated to be almost ten billion years old, which is twice as old as the Sun. The youngest is 700 million years old.

In evolution, organisms are linked together by a pattern of descent with modification as they evolve. Stars are very different from living organisms, but they still have a history of shared descent as they are formed from gas clouds, and carry that history in their chemical structure. By applying the same phylogenetic methods that biologists use to trace descent in plants and animals it is possible to explore the ‘evolution’ of stars in the Galaxy.
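As a rough illustration of the idea – and only that, since the study itself built a formal phylogenetic tree from measured element abundances rather than using this shortcut – stars can be grouped by chemical similarity with a simple hierarchical clustering. The abundance values below are invented placeholders, not data from the paper.

```python
# Illustrative sketch only: grouping stars by chemical similarity with
# hierarchical clustering, as a stand-in for the phylogenetic tree-building
# described in the study. Abundance values are invented placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

stars = ["Sun", "Star A", "Star B", "Star C"]
# Each row: [Fe/H, Mg/Fe, Si/Fe] element-abundance ratios (made up)
abundances = np.array([
    [0.00, 0.00, 0.00],
    [-0.10, 0.05, 0.02],
    [-0.45, 0.30, 0.25],
    [-0.50, 0.28, 0.27],
])

# Build a tree: stars with similar chemistry (likely born from similar
# gas clouds) end up on neighbouring branches
tree = linkage(abundances, method="average", metric="euclidean")
dendrogram(tree, labels=stars, no_plot=True)  # set no_plot=False to draw it
print(tree)
```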

“The differences between stars and animals are immense, but they share the property of changing over time, and so both can be analysed by building trees of their history,” says Professor Robert Foley, of the Leverhulme Centre for Human Evolutionary Studies at Cambridge.

With an increasing number of datasets being made available from both Gaia and more advanced telescopes on the ground, and on-going and future large spectroscopic surveys, astronomers are moving closer to being able to assemble one tree that would connect all the stars in the Milky Way.

Paula Jofré et al. ‘Cosmic phylogeny: reconstructing the chemical history of the solar neighbourhood with an evolutionary tree’ is published by Monthly Notices of the Royal Astronomical Society. DOI 10.1093/mnras/stx075

 

Astronomers are borrowing principles applied in biology and archaeology to build a family tree of the stars in the galaxy. By studying chemical signatures found in the stars, they are piecing together these evolutionary trees, looking at how the stars formed and how they are connected to each other. The signatures act as a proxy for DNA sequences. It’s akin to chemical tagging of stars and forms the basis of a discipline astronomers refer to as Galactic archaeology.

The branches of the tree serve to inform us about the stars' shared history
Dr Paula Jofré
Image showing a family tree of stars in the solar neighbourhood, including the Sun

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Multiplier effect: the African PhD students who will grow African research


“Africa needs a million new PhD researchers over the next decade.” It’s a huge figure. Professor David Dunne uses it to explain the scale of need in Africa for a new generation of scholars who will pioneer sustainable solutions to many of the continent’s challenges.

“There are world-class academics in Africa,” he explains, “but not enough to train and mentor all the young researchers that Africa needs to maintain and accelerate its progress. This is where Cambridge and other leading international universities can help, by making expertise and facilities available to help bridge this mentorship gap.”

Dunne is Director of the Cambridge-Africa Programme, a University initiative that for the past eight years has been building collaborative links between Cambridge and Africa. The model is centred on Cambridge researchers helping to mentor young African researchers in their African universities and research Institutions. This contributes to research capacity building in Africa but also benefits Cambridge by widening the experience and opportunities for its researchers and students.

However, that stark fact remains – a great many more new researchers are needed. With this in mind, a new Cambridge-Africa PhD studentship scheme began to enrol PhD students last year from all over Africa – five per year, every year for five years. “It’s at least a beginning,” says Dunne. “We want this programme to grow in Cambridge, and other universities.”

One criterion is that the prospective student must be studying issues that are priorities for Africa. The research interests of the current students are broad: from urban growth to poverty, business associations to sustainable industries, infectious disease to post-conflict citizenship.

Taskeen Adam

Taskeen Adam is one of the PhD students. She’d worked as an electrical engineer for two years when she decided that she wanted to use her skills to bring about social change. “What attracted me to engineering was the challenge of solving technical problems. But my real passion is for humanitarian issues and the need to create quality education for all.”

In 2012, the United Nations General Assembly declared access to the internet as a basic human right. But figures from 2014 gathered for Taskeen’s home country of South Africa showed that more than 4,000 schools had no access to electricity and 77% of schools had no computers. Many thousands of children were missing out on the chance to learn the skills needed to make a better life.

Her research is enabling her to look at the educational opportunities afforded by the internet, in particular the potential of decolonised African MOOCs (Massive Online Open Courses) as a means for delivering inclusive educational programmes to the most marginalised learners in South Africa. She’s keen to develop an online educational framework adapted for, and relevant to, communities in developing countries.

Taskeen completed her first degree at the University of Witwatersrand in Johannesburg. On graduating, and while working full time, she pioneered an initiative called ‘Solar Powered Learning’ to give students in rural areas access to technology that was both low cost and environmentally friendly.

The pilot project won Taskeen accolades. She was listed among South Africa’s Mail & Guardian’s top 200 Young South Africans for 2014. This gave her the confidence to embark on a career that would use her engineering skills in ways that could help to bridge inequalities.

As part of her Master’s research, she spent two weeks in Kigali, capital of Rwanda, where she visited schools benefiting from a national scheme to equip every child with a laptop. It was clear that this commendable programme was failing to enhance learning. Although resources were being provided, there was a lack of focus on maintenance skills, curriculum integration and teacher professional development. In many cases, the children were more comfortable using the laptops than were their teachers. 

“My trip demonstrated the mismatch between the deliverables and the outcomes of the scheme. The focus was on technology deployment, rather than on improving educational attainment,” she says. “Many African governments seem to be following a similar path, and I hope that, by using the resources, networks and expertise here in Cambridge, I might eventually be able to influence policy changes at the intersection of education and technology back in Africa.”

Richmond Juvenile Ehwi

Richmond Juvenile Ehwi also hopes to take his skills and expertise back to his home country, Ghana. He has just arrived in Cambridge to start his PhD in Cambridge’s Department of Land Economy. After his first degree at Ghana’s Kwame Nkrumah University of Science and Technology, he worked as a research consultant and estate manager.

Moving to Ghana’s capital city, he became interested in the changes he saw in the property market. “Plush Western-designed detached houses, apartments and gated communities are springing up and I wondered what the future would be like for Ghana’s urban landscape. While this development mirrors Accra’s integration into the globalised city concept, accompanying this trend are social, economic, environmental and cultural costs.”

As Western lifestyles become increasingly popular, the older-style family compounds associated with traditional Ghanaian culture are declining, even in rural areas. “With literacy rates and standards of living rising, households are demanding greater privacy and better sanitation which, in most traditional compound houses, are greatly compromised,” he explains.

In the West, gated communities are often seen in a negative light: they are associated with segregation, racial polarisation and social exclusion. While accepting the realities of this criticism, Richmond seeks to facilitate a balanced discussion and inspire evidence-based planning policies.

He suggests that, as new gated residences develop in the suburbs, there can be both material and social benefits for surrounding areas. “In Ghana, the new gated communities tend to be multiracial rather than segregated according to race or nationality. The ability to pay for your house is what counts, not what you do or what your ethnicity is. Gated developments offer the security and services that most people aspire to,” he says.

Entire neighbourhoods can benefit from the expectations of the owners of the new properties, he explains: “It’s misleading to think of gated communities as isolated enclaves. People who live in them are not completely cut off from society. They travel to work, to malls and markets, to church services. These public spaces facilitate social interaction. Also, better-off households offer employment for gardeners, drivers and care givers – and help to raise incomes and opportunities.”

His long-term plan is to create an Urban Study Research Centre back in Accra, and to take back a deeper understanding of the interplay of economic factors with social and cultural issues in urban development.

The multiplier effect

Dunne points to such plans as an indicator of the promise of the Cambridge-Africa PhD studentship scheme. “We are training 25 Cambridge-Africa scholars. It’s a small number compared with the overall need. But these researchers are a starting point. They will train other researchers and the expertise will multiply back in Africa.”

He adds: “It’s not just that Africa needs research and researchers for its own use. The world needs African researchers. We can’t have a situation where 14% of the world’s population – living on a continent with unique culture, diversity and environment – contributes less than 1% of published research output. The world needs the unique knowledge and perspective that African researchers can provide to solve our shared global challenges.”

The Cambridge-Africa PhD studentship scheme is funded by the University and the Cambridge Trust.

To keep up to date with the latest stories about Cambridge’s engagement with Africa, follow #CamAfrica on Twitter.

 

Taskeen Adam and Richmond Juvenile Ehwi are part of a PhD programme that’s enrolling five African students per year for five years, to help train world-class researchers for Africa. 

The world needs African researchers. We can’t have a situation where 14% of the world’s population – living on a continent with unique culture, diversity and environment – contributes less than 1% of published research output.
David Dunne
Taskeen Adam and Richmond Juvenile Ehwi

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cocaine addiction leads to build-up of iron in brain


Cocaine is one of the most widely used illicit drugs in the Western world and is highly addictive. A report last year by the UK government’s Advisory Council on the Misuse of Drugs found that almost one in 10 of all 16- to 59-year-olds have used cocaine in their lifetime. Cocaine use was implicated in, but not necessarily the cause of, 234 deaths in Scotland, England and Wales in 2013. However, despite significant advances in our understanding of the biology of addiction – including how the brains of people addicted to cocaine may differ in structure – there is currently no medical treatment for cocaine addiction; most individuals are treated with talking or cognitive therapies.

A team of researchers led by Dr Karen Ersche from the Department of Psychiatry at Cambridge examined the brains of 44 people who were addicted to cocaine and 44 healthy control volunteers. In the cocaine group, they detected excessive amounts of iron in a region of the brain known as the globus pallidus, which ordinarily acts as a ‘brake’ for inhibiting behaviour.

Particularly striking was the fact that the concentration of iron in this area was directly linked with the duration of cocaine use – in other words, the longer that participants had used cocaine, the greater the accumulation of iron. At the same time, the increased iron concentration in the brain was accompanied by mild iron deficiency in the rest of the body, suggesting that iron regulation in general is disrupted in people with cocaine addiction.

“Given the important role that iron plays in both health and disease, iron metabolism is normally tightly regulated,” explains Dr Karen Ersche from the Department of Psychiatry. “Long-term cocaine use, however, seems to disrupt this regulation, which may cause significant harm.

“Iron is used to produce red blood cells, which help store and carry oxygen in the blood. So, iron deficiency in the blood means that organs and tissues may not get as much oxygen as they need. On the other hand, we know that excessive iron in the brain is associated with cell death, which is what we frequently see in neurodegenerative diseases.”

The Cambridge researchers now aim to identify the precise mechanisms by which cocaine interacts with iron regulation. Dr Ersche believes the most likely mechanism is that cocaine disrupts iron metabolism, possibly by reducing the absorption of iron from food and increasing the permeability of the blood-brain barrier, so that more iron enters the brain, where it can accumulate.

Although excess iron in the brain is associated with neurodegeneration, there is no suggestion that cocaine addiction leads to an increased risk of Alzheimer’s or Parkinson’s disease. The mechanism underlying the increase in iron in the brain in Parkinson’s disease, for example, is different to that in cocaine addiction, as are the affected brain regions.

As an essential micronutrient, iron can only be obtained through our diet and cannot be excreted, other than through blood loss. The researchers now want to find out whether means of correcting the disruptions in iron metabolism might slow down or even reverse the accumulation of iron in the brain, and ultimately help affected individuals to successfully recover from cocaine addiction.

This work was funded by the Medical Research Council and was conducted at the NIHR Cambridge Biomedical Research Centre and the Behavioural and Clinical Neuroscience Institute.

Reference
Ersche, KD et al. Disrupted iron regulation in the brain and periphery in cocaine addiction. Translational Psychiatry; 21 Feb 2017; DOI: 10.1038/tp.2016.271

Cocaine addiction may affect how the body processes iron, leading to a build-up of the mineral in the brain, according to new research from the University of Cambridge. The study, published today in Translational Psychiatry, raises hopes that there may be a biomarker – a biological measure of addiction – that could be used as a target for future treatments.

Given the important role that iron plays in both health and disease, iron metabolism is normally tightly regulated. Long-term cocaine use, however, seems to disrupt this regulation, which may cause significant harm
Karen Ersche
relaxing after work_MMVI

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Carol Ibe: Making training for African researchers affordable


Carol Ibe, a Gates Cambridge Scholar who was born in the USA but grew up in Nigeria, is not only doing a PhD in Plant Sciences, but is also running her own non-profit organisation to help train future African scientists and promote joined up thinking on sustainable development.

Carol set up the JR Biotek Foundation in 2013, although the idea for the organisation came to her while she was doing her first master’s in the States in 2006. A year later she launched her first training programme in biotechnology and biomedicine for students and laboratory scientists in Africa. She wanted to ensure that participants paid minimal costs so the training could be open to as many research students in Africa as possible, and so she worked with partners to keep costs down. More than 60 people from 11 African countries applied for the training workshop. Even though costs were very low, many could not attend because they lacked funding.

Carol had already completed two master’s degrees and worked for several years as both a molecular biologist and a research biologist before setting up the Foundation. Although she was working hard on the organisation, she realised she wanted to continue her academic research, so she applied to the University of Cambridge and tailored her research proposal to the work she is doing with her organisation.

“I started to think what area of training and capacity building could have the most impact in the continent,” she said. “Agriculture is key to Africa’s development because it is the largest employer of labour. Food insecurity remains a major problem. Soil conditions are deteriorating very rapidly and people are suffering on a daily basis. We need to train a new generation of scientists who can improve agricultural productivity and human health in Africa.”

Her PhD focuses on rice, the staple food of a large part of the world, and how to produce quality rice in places where there are poor soil and climate conditions. “Factors such as lack of funding and new technologies, poor infrastructure and poor market access hinder farmers from producing rice with higher yield and quality. If we can empower smallholder farmers to produce and sell more we can reduce poverty,” said Carol.

While she has been at Cambridge, Carol has been busy not just with her research but with forging partnerships which help achieve the goals she has set for the Foundation.

In April, the Foundation is holding the first African Diaspora Biotech Summit. The event will take place in Cambridge and will bring African graduate students, researchers and bio-industry leaders together to debate how research capacity, innovation and commercialisation can be strengthened across the continent.

It will bring together 70 African diaspora delegates from different disciplines and professions, including biotechnology and applied biosciences, policy, sustainable development and bio-entrepreneurship and will include a keynote address from Professor Lucy J. Ogbadu, Director-General of the National Biotechnology Development Agency at Nigeria’s Ministry of Science and Technology.

The summit will look at areas such as the role of modern bio-technologies in improving agricultural productivity and food security in sub-Saharan Africa by 2050 and the need to reform Africa’s tertiary education system to make it globally competitive. Carol says that too often previous initiatives developed outside Africa have failed to meet the need for which they are developed due to “a limited understanding of the depth of the problems facing African nations and the African people”.

In the lead-up to the summit, JR Biotek is running a Molecular Laboratory Training Workshop for Africa-based agricultural research scientists and academics. It will be held in collaboration with the University of Cambridge's Department of Plant Sciences. Eight PhD students will be awarded scholarships to attend the workshop and the Summit, funded by the BBSRC Global Challenge Research Fund and Trinity College, Cambridge. During the Summit, the Foundation will also hold the NextGen Africa Bioinnovation Pitch Competition which is designed to identify and celebrate bio-innovations made to improve lives and systems and to promote sustainable development in Africa.

Carol says: “I want to provide affordable quality training so scientists in Africa can be successful in their research projects. I know what they need because I have been there myself. I am also hoping that African governments and their development partners will start investing in research and development across all sectors in the continent, especially agriculture and healthcare because that’s how innovation, which we so desperately need in Africa, can come about.”

Do you have to choose between an academic career and activism? Gates Cambridge Scholar Carol Ibe is one of an increasing number of students choosing to keep a foot in both camps.

I want to provide affordable quality training so scientists in Africa can be successful in their research projects. I know what they need because I have been there myself
Carol Ibe
Molecular Lab Techniques Training

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cambridge celebrates ‘long-standing and deep-rooted’ relationship with India


The announcement coincides with a visit to India by Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge. Professor Borysiewicz will address alumni and donors in New Delhi at an event to celebrate the fundraising campaign for the University and its colleges. He will also reconfirm the University’s commitment to attracting the brightest and best students from India.

“I am extremely proud of Cambridge’s long-standing and deep-rooted relationship with India,” says Professor Borysiewicz. “Many of India’s leading figures – academics, scientists, industrialists and politicians – have enjoyed a Cambridge education. Together we have achieved great things, and I know that by continuing to work together we will rise to even greater heights.”

Speaking at the event, the Vice-Chancellor will announce that the University is renewing its commitment to attracting talented Indian students to study at Cambridge. From this year, admissions staff will be coming to India to visit schools and meet students face-to-face in Mumbai, Bangalore and Delhi. In the autumn, a team of academics will visit India to conduct admissions interviews, so that applicants need not travel to the UK for that part of the application process.

“We believe that diversity – of nationality, of background, and of opinion – is one of Cambridge’s greatest strengths,” he adds. “We are a University that is open to the world and must remain so.”

The centrepiece of Cambridge’s 2017 celebrations will be India Unboxed, a programme of exhibitions, events, digital engagement and installations organised by the University of Cambridge Museums and Botanic Garden. Rooted in the museum collections, the programme will explore themes of identity and connectivity for diverse audiences in the UK and India.

A series of profiles – This Cambridge-Indian Life – will look at the people at the heart of the relationship between Cambridge and India: Indian scholars and students who study at Cambridge, Cambridge researchers working in collaborations based in India, and notable Indian alumni from the University.

Throughout the year, the University will highlight key research collaborations that sit at the heart of Cambridge’s relationship with India. Cambridge leads three major joint UK-India centres: in cancer research, anti-microbial resistant tuberculosis, and crop science. It has 85 collaborative research partnerships across India in fields from the arts and humanities to entrepreneurship to the sciences and technology.

“The world today faces critical challenges – in the fields of education, energy, food security, health, and politics - to name but a few,” says Professor Borysiewicz. “These challenges are serious, complex and urgent. My deeply held conviction is that Cambridge has a responsibility to address these challenges. We know we cannot solve any of these problems in isolation and are working with partners in India to find local solutions to global issues.”

Notable Indian alumni from the University of Cambridge include:

  • Sir Dorabji Tata (Gonville and Caius College 1877):  played a key role in the development of the Tata Group, especially in the steel and power sectors
  • Prince Ranjitsinhji (Trinity College 1888): considered one of the greatest cricketers of all time and played for Sussex and England. In India, he did much to improve conditions in his home state of Nawanagar
  • Three Indian prime ministers studied at Cambridge: Jawaharlal Nehru (Trinity College 1907), India's first prime minister; Rajiv Gandhi (Trinity College 1961); Dr Manmohan Singh (St John's College 1955)
  • Srinivasa Ramanujan (Trinity College 1913): largely self-taught mathematics genius. He was the second Indian to be elected as a Fellow of the Royal Society
  • Harivansh Rai Bachchan (St Catharine's College 1955): Hindi poet best known for his lyric poem Madhushala
  • Jayant Narlikar (Fitzwilliam House and King's College 1957): co-developed the conformal gravity theory, commonly known as Hoyle–Narlikar theory, which synthesizes Einstein's Theory of Relativity and Mach's Principle
  • Amartya Sen (Trinity College 1957, 1998): Nobel prize-winning economist. His reputation is based on studies of famine, human development theory and welfare economics. He plays a key role in the debate on globalisation
  • Camellia Panjabi (Newnham College 1961): marketing director of Taj Hotels and co-director of Masala World, whose businesses include the Bombay Brasserie, London
  • Zia Mody (Selwyn 1976): Indian legal consultant, considered an authority on corporate merger and acquisition law
  • Karan, Lord Bilimoria (Sidney Sussex College 1985): founder of Cobra Beer and founding chairman of the Indo British Partnership Network
  • Prathiba Singh (Hughes Hall 1991): leading intellectual property lawyer

For more information on Cambridge and India, see our new India site

Today, as part of UK-India Year of Culture 2017, the University of Cambridge launches a year-long celebration of its ties with India, which stretch back 150 years.

Indian flag mosaic

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Database protecting UK expats from Brexit ‘misinformation’ to be built by Cambridge researchers


University of Cambridge researchers have set out to compile a database of communication routes that will allow UK expats residing in EU nations to receive reliable, up-to-the-minute advice throughout the negotiation process once Article 50 is triggered.

The work is part of an effort to mitigate rash Brexit-induced decisions fuelled by an information vacuum that could see thousands of over-65s in particular arriving back in the UK without necessarily having property or pensions on return.

Such a sudden reverse migration could increase pressures on already overstretched health and social care services in the UK, at a time when significant numbers of key workers in these sectors may themselves be returning to EU homelands as a result of Brexit-related insecurities. 

Researchers say that fears over future rights held by UK citizens who have settled on the continent – about everything from possible legal status and rights to work, as well as access to welfare, healthcare and pensions – could be exacerbated by misinformation resulting from rumour, speculation and tabloid bombast.

They say there is an urgent need to create a ‘one stop shop’ for trustworthy information channels that cover the various types of UK migrants currently within the remaining EU: from students and young families in the cities to retirees on Mediterranean coastlines.

The research, funded by the UK’s Economic and Social Research Council (ESRC), will take place over the next six weeks. Researchers say the final product will be shared widely with trusted parties such as government agencies, legal charities and citizen advice bureaux, but will not be released fully into the public domain for fear of exploitation by commercial and lobby organisations.

“UK citizens abroad need to be empowered to make sound, informed decisions during Brexit negotiations on whether to remain in their adopted homelands or return to the UK,” says lead researcher Dr Brendan Burchell from Cambridge’s Department of Sociology.

“However, at the moment there is a missing link: there is no database of the conduits through which high quality information can be communicated that targets specific countries or sub-groups of UK migrants. This is what we aim to build over the coming weeks.”

The team of researchers will be scouring the internet and interrogating local charities and expat organisations to compile the most comprehensive list of information channels used by UK citizens in each of the other EU27 countries. These will include legal, health, financial and property advice services, English language local newspapers, Facebook pages, blogs, chat rooms and so on.

Last year, the BBC’s ‘Reality Check’ website reported that there are around 1.2 million UK-born people living in EU nations. Over 300,000 of those live in Spain, of which one-third receive a UK state pension.

Burchell says that talk of migratory influxes into the UK has been almost entirely limited to EU nationals during the heated debates around Brexit. Little consideration has been given to returning UK nationals from EU countries such as France and Spain – many of whom are increasingly elderly baby-boomer retirees that may not have lived in the UK for a decade or more.

“Without access to well-grounded information that updates throughout the Brexit process, the current void will be increasingly filled with dangerous speculation and even so-called ‘fake news’ from partisan groups or those that would seek to prey upon the anxiety of UK over-65s to make quick money through lowball property sales or investment scams,” says Dr Burchell.

Professor Maura Sheehan, an economist from Edinburgh Napier University’s Business School who is also working on this project, believes that if panic is sparked it could lead to a domino effect in certain expatriate communities.

“Housing markets in areas along the Mediterranean coast could collapse as retirees try to sell up, but with no new UK expats looking to buy. Life savings could get swept away in the confusion,” she says. 

“Meanwhile there is no slack in UK social infrastructure for ageing expats returning en masse with expectations of support. The NHS has yet to emerge from its current crisis, there is a desperate shortage of housing, and social care is badly underfunded.

“The idea that we could see socially isolated baby-boomer expats back in the UK with health conditions, financial woes and even ending in destitution as a result of bad decisions based on misinformation should not simply be written off as so-called ‘remoaner’ hysteria.”

Anyone who would like to suggest material for the database or find out more about the project can contact the team on brexit_expat_info@magd.cam.ac.uk.

inset image by Ville Miettinen (cc: Att-SA)

Urgent requirement for channels of timely and reliable information to be developed targeting UK-born people living on the continent, say researchers – before life-changing decisions get made rashly in a milieu of rumour and speculation.

UK citizens abroad need to be empowered to make sound, informed decisions during Brexit negotiations on whether to remain in their adopted homelands or return to the UK
Brendan Burchell
Brits abroad

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Newly discovered planets could have water on their surfaces


The team has been using the TRAPPIST-South telescope at the European Southern Observatory’s (ESO) La Silla Observatory, the Very Large Telescope (VLT) at Paranal and the NASA Spitzer Space Telescope, as well as two other telescopes supported by the UK’s STFC, the William Herschel Telescope and the Liverpool Telescope. All the planets, labelled TRAPPIST-1b, c, d, e, f, g and h in order of increasing distance from their parent star, have sizes comparable to Earth.

The astronomers identified the planets thanks to periodic drops in the brightness of the central star. As the planets passed in front of the star, they blocked a small fraction of its light – events known as transits – from which the team could measure the planets’ orbital periods and calculate their sizes and masses. They found that the inner six planets were comparable in size, mass and temperature to the Earth, raising the possibility that they host liquid water on their surfaces.
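The size estimate rests on a simple relationship: the fraction of starlight blocked during a transit is roughly the square of the ratio of the planet’s radius to the star’s. Here is a back-of-the-envelope sketch using assumed round numbers, not the measured values reported in the paper.

```python
# Back-of-the-envelope illustration of how a transit depth translates into a
# planet radius: the fractional dip in starlight is roughly (R_planet / R_star)^2.
# The numbers below are assumed round values for illustration only.
import math

R_SUN_KM = 696_000
R_EARTH_KM = 6_371

r_star_km = 0.12 * R_SUN_KM    # TRAPPIST-1 is roughly 12% of the Sun's radius
transit_depth = 0.007          # an assumed ~0.7% dip in the star's brightness

r_planet_km = r_star_km * math.sqrt(transit_depth)
print(f"Implied planet radius: {r_planet_km / R_EARTH_KM:.2f} Earth radii")
```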

With just 8% the mass of the Sun, TRAPPIST-1 is very small in stellar terms – only marginally bigger than the planet Jupiter – and, though nearby in the constellation Aquarius, it is too faint to be seen with anything less than a powerful telescope. Astronomers expected that such dwarf stars might host many Earth-sized planets in tight orbits, making them promising targets in the hunt for extraterrestrial life. TRAPPIST-1 is the first such system to be discovered.

Co-author Dr Amaury Triaud, of the University of Cambridge’s Institute of Astronomy, explains: “Stars like TRAPPIST-1 belong to the most common type of stars that exist within our Galaxy. The planets that we found are likely representative of the most common sort of planets in the Universe.” He adds: “That the planets are so similar to Earth bodes well for the search for life elsewhere. Planets orbiting ultra-cool dwarfs, like TRAPPIST-1, likely represent the largest habitable real estate in the Milky Way!”

The seven planets of the TRAPPIST-1 system. Credit: ESO

The team determined that all the planets in the system were similar in size to Earth and Venus in our Solar System, or slightly smaller. The density measurements suggest that at least the innermost six are probably rocky in composition.

The planetary orbits are not much larger than those of Jupiter’s Galilean moons, and much smaller than the orbit of Mercury in the Solar System. However, TRAPPIST-1’s small size and low temperature mean that the energy input to its planets is similar to that received by the inner planets in our Solar System; TRAPPIST-1c, d and f receive similar energy inputs to Venus, Earth and Mars, respectively.
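That comparison follows from the inverse-square scaling of stellar flux with distance. The sketch below uses approximate values for TRAPPIST-1’s luminosity and orbital distances, quoted from memory for illustration only, to show how the Venus-, Earth- and Mars-like energy inputs arise.

```python
# Rough check of the comparison above: the energy a planet receives scales as
# L_star / d^2. Luminosity and orbital distances are approximate values for
# TRAPPIST-1 (for illustration only); Earth receives 1.0 on this scale.
L_STAR = 0.000524  # TRAPPIST-1 luminosity in units of the Sun's luminosity (approx.)
orbit_au = {"b": 0.011, "c": 0.015, "d": 0.021, "f": 0.037}  # approx. semi-major axes

for planet, d in orbit_au.items():
    flux_relative_to_earth = L_STAR / d ** 2
    print(f"TRAPPIST-1{planet}: ~{flux_relative_to_earth:.2f}x Earth's solar energy input")
```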

All seven planets discovered in the system could potentially have liquid water on their surfaces, though their orbital distances make some of them more likely candidates than others. Climate models suggest the innermost planets, TRAPPIST-1b, c and d, are probably too hot to support liquid water, except maybe on a small fraction of their surfaces. The orbital distance of the system’s outermost planet, TRAPPIST-1h, is unconfirmed, though it is likely to be too distant and cold to harbour liquid water — assuming no alternative heating processes are occurring. TRAPPIST-1e, f, and g, however, are of more interest for planet-hunting astronomers, as they orbit in the star’s habitable zone and could host oceans of surface water.

These new discoveries make the TRAPPIST-1 system an even more important target in the search for extra-terrestrial life. Team member Didier Queloz, from the University of Cambridge’s Cavendish Laboratory, is excited about the future possibilities: “Thanks to future facilities like ESO’s Extremely Large Telescope, or NASA/ESA’s soon-to-be-launched James Webb Space Telescope, we will be able to measure the structure of the planets’ atmospheres, as well as their chemical composition. We are about to start the remote exploration of terrestrial climates beyond our Solar System.”

The discovery is described in Nature, which also includes a science fiction short story, written by Laurence Suhner. Amaury Triaud comments: “We were thrilled at the idea of having artists be inspired by our discoveries right away. We hope this helps convey the sense of awe and excitement that we all have within the team about the TRAPPIST-1 system.”

The star draws its name from the TRAPPIST-South telescope, which made the initial discovery. TRAPPIST is the forerunner of a more ambitious facility called SPECULOOS, in which Cambridge is a core partner through researchers at the Cambridge Centre for Exoplanet Research. SPECULOOS, currently under construction at ESO’s Paranal Observatory, will survey ten times more stars for planets than TRAPPIST could, and the team expects it to detect dozens of additional terrestrial planets.

Michaël Gillon et al: “Seven temperate terrestrial planets around the nearby ultracool dwarf star TRAPPIST-1” Nature 23rd Feb. 2017

http://www.nature.com/nature/journal/v542/n7642/full/nature21360.html

Link to a science-fiction short story: http://www.nature.com/nature/journal/v542/n7642/full/542512a.html

Cambridge Exoplanet Research Centre: http://exoplanets.phy.cam.ac.uk

For additional information, images, videos, a graphic novel and short stories, visit www.trappist.one

An international team of astronomers has found a system of seven potentially habitable planets orbiting a star 39 light years away, three of which could have water on their surfaces, raising the possibility that they could host life. Using ground and space telescopes, the team identified the planets as they passed in front of the ultracool dwarf star known as TRAPPIST-1. The star is around eight per cent of the mass of the Sun and is no bigger than Jupiter.

That the planets are so similar to Earth bodes well for the search for life elsewhere
Amaury Triaud

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: Population versus targeted – which approach is best for preventing heart disease?


 

Cardiovascular disease is the number one killer worldwide and the second biggest killer in the UK. However, most cases of heart disease can be prevented by managing risk factors.

The risk of getting heart disease in people who do not already have it is easily assessed using risk scores. These risk scores use information from a combination of risk factors to calculate how likely you are to develop heart disease. If you have a high score (in the UK, a one in ten or greater chance of getting heart disease in the next ten years), your GP may recommend changes to your diet, more exercise, or medicines, such as statins, to reduce your risk.
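To make the mechanics concrete, here is a schematic sketch of how scores of this kind are typically structured: weighted risk factors feed a survival-model formula whose output is compared against the treatment threshold. The coefficients and baseline survival are invented for illustration and are not those of any clinically used score.

```python
# Schematic of a 10-year cardiovascular risk score: weighted risk factors feed
# a survival-model formula, and the result is compared with the treatment
# threshold. All weights and the baseline survival are invented for
# illustration; this is NOT a real clinical score.
import math

def ten_year_risk(age, systolic_bp, total_chol_hdl_ratio, smoker):
    # Invented weights; real scores are fitted to large cohort datasets
    linear_predictor = (0.07 * (age - 60)
                        + 0.012 * (systolic_bp - 130)
                        + 0.16 * (total_chol_hdl_ratio - 4.0)
                        + 0.55 * (1 if smoker else 0))
    baseline_survival = 0.92   # assumed 10-year survival for an average person
    return 1 - baseline_survival ** math.exp(linear_predictor)

risk = ten_year_risk(age=62, systolic_bp=145, total_chol_hdl_ratio=5.0, smoker=True)
print(f"Estimated 10-year risk: {risk:.0%}")
print("Above the 10% treatment threshold" if risk >= 0.10 else "Below the threshold")
```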

In most countries, regular assessment of heart disease risk is recommended for all people above a certain age (universal screening). In the UK, the NHS Health Check is an example of a universal screening programme which is available to all people who are 40 to 74 years old. It assesses a person’s risk of developing heart disease, stroke, diabetes and kidney disease.

Yet there is debate over whether screening for heart disease should be universal or targeted. Targeted screening involves screening specific groups of people who might be considered to be at higher risk. For example, this could include prioritising screening of people with diabetes or hypertension (known medical risk factors) or people with a high risk score based on a combination of their known risk factors.

The benefit of using statins in people who already have heart disease is widely accepted. But some healthcare experts feel that giving statins to reduce risk in healthy people could lead to “over-medicalising” the population. How does this stack up against the evidence and what is the impact of screening?

Death rates from heart disease have been falling in many countries over time, which means that, without re-calibration, heart disease risk scores often start to overestimate risk. This means that when they are routinely applied to a population, as happens in universal screening, some people who are assessed as having a high risk will not go on to develop heart disease.

The benefit of statins is widely accepted (credit: roger ashford/Shutterstock.com).

These otherwise healthy people may be prescribed medicines that are not needed, which can lead to higher healthcare costs as well as potential exposure to side effects. On the other hand, universal screening can also help identify people who go on to develop heart disease who may not have been identified through targeted approaches. Starting treatment in these people earlier can help reduce risk and ultimately can save lives or improve quality of life. Although all medicines have the risk of side effects, statins have been found to be safe and effective.

Universal screening may be more difficult to put into practice compared with targeted screening as it requires high levels of support, funding, awareness, uptake and monitoring. It can also be difficult to encourage healthy people to go to their doctor for screening, so universal screening will never reach the whole population. Between 2009 and 2013, just 12.8% of people who were eligible had an NHS Health Check, lower than the expected coverage of 30%. Targeted screening is also more cost-effective for heart disease risk assessment than universal screening.

A Goldilocks approach

Is there a happy medium that balances the pros and cons of universal and targeted screening? In addition to regular heart disease checks for all people aged over 40, guidelines issued by the National Institute for Health and Care Excellence (NICE), the main institute that provides guidance on health issues in the UK, also recommend that information on risk factors in electronic health records is used by GPs to prioritise who should be invited for heart disease risk assessment.

Applying this targeted approach in a systematic and routine way is currently limited due to issues with missing information and poor capturing of some risk factors in health records. But improvements in how missing information is dealt with in risk scores and better recording of risk factors will help make this a reality in the near future.

Given the low coverage of the NHS Health Checks, combining this universal approach with targeted screening using information already recorded in electronic health records could provide the best opportunity for preventing heart disease and saving lives.

Ellie Paige, Research Associate in Epidemiology, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Should screening for heart disease be universal or targeted to those at greatest risk? Ellie Paige (Department of Public Health and Primary Care) weighs up the evidence for The Conversation.

Fortunes of the Heart

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: How years of IMF prescriptions have hurt West African health systems


The International Monetary Fund (IMF) provides financial assistance to countries in economic trouble. But its policy proposals don’t always yield positive results for the countries it purports to help. For instance, critics have argued that the IMF inhibits government spending on public health and diverts resources from the health sector to repay external debt.

We set out to examine how IMF policy reforms affect government health systems in West Africa.

IMF policies have real consequences for real people. Our research showed that in West Africa the IMF has exerted a unique influence on the evolution of health systems in a number of countries. Among them are Benin, Burkina Faso, Cote d'Ivoire, Gambia, Ghana, Guinea-Bissau, Liberia, Mali, Niger, Nigeria, Senegal, Sierra Leone and Togo. These 13 countries have a combined population of more than 330 million.

It has done so through its trademark practice of “conditionality”. In exchange for loans, the IMF requires governments to adopt policies that prioritise short-term economic objectives over, for example, long-term investment in health systems.

West African health systems were weak thanks to legacies of conflict and weak state capacity even before the IMF got involved. Sadly, the policy reforms demanded by the IMF over the past two decades in exchange for loans have undermined the ability of national governments to repair their historical problems. In the process, hundreds of millions of lives have been affected.

Specifically, the IMF’s fiscal targets prompt reductions in health investment. Wage and personnel caps, for example, limit the ability of clinics and hospitals to employ more doctors and nurses. The IMF also encourages decentralisation of health services to make them responsive to local needs, which in practice can hamper the delivery of adequate health care.

Our research contributes to decades-old debates about the harmful effects of the IMF’s lending programmes on the development of public health systems. It shows that these concerns still hold today. The research also suggests that the IMF’s self-proclaimed prioritisation of health in recent years has been largely cosmetic.

West African health systems and the IMF

We searched archival material to conduct our research. This included IMF staff reports, government policy memoranda and correspondence between the IMF and national governments.

Strengthening public health care systems is central to achieving universal health coverage. This is a key objective of the United Nations’ Sustainable Development Goals.

West African countries have consistently lagged behind most other regions in the world when it comes to health system capacity. The region is home to nine of the 20 lowest-ranked countries on the UN’s Human Development Index.

Infant mortality rates are also among the highest worldwide, with a regional average of 57.8 deaths per 1,000 live births in 2015. Public health spending also remains woefully inadequate, at a regional average of 2.4% of GDP in 2014.

Many attempts have been made to explain West Africa’s inadequate health systems. These include domestic factors, like legacies of conflict and weak state capacity. The failings of key intergovernmental organisations like the World Health Organisation (WHO) have also been blamed.

There’s no doubt that West African health systems were broken before IMF conditionality. But in the last 20 years, it is the IMF that has set the fiscal and institutional parameters within which health policies can develop. These did not repair the problems that already existed. They may even have exacerbated some.

The IMF’s presence in West Africa has been a source of controversy among public health practitioners since the Ebola crisis of 2014. The IMF was found to have contributed to the failure of health systems to develop, exacerbating the Ebola crisis.

Its critics complain that the IMF is responsible for designing inappropriate or dogmatic policies that undermine the development of health systems. But the organisation has argued that its reforms actually bolster health policy.

Our research suggests that this is not the case. The IMF’s policy reforms are actually hampering the development of West Africa’s health systems.

Linking IMF conditions to health systems

First, macroeconomic targets set by the IMF reduced fiscal space for health investment. The IMF has promoted social protection policies as part of its lending programmes. But these have been inadequately incorporated into programme design.

For example in 2005, when Malian government expenditure on health reached 3.0% of GDP, IMF staff encouraged authorities to reduce spending. They were concerned that “financing substantial increases of education and health sector wages … might eventually prove unsustainable”. In Benin, authorities cut spending on health in 2005 to “ensure achievement of the main fiscal objectives” of the IMF.

Second, conditions stipulating wage and personnel caps limited the recruitment of doctors and nurses. An example is a series of IMF conditions aimed at reducing Ghana’s public sector wage bill in 2005. The Ghanaian Minister of Finance wrote to the IMF that “at the current level of remuneration, the civil service is losing highly productive employees, particularly in the health sector”. Wage ceilings remained until late 2006, and the number of physicians in Ghana halved.

Third, administrative reforms prevented adequate delivery of health care. For example, following IMF advice, Guinean authorities devolved budgetary responsibilities from the central government to the prefectural, or regional, level in the early 2000s. Five years later, an IMF mission to the country reported “governance problems” that included “insufficient and ineffective decentralisation”, while also noting deterioration in the quality of health service delivery.

Neo-colonialism and policy space for health

How can the role of the IMF in influencing health policy in West Africa be explained? The organisation has long been regarded as a tool of the Western economic powers, primarily the US and Europe. The former imperial powers continue to use the IMF to promote a neoliberal agenda across the world.

As part of this neocolonial mission, the IMF has re-engineered the economic and political dimensions of sub-Saharan African countries via intrusive conditions. West Africa stands out as the region that had to implement a large share of such reforms over the past 20 years.

A country that receives an IMF loan typically experiences economic troubles. But even under constraining economic conditions, policy options remain. The question is who gets to define these policy options: the countries themselves, following domestic political processes, or the IMF?

The IMF has deprived West African nations of the policy space to adapt to local exigencies, undermining the delivery of effective health systems. Yet, domestic governments are equipped with local knowledge and are better informed on how crises are unfolding on the ground.

The IMF is headquartered in Washington DC. It is largely staffed with Anglo-Saxon economists who are tasked with leading responses to unfamiliar environments in faraway places. It is unsurprising that the organisation’s responses are so out of touch.


Thomas Stubbs, Research associate, University of Cambridge and Alexander E. Kentikelenis, Research fellow in politics and sociology, University of Oxford

This article was originally published on The Conversation. Read the original article.

International Monetary Fund policies can have a real impact on people – and don’t always yield positive results. Writing for The Conversation, Thomas Stubbs (University of Cambridge) and Alexander E. Kentikelenis (University of Oxford) explore the impact its policies have had on health in West Africa.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


New study identifies possible early warning signs of Huntington’s disease


Researchers from the University of Cambridge and the University of Surrey have identified early biomarkers of Huntington’s disease in sheep carrying the disease-causing gene while the animals are still at a pre-symptomatic stage.

Up until this point, the five-year-old sheep had displayed no signs of the illness, but the comprehensive study identified clear metabolic changes in the animals carrying the genetic variant. These new findings reveal that Huntington’s disease affects important metabolic processes in the body prior to the appearance of physical symptoms.

Huntington’s disease affects more than 6,700 people in the UK. It is an incurable neurodegenerative disease: patients typically die 10-25 years after diagnosis.

The disease is caused by a mutation in the huntingtin gene. Genetic information is coded in DNA, which is made up of a repeated string of four molecules known as nucleotides, or bases – A, C, G and T. Changes in the genetic code of the huntingtin gene lead directly to disease. The gene contains a repeated string of CAG bases: in healthy individuals, the CAG repeat is around 20 CAGs long, but if the repeat has 36 or more CAGs, an individual will develop Huntington’s disease. The sheep model of Huntington’s disease, which carries a CAG repeat in the disease-causing range, has been developed to increase knowledge about the condition.
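To make the repeat-counting idea concrete, here is a toy sketch of counting CAG units in a DNA string and comparing the total with the 36-repeat threshold quoted above. It is purely illustrative: real genotyping of the huntingtin repeat uses laboratory assays rather than text processing, and the example sequence is invented.

def count_cag_repeats(sequence):
    """Count uninterrupted CAG codons at the start of a DNA string."""
    count = 0
    while sequence[count * 3:count * 3 + 3] == "CAG":
        count += 1
    return count

# Invented example: a tract of 42 CAG units followed by other sequence
tract = "CAG" * 42 + "CAACAGCCGCCA"
repeats = count_cag_repeats(tract)
status = "disease-causing range" if repeats >= 36 else "typical range"
print(repeats, "repeats:", status)   # 42 repeats: disease-causing range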

During this study, researchers took blood samples from the normal and Huntington’s disease animals every two hours over a 24-hour period and assessed their metabolic profiles using a targeted metabolomics approach established at the University of Surrey. Unlike previous research in this area, which was affected by external environmental factors that distorted metabolic profiling, sheep in this study were monitored in a well-controlled setting, negating any outside influences.

Blood measurements found startling differences in the biochemistry of the sheep carrying the disease-causing variant, compared to the normal sheep. Significant changes were observed in 89 of the 130 metabolites measured in their blood, with increased levels of the amino acids arginine and citrulline, and decreases in sphingolipids and fatty acids that are commonly found in brain and nervous tissue.

The alterations in these metabolites, which include key components of the urea cycle and nitric oxide pathways (both vital body processes), suggest that both of these processes are dysregulated in the early stages of Huntington’s disease, and that the illness affects the body long before physical symptoms appear.

The identification of these biomarkers may help to track disease in pre-symptomatic patients, and could help researchers develop strategies to remedy the biochemical abnormalities.

Professor Debra Skene from the University of Surrey said: “Metabolic profiling has revealed novel biomarkers that will be useful to monitor Huntington’s disease progression.

“Our research shows that this disease affects the body in a number of ways before the tell-tale signs of Huntington’s disease become visible.”

Professor Jenny Morton from the University of Cambridge said: “Despite its devastating impacts on patients and their families, there are currently limited treatment options, and no cure for Huntington’s disease. The development of objective and reliable biomarkers that can be rapidly measured from blood samples becomes immeasurably important once clinical trials for therapies begin.

“The more we learn about this devastating illness the better chance we have of finding a cure.”

The research was funded by the CHDI Foundation and the Biotechnology and Biological Sciences Research Council.

Reference
Skene DJ et al. Metabolic profiling of presymptomatic Huntington’s disease sheep reveals novel biomarkers. Scientific Reports; 22 Feb 2017; DOI: 10.1038/srep43030

Adapted from a press release by the University of Surrey.

Early warning signs of Huntington’s disease have been uncovered in a sheep carrying the human disease-causing genetic variant, providing new insights into this devastating illness, a new study in Scientific Reports has found.

Despite its devastating impacts on patients and their families, there are currently limited treatment options, and no cure for Huntington’s disease
Jenny Morton
Sheep

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


£10m funding for advanced materials research awarded to the University of Cambridge

preparing materials

The new funding is part of a £128 million Engineering and Physical Sciences Research Council (EPSRC) investment in the Sir Henry Royce Institute for Advanced Materials, which comprises seven partner universities including Manchester, Oxford and Imperial College London.

Cambridge’s award will enable the University to purchase additional equipment to support its leadership of the Royce Institute’s Materials for Energy Efficient Information and Communications Technology initiative. This will focus on improving energy storage technologies, reducing power consumption and developing new materials and devices able to harness energy from the environment.

The new equipment will enable Cambridge researchers to fabricate new energy-efficient devices, such as batteries and solar cells, and to undertake the advanced characterisation of materials and machines. These techniques will, in turn, help to hasten the development of energy technologies that are safer and more efficient, including longer-life phone batteries and electric cars with extended ranges.

Much of the Royce equipment will be housed within the Maxwell Centre at the Cavendish Laboratory (Department of Physics), which is famous for the discovery of the structure of DNA by Crick and Watson and brings together researchers from Engineering, Materials Science and Chemistry.

Professor Sir Richard Friend, Director of the Maxwell Centre and Cambridge’s Cavendish Professor of Physics, welcomed the announcement, pointing to the support it would offer researchers in co-ordinating work across the University’s departments, maximising the opportunities for multidisciplinary collaboration.

Professor Friend added: “This funding will be vitally important in terms of enabling what we do with advanced materials to be enhanced both in terms of upstream university work but also in its industrial application.”

The EPSRC funding will be distributed across the Institute’s seven partners to support investments in new equipment and infrastructure. In turn, these new facilities will enable the Institute to accelerate the design of advanced materials and explore their possible applications, including their use in existing and emerging industrial sectors within the UK.

Focused on promoting translation from discovery to application, the Royce Institute will play a major role in driving forward key elements of the Government’s industrial strategy, which places particular emphasis on enhancing the commercialisation of the UK’s world-leading basic research.

Research into improving energy storage, reducing power consumption and developing new energy-efficient devices received a boost with the announcement of £10m funding for new equipment at the University of Cambridge.

This funding will be vitally important in terms of enabling what we do with advanced materials to be enhanced both in terms of upstream university work but also in its industrial application
Professor Sir Richard Friend

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cambridge to partner in major new research centre aimed at tackling challenges in health and life sciences


The institute is named in honour of the pioneering British scientist whose use of X-rays to study biological structures played a crucial role in the discovery of DNA's 'double helix' structure by Francis Crick and James Watson. It will bring together UK strengths in the physical sciences, engineering and life sciences to create a national centre of excellence in technology development and innovation.

The institute is part of the government's Industrial Strategy to maintain the UK's global leadership in science, innovation and research and will have a hub based at the Harwell campus, outside Oxford. It will bring together UK expertise to develop new technologies that will transform our understanding of disease and speed up the development of new treatments.

Business Secretary Greg Clark said: “The UK has always been a pioneer in the world of science, technology and medical research. It's this excellence we want to continue to build on and why we made science and research a central part of our Industrial Strategy - strengthening links between research and industry, ensuring more home-grown innovation continues to benefit millions around the world.

“Named after one of the UK's leading chemists, the new Rosalind Franklin Institute will inspire and house scientists who could be responsible for the next great discovery that will maintain the UK's position at the forefront of global science for years to come.”

Delivered and managed by the Engineering and Physical Sciences Research Council (EPSRC), the Rosalind Franklin Institute will bring together academic and industry researchers from across the UK to develop disruptive new technologies designed to tackle major challenges in health and life sciences, accelerate the discovery of new treatments for chronic diseases affecting millions of people around the world (such as dementia), and deliver new jobs and long-term growth to the local and UK economies.

Chair of the Research Councils and EPSRC Chief Executive, Professor Philip Nelson said: “The UK is currently in a world leading position when it comes to developing new medical treatments and technologies in the life sciences. However, other countries are alive to the potential and are already investing heavily. The Rosalind Franklin Institute will help secure the country as one of the best places in the world to research, discover, and innovate.”

The central hub at Harwell will link to partner sites at the universities of Cambridge, Edinburgh, Manchester and Oxford, Imperial College, King's College London, and University College London. Industry partners will be on board from the outset, and the Institute will grow over time, as more universities and researchers participate.

The work at the new Institute will contribute directly to the delivery of EPSRC's 'Healthy Nation' prosperity outcome, its Healthcare Technologies programme, and to the Technology Touching Life initiative that spans three research councils (the Biotechnology and Biological Sciences Research Council (BBSRC), the Medical Research Council (MRC) and EPSRC) and seeks to foster interdisciplinary technology development research across the engineering, physical and life sciences.

Patrick Vallance, President of R&D at GSK said: “We welcome the creation of the RFI which will bring world-leading, multi-disciplinary teams from industry and academia closer together, and will further strengthen the UK as a place to translate excellent science into patient benefit. Through collaboration we will be able to make advances in life science technologies much quicker than we could manage alone.”

Research at the RFI will initially be centred on five selected technology themes, focusing on next-generation imaging technologies - X-ray science, correlated imaging (combining X-ray, electron and light microscopy), imaging by sound and light, and biological mass spectrometry - and on new chemical methods and strategies for drug discovery.

Adapted from a press release by the EPSRC

The University of Cambridge is to partner in the new Rosalind Franklin Institute, a £100 million multi-disciplinary science and technology research centre announced by Business Secretary Greg Clark.

Harwell campus

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


The Monuments Men of Libya


True heroes, generous hearts: these are the Libyan archaeologists who, with Daesh at their heels, have accomplished the feat of completing the excavation of the Haua Fteah cave in Cyrenaica, one of the most important prehistoric sites in all of Africa. When, in 2013, the team of international researchers involved at the site was forced to suspend work, nine Libyan archaeologists - two women and seven men - successfully completed the excavation by themselves, securing its secrets for posterity. Thus the history of human population along the North African coast over the last 100,000 years can now be written.

Haua Fteah is, in fact, the largest karst cave in the Mediterranean (measuring 80 x 20 metres) and is open to the sea a short distance from the city of Susa, the ancient Apollonia. It is a sort of natural hangar, inhabited uninterruptedly by humans from prehistoric times until the present. First investigated between 1951 and 1955 by Charles McBurney, an archaeologist from the University of Cambridge, the cave became the focus of renewed research in 2007 under the direction of Professor Graeme Barker, in collaboration with the Libyan Department of Antiquities and an international team of scholars, including myself.

Beginning at the earliest levels, at about 15 metres below the current surface and relating to the Middle Palaeolithic, the cave takes us on a breathtaking journey through time. We first move through the levels dating back to 70,000 years ago where the only human remains found so far at the site - two fragments of Homo sapiens lower jaw - were uncovered: a moving testimony to the arrival of our ancestors along the North African coast. We can then examine the layers of the Upper Palaeolithic on our way to the Neolithic, when the first species of domesticated animals and plants of the Levantine regions made their appearance in North Africa. Continuing our journey towards the surface, through layers dating from the Classical Period and thence more recent ones, we arrive at the present day. Like a wonderful freeze frame which has lasted thousands of years, the cave is still in use to this very day - as a livestock shelter - by families of shepherds. It is greatly respected by the local population.

Despite the instability that has followed the collapse, in 2011, of the Gaddafi regime in Libya, we continued to work at Haua Fteah - albeit with considerable difficulty - until September 2012 when the US Ambassador to Libya, Chris Stevens, was assassinated in Benghazi. The terrible news reached us while we were actually digging in the cave. We were struck by a deep sense of loss, but also with concern for our own safety. We left Libya a few days later and, after months of indecision and another very brief campaign in 2013, Graeme Barker reluctantly decided to suspend the excavation. Daesh had definitively taken control of the city of Derna, just 60 kilometers east of Haua Fteah, and the risk to our safety was really too great.

However, our Libyan colleagues continued to monitor the massive open trench, and a short while later they informed us that its walls, exposed since 2007, were not going to last for long. To safeguard and bring to a conclusion the work of years, the excavation had to be completed as soon as possible. “We can do it ourselves,” said Ahmad Saad Emrage, archaeologist at the University of Benghazi. “We can still work safely enough. We will be accurate and fast.” So, without any delay, the ‘command’ of operations fell to Ahmad and his team of local archaeologists: Fadl Abdulazeez, Akram Alwarfalli, Moataaz Azwai, Saad Buyadem, Badr Shamata, Asma Sulaiman, Reema Sulaiman and Aiman Alareefi.

Who are these ‘Fantastic 9’? They are, first and foremost, passionate archaeologists and serious professionals. Ahmad and Fadl are the ‘fathers’ of the group, always ready to guide and encourage the younger ones; then there is Moataz, the tireless ‘gentle giant’; then two young daredevils, Akram and Saad, who, after a day of excavation, love to dive from the beautiful cliffs of Lathrun; also Badr and Aiman, who make sure the rest of the troop always has tea in their cups and sheesha to puff on; and then the two sisters, Aasma and Reema, who have wills of iron. Nine different individuals, nine different histories, united by an immense passion for their homeland, Libya, and by a single unwavering desire: to save their country and its history.

The first excavation campaign began on May 9th 2015 and was supposed to last for two months, but it was suspended after only four weeks. Ahmad told me that “after work started with no particular problem, the situation had rapidly deteriorated. Local sources had reported to us that Daesh militants had recently been seen in the Susa area. When passing through the town we would often hear gunfire and screams. We were afraid, but we did not want to stop. During the raids against the Daesh positions in Derna, Libyan air force planes and helicopters flew over the cave. We were by no means certain that they were all aware of our presence in the area, and, for fear of being mistaken for terrorists, we would run to take shelter in the back of the cave every time we heard the noise of an approaching aircraft. I remember one day when I was carrying the long plastic tube we used to store the stratigraphy drawings of the excavation over my shoulder; on hearing the sound of an approaching helicopter, Fadl grabbed the tube and threw it away from us, for fear it could be mistaken for a rocket launcher. We were extremely tense, and when the helicopter finally moved away, we looked at each other and burst into laughter.”

“We were increasingly afraid but we continued to work. One day, however, a friend came running, shouting that the night before he had seen masked men in the vicinity of the cave, almost certainly Daesh militiamen. And shortly afterwards the Susa police arrived and forced us to stop work. It was not easy to convince the boys that we could not go on. ‘We can still do it - they kept repeating – we’ll be even more careful and fast’.  Reluctantly, however, we collected the equipment and left the cave.”

But that was not the end by any means. Two months later, thanks to the liberation of Derna from Daesh militia, the 'Fantastic 9' returned to the cave and finally managed to complete the excavation. “Do not call us heroes,” Ahmad exclaimed when I told him that I would recount their adventure. “We just did what had to be done, as archaeologists and as Libyans.” However, in a country like Libya that sees its archaeological heritage so dramatically at risk, our colleagues’ achievement was exceptional in its significance: it showed that the Libyans have not given up, that they wish to reclaim their own cultural heritage and determine its fate themselves.

Inset images: Top: the Haua Fteah trench (credit: Cyrenaica Prehistory Project). Bottom: Saad Buyadem and Fadl Abdulazeez (credit: Cyrenaica Prehistory Project).

Dr Giulio Lucarini is a Leverhulme Research Fellow at the McDonald Institute for Archaeological Research in Cambridge. Excavation at Haua Fteah has been principally funded by the European Research Council, with supplementary funding from the Society of Libyan Studies, the project’s sponsor.

This article was first published in Archeostorie. Journal of Public Archaeology.

With Daesh militia at their heels, a handful of brave Libyan archaeologists completed the excavation of the Haua Fteah cave in Cyrenaica, North Africa. Cambridge archaeologist Dr Giulio Lucarini tells their story.

“We can do it ourselves,” said Ahmad Saad Emrage, archaeologist at the University of Benghazi. “We can still work safely enough. We will be accurate and fast.”
Members of the project at the end of the 2012 season

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: Robots and AI could soon have feelings, hopes and rights … we must prepare for the reckoning


Get used to hearing a lot more about artificial intelligence. Even if you discount the utopian and dystopian hyperbole, the 21st century will broadly be defined not just by advancements in artificial intelligence, robotics, computing and cognitive neuroscience, but also by how we manage them. For some, the question of whether or not the human race will live to see a 22nd century turns upon this latter consideration. While forecasting the imminence of an AI-centric future remains a matter of intense debate, we will need to come to terms with it. For now, there are many more questions than answers.

It is clear, however, that the European Parliament is making inroads towards taking an AI-centric future seriously. Last month, in a 17-2 vote, the parliament’s legal affairs committee voted to begin drafting a set of regulations to govern the development and use of artificial intelligence and robotics. Included in this draft proposal is preliminary guidance on what it calls “electronic personhood” that would ensure corresponding rights and obligations for the most sophisticated AI. This is a start, but nothing more than that.

If you caught any of the debate on the issue of “electronic” or “robot” personhood, you probably understand how murky the issues are, and how visceral reactions to it can be. If you have not caught any of it, now is a good time to start paying attention.

The idea of robot personhood is similar to the concept of corporate personhood that allows companies to take part in legal cases as both claimant and respondent – that is, to sue and be sued. The report identifies a number of areas for potential oversight, such as the formation of a European agency for AI and robotics, a legal definition of “smart autonomous robots”, a registration system for the most advanced ones, and a mandatory insurance scheme for companies to cover damage and harm caused by robots.

The report also addresses the possibility that both AI and robotics will play a central role in catalysing massive job losses and calls for a “serious” assessment of the feasibility of universal basic income as a strategy to minimise the economic effects of mass automation of entire economic sectors.

We, Robots

As daunting as these challenges are – and they are certainly not made any more palatable given the increasingly woeful state of geopolitics – lawmakers, politicians and courts are only beginning to skim the surface of what sort of problems, and indeed opportunities, artificial intelligence and robotics pose. Yes, driverless cars are problematic, but only in a world where traditional cars exist. Get them off the road, and a city, state, nation, or continent populated exclusively by driverless cars is essentially a really, really elaborate railway signalling network.

Artificial minds will need very real rights (credit: Shutterstock).


I cannot here critique the feasibility of things such as general artificial intelligence, or even the Pandora’s Box that is Whole Brain Emulation, whereby an artificial, software-based copy of a human brain is made that functions and behaves identically to the biological one. So let’s just assume their technical feasibility and imagine a world where both bespoke sentient robots and robotic versions of ourselves imbued with perfect digital copies of our brains go to work and “Netflix and chill” with us.

It goes without saying that the very notion of making separate, transferable, editable copies of human beings embodied in robotic form poses both conceptual and practical legal challenges. For instance, basic principles of contract law would need to be updated to accommodate contracts where one of the parties existed as a digital copy of a biological human.

Would a contract in Jane Smith’s name, for example, apply to both the biological Jane Smith and her copy? On what basis should it, or should it not? The same question would also need to be asked in regard to marriages, parentage, economic and property rights, and so forth. If a “robot” copy was actually an embodied version of a biological consciousness that had all the same experiences, feelings, hopes, dreams, frailties and fears as their originator, on what basis would we deny that copy rights if we referred to existing human rights regimes? This sounds like absurdity, but it is nonetheless an absurdity that may soon be reality, and that means we cannot afford to laugh it off or overlook it.

There is also the question of what fundamental rights a copy of a biological original should have. For example, how should democratic votes be allocated when copying people’s identities into artificial bodies or machines becomes so cheap that an extreme form of “ballot box stuffing” – by making identical copies of the same voter – becomes a real possibility?

Should each copy be afforded their own vote, or a fractional portion determined by the number of copies that exist of a given person? If a robot is the property of its “owner” should they have any greater moral claim to a vote than say, your cat? Would rights be transferable to back-up copies in the event of the biological original’s death? What about when copying becomes so cheap, quick, and efficient that entire voter bases could be created at the whim of deep-pocketed political candidates, each with their own moral claim to a democratic vote?

How do you feel about a voter base composed of one million robotic copies of Milo Yiannopoulos? Remember all that discussion in the US about phantom voter fraud? Well, imagine that on steroids. What sort of democratic interests would non-biological persons have, given that they would likely not be susceptible to ageing, infirmity, or death? Good luck sleeping tonight.

Deep thoughts

These are incredibly fascinating things to speculate on and will certainly lead to major social, legal, political, economic and philosophical changes should they become live issues. But it is because they are increasingly likely to be live issues that we should begin thinking more deeply about AI and robotics than just driverless cars and jobs. If you take any liberal human rights regime at face value, you’re almost certainly led to the conclusion that, yes, sophisticated AIs should be granted human rights if we take a strict interpretation of the conceptual and philosophical foundations on which they rest.

Who will win the AI vote? (credit: Shutterstock)


Why then is it so hard to accept this conclusion? What is it about it that makes so many feel uneasy, uncomfortable or threatened? Humans have enjoyed an exclusive claim to biological intelligence, and we use ourselves as the benchmark against which all other intelligence should be judged. At one level, people feel uneasy about the idea of robotic personhood because granting rights to non-biological persons means that we as humans would become a whole lot less special.

Indeed, our most deeply ingrained religious and philosophical traditions revolve around the very idea that we are in fact beautiful and unique snowflakes imbued with the spark of life and abilities that allow us to transcend other species. That’s understandable, even if you could find any number of ways to take issue with it.

At another level, the idea of robot personhood – particularly as it relates to the example of voting – makes us uneasy because it leads us to question the resilience and applicability of our most sacrosanct values. This is particularly true in a time of “fake news”, “alternative facts”, and the gradual erosion of the once proud edifice of the liberal democratic state. With each new advancement in AI and robotics, we are brought closer to a reckoning not just with ourselves, but over whether our laws, legal concepts, and the historical, cultural, social and economic foundations on which they are premised are truly suited to addressing the world as it will be, not as it once was.

The choices and actions we take today in relation to AI and robotics have path-dependent implications for what we can choose to do tomorrow. It is incumbent upon all of us to engage with what is going on, to understand its implications and to begin to reflect on whether efforts such as the European Parliament’s are nothing more than pouring new wine into old wine skins. There is no science of futurology, but we can better see the future and understand where we might end up in it by focusing more intently on the present and the decisions we have made as society when it comes to technology.

When you do that, you realise we as a society have made no real democratic decisions about technology; we have more or less been forced to accept that certain things enter our world and that we must learn to harness their benefits or get left behind and, of course, deal with their fallout. Perhaps the first step, then, is not to take laws and policy proposals as the jumping-off point for how to “deal” with AI, but instead to start thinking more about correcting the democratic deficit and asking whether we as a society, or indeed a planet, really want to inherit the future Silicon Valley and others want for us.

To hear more about the future of AI and whether robots will take our jobs, listen to episode 10 of The Conversation’s monthly podcast, The Anthill – which is all about the future.

Christopher Markou, PhD Candidate, Faculty of Law, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Is artificial intelligence a benign and liberating influence on our lives – or should we fear an impending rise of the machines? And what rights should robots share with humans? Christopher Markou, a PhD candidate at the Faculty of Law, suggests an urgent need to start considering the answers.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: Want to eradicate viruses? They made us who we are


It is cold and flu season, so many of us are currently under the weather with a virus. But what exactly is a virus? And are they even alive?

Outside a host cell, these weird microscopic particles, or virions, only consist of a tiny piece of genetic information (about 10,000 times less than that contained in the human genome) and a protein or lipid (fatty molecule) shell. Whether these particles are living things is the subject of much debate, as they don’t meet many of the usual criteria for life.

While there isn’t any formal agreement on what defines life, most definitions include the ability to adapt to the environment, to reproduce, to respond to stimuli, and to use energy.

While the virus particle may fall short of the definition of life depending on the criteria used, for some virologists like myself, thinking of the virion as the “virus” is like calling a sperm or unfertilised egg a “person”. Sure, a sperm is an essential step towards creating a person, but few people would argue that a sperm or unfertilised egg should be described as the finished product.

Flu: part virus, part human (credit: Shutterstock).


Much like a sperm, virions are produced in the millions. Many will never reach their destination and are lost and degrade in the environment. It is only when the virus binds to and enters a target cell that its cycle of replication can begin.

A virion doesn’t even always contain a majority of the molecules a virus can create. For example, the norovirus virion contains just three different types of protein and one type of RNA (a nucleic acid like DNA which uses a different sugar to form its backbone). Infected cells, however, make at least eight different viral proteins and four different viral RNAs.

Nor does the virus particle itself usually result in the symptoms of disease. Typically, when you catch a virus, your symptoms come from either infected cells dying, or your immune response to those infected cells.

For these reasons, some virologists consider the infected cell, rather than the virion, to be the virus.

I am virus

While this idea sounds outlandish, from conception to grave, your cells are intricately associated with viruses. Even if you don’t have a cold or the flu, you are still part-virus as human DNA plays host to a range of different viruses.

These are retroviruses, the best-known example of which is HIV. While HIV only entered the human population relatively recently, viruses very much like it have been infecting us and the creatures we evolved from since long before humans even existed.

While HIV infects immune cells, when a retrovirus instead infects the cells that produce eggs or sperm, the viral DNA can be inherited by any offspring. Over millions of years, these viruses have lost their ability to produce infectious particles, but have in some cases found other vital roles, and are now indispensable for human life.

One well-studied example is a protein called Syncytin-1, which is vital for the development of the placenta. This was originally a retroviral protein that entered the monkey population which gave rise to humans around 24 million years ago. If we deleted this protein from our DNA, humanity would rapidly go extinct as we could no longer produce a functional placenta.

Transplanting pig organs into humans carries a risk of viral infection (credit: Shutterstock).


All these viruses which inserted into our DNA long ago are termed endogenous retroviruses (ERVs). In humans, ERVs have long since lost the ability to produce infectious virions, but this is not the case in all animals. Pig ERVs, for example, can produce infectious particles and are a concern when considering the use of pig organs for transplant, as these are known to be able to infect human cells in the lab.

Blurred lines

If a virus is the infected cell, rather than the virion, you could even think of the viruses that can infect us as more than 99.9% human. This is because they need many of the human proteins or other molecules present in your cells and encoded in your DNA to make more virus.

A human cell is vastly more complex than even the largest virus, and viruses can make use of this to compensate for their own simplicity. Viruses and their host cells share many common needs. They need to be able to produce RNA, protein, lipids and have access to the raw materials to generate these. As a host cell already contains all the needed components to achieve this, a virus can simply provide its own instructions, in the form of the viral genome, and let the cell do most of the work.

It takes many more cellular proteins to make a virus than it does viral proteins. A virus only needs to provide instructions for the few components the host cell cannot produce. An example of this would be viruses which have a virion with a lipid membrane, such as influenza. This membrane is usually recycled from host cell membranes. The addition of a couple of viral proteins converts this into the membrane coat of the virion.

This use of host components by viruses also makes it clear why it has been so difficult to develop effective antiviral drugs. Much as with cancer treatment, there is very little to distinguish infected cells from normal human cells, which makes coming up with a drug that will only target infected cells extremely challenging. To be effective, you have to target that tiny part of the infected cell that is purely virus, without harming the remainder.

So are viruses alive? It’s still not settled, and really depends on what you think a virus is. What does seem clear, however, is that the viruses which infect us can be seen as part human, and we are part virus.

Edward Emmott, Research Associate in Virology, University of Cambridge

This article was originally published on The Conversation. Read the original article.

We are still part-virus, writes Edward Emmott, Research Associate in Virology, for The Conversation. Human DNA plays host to a range of different viruses. And this could help explain why it has been so difficult to develop effective antiviral drugs.

HIV-infected T cell

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Back to the future of skyscraper design


Newly published, The Recovery of Natural Environments in Architecture by Professor Alan Short is the culmination of 30 years’ research and award-winning green building design by Short and colleagues in Architecture, Engineering, Applied Maths and Earth Sciences at the University of Cambridge.

“The crisis in building design is already here,” said Short. “Policy makers think you can solve energy and building problems with gadgets. You can’t. As global temperatures continue to rise, we are going to continue to squander more and more energy on keeping our buildings mechanically cool until we have run out of capacity.”

Short is calling for a sweeping reinvention of how skyscrapers and major public architecture are designed – to end the reliance on sealed buildings which exist solely via the ‘life support’ system of vast air conditioning units.

Instead, he shows it is entirely possible to accommodate natural ventilation and cooling in large buildings by looking into the past, before the widespread introduction of air conditioning systems which were ‘relentlessly and aggressively promoted’ by inventor Willis Carrier and rival entrepreneurs.

“The majority of contemporary buildings have absolutely no resilience to climate at all,” he added. “To make them habitable, you have to seal them and air condition them. The energy use and carbon emissions this generates are spectacular and to a large extent unnecessary. Buildings in the West account for 40-50% of electricity usage, generating substantial carbon emissions. The rest of the world is catching up at a frightening rate, China at 31% and rising in 2017.

“Modern buildings cannot survive unless hard-wired to a life support machine, yet this fetish for glass, steel and air-conditioned skyscrapers continues; they are symbols of status around the world on an increasingly vast scale.”

Short’s book highlights a developing and sophisticated art and science of ventilating buildings through the 19th and earlier 20th centuries, including the two chambers of the Houses of Parliament, and the design of ingeniously ventilated hospitals. Of particular interest were those built under the aegis of John Shaw Billings, designer of the first Johns Hopkins Hospital in Baltimore (1873-1889).

“We spent three years digitally modelling Billings' final designs and a brilliant alternative design,” added Short. “We put pathogens in the airstreams, modelled for someone with TB coughing in the wards and we found the ventilation systems in the room would have kept patients safe from harm.

“We discovered that nineteenth century hospital wards could generate up to 24 air changes an hour – that’s similar to the performance of a modern-day, computer-controlled operating theatre. We believe you could build wards based on these principles for the NHS now. Single rooms are not appropriate for all patients. Communal wards appropriate for certain patients – older people with dementia, for example – would work just as well in today’s hospitals, at a fraction of the energy cost.”
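For readers unfamiliar with the metric, air changes per hour is simply the volume of fresh air moved through a space in an hour divided by the volume of that space. The sketch below is purely illustrative (the ward dimensions and airflow figure are invented); it only shows how the 24-air-changes figure relates to the underlying quantities.

# Illustrative only: the standard definition ACH = hourly airflow / room volume,
# applied to an invented ward geometry and airflow figure.
ward_volume_m3 = 30 * 8 * 4        # a notional 30 m x 8 m ward with 4 m ceilings: 960 m^3
airflow_m3_per_hour = 23_000       # notional fresh-air supply delivered by stack ventilation

ach = airflow_m3_per_hour / ward_volume_m3
print(round(ach, 1), "air changes per hour")   # roughly 24, the figure quoted above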

Professor Short contends the mindset and skill-sets behind these designs have been completely lost, lamenting the disappearance of expertly designed theatres, opera houses, and other public buildings where up to half the volume of the building was given over to ensuring everyone got fresh air. Early twentieth century climate determinists like Ellsworth Huntington at Yale inadvertently promoted the export of cool, temperate climates around the world and explicitly condemned the inhabitants of hot climates as uncivilised and backward.

Much of the ingenuity present in 19th century hospital and building design was driven by a panicked public clamouring for buildings that could protect against what was thought to be the pernicious threat of miasmas – toxic air that spread disease.

Bad, malodourous air was considered lethal by huge swathes of the populace. Miasmas and other quasi-mystical phenomena were feared as the principal agents of disease and epidemics for centuries, and were used to explain the spread of infection from the Middle Ages, right through to the cholera outbreaks in London and Paris during the 1850s. Miasma theory attracted the attention of luminaries such as Florence Nightingale who believed that foul air, rather than germs, was the main driver of 'hospital fever' leading to disease and frequent death. The prosperous steered clear of hospitals.

While miasma theory has been long since disproved, Short has for the last thirty years advocated a return to some of the building design principles produced in its wake.

“The air conditioning industry has persuaded us that you can’t do this naturally any more and that it would defy progress to do so. Huge amounts of a building’s space and construction cost are today given over to air-conditioning instead.

“But I have designed and built a series of buildings over the past three decades which have tried to reinvent some of these ideas and then measure what happens – publishing what works as well as what doesn’t.

“To go forward into our new low energy, low carbon future, we would be well advised to look back at design before our high-energy high-carbon present appeared. What is surprising is what a rich legacy we have abandoned. There is an analogy with the widespread introduction of affordable antibiotics and the relaxation in the ferocious cleanliness regimes in hospitals and the frightening consequences emerging now.”

Successful examples of Short’s approach include the iconic Queen’s Building at De Montfort University in Leicester. Containing as many as 2,000 staff and students, the entire building is naturally ventilated, passively cooled and naturally lit, including the two largest auditoria each seating more than 150 people.

Conventional wisdom in the ventilation and heating industry was that this omission of mechanical and electrical equipment was impossible. Confounding its critics, the building was awarded the Green Building of the Year and RIBA’s Education Building of the Year in 1995 and was at the time the largest naturally ventilated building in Europe, influencing guidance in Europe and the USA. The building uses a fraction of the electricity of comparable buildings in the UK.

Following success there, Short and industry associates have also experimented with theatre design, including the Contact Theatre in Manchester, which uses the abundant heat sources of theatre lights and audience to drive air flows around the building, and with the passive downdraught-cooled School of Slavonic and East European Studies in Bloomsbury, epicentre of the London heat island.

Short contends that glass skyscrapers in London and around the world will become a liability over the next twenty or thirty years if climate modelling predictions and energy price rises come to pass as expected. He points to the perfect storm of the skyscraper boom in China, where huge high-rise, all-glass metropolises expand at an exponential rate. Meanwhile, 550 million people south of the Qin-Huai line in that country are not allowed to centrally heat or cool their own homes because of the energy this would demand and consume.

Short is convinced that sufficiently cooled skyscrapers using the natural environment can be produced in almost any climate, pointing to his research work on cooling an 11-storey tower at Addenbrooke’s Hospital in Cambridge. He and his team have also worked on hybrid buildings in the harsh climates of Beijing and Chicago – built with natural ventilation assisted by back-up air-conditioning – which, surprisingly perhaps, can be switched off more than half the time on milder days and during the spring and autumn.

“I think you can upscale these designs,” he added. “As you go higher, airspeeds increase and it becomes easier to control the climate within tall buildings.

“My book is a recipe book which looks at the past, how we got to where we are now, and how we might reimagine the cities, offices and homes of the future. There are compelling reasons to do this. The Department of Health says new hospitals should be naturally ventilated, but they are not. Maybe it’s time we changed our outlook.”

The Recovery of Natural Environments in Architecture: Air, Comfort and Climate, published by Routledge, is out now.

Answers to the problem of crippling electricity use by skyscrapers and large public buildings could be ‘exhumed’ from ingenious but forgotten architectural designs of the 19th and early 20th century – according to a world authority on climate and building design.

The air conditioning industry has persuaded us that you can’t naturally ventilate buildings any more.
Alan Short
Post-war advertisement for air conditioning by Carrier

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Rapid changes point to origin of ultra-fast black hole winds


Outflowing gas is a common feature of the supermassive black holes that reside at the centres of large galaxies. Often millions of times more massive than the Sun, these black holes feed off the surrounding gas that swirls around them; space telescopes observe this as bright light from the innermost part of the disc around the black hole.

Occasionally a black hole consumes too much gas and releases an ultra-fast wind. These winds are important to study because they could strongly influence the growth of the host galaxy, clearing away the surrounding gas and thereby suppressing the birth of stars.

Using ESA’s XMM-Newton and NASA’s NuSTAR telescopes, scientists have now made the most detailed observation yet of such an outflow. The winds recorded from the black hole reach 71,000 km/s – a quarter of the speed of light – placing them among the fastest 5% of known black hole winds.

XMM-Newton focused on the black hole for 17 consecutive days, revealing the extremely variable nature of the winds.

“We often only have one observation of a particular object, then several months or even years later we observe it again and see if there’s been a change,” says Dr Michael Parker of the Institute of Astronomy at the University of Cambridge, UK, lead author of the paper describing the discovery, published in Nature this week.

“Thanks to this long observation campaign, we observed changes in the winds on a timescale of less than an hour for the first time.”

The changes were seen in the increasing temperature of the winds, a signature of their response to greater X-ray emission from the disc right next to the black hole.

Furthermore, the observations also revealed changes to the chemical fingerprints of the outflowing gas: as the X-ray emission increased, it stripped electrons in the wind from their atoms, erasing the wind signatures seen in the data.

“The chemical fingerprints of the wind changed with the strength of the X-rays in less than an hour, hundreds of times faster than ever seen before,” says co-author Professor Andrew Fabian, also from the Institute of Astronomy, and principal investigator on the project.

“It allows us to link the X-ray emission arising from the material falling into the black hole, to the variability of the outflowing wind farther away.”

Dr Parker adds: “Black hole winds are one of the mechanisms for feedback, where the energy coming out from the black hole regulates the growth of the host galaxy. Understanding these winds is crucial to understanding how galaxies, including our own, grow.”

Michael Parker et al. "The response of relativistic outflowing gas to the inner accretion disk of a black hole." Nature, 2 March 2017.

Adapted from a press release by the European Space Agency

Astronomers have made the most detailed observation yet of an ultra-fast wind emanating from a black hole at a quarter of the speed of light. Using the European Space Agency (ESA)’s XMM-Newton and NASA’s NuSTAR telescopes, the scientists observed the phenomenon in an active galaxy known as IRAS 13224-3809.

Understanding these winds is crucial to understanding how galaxies, including our own, grow
Dr Michael Parker
Artist's impression of the winds emanating from the supermassive black hole

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.
