
    The John Pickard Neurosurgical Laboratories, based at Addenbrooke’s Hospital, Cambridge University Hospitals, contain purpose-built modern laboratories and updated offices. They are named after John Pickard, Professor Emeritus of Neurosurgery, who was Cambridge’s first Professor of Neurosurgery and in post from 1991 until his retirement in 2013. The suite consists of laboratories dedicated to neurochemistry and to imaging and treating brain tumours.

    “Injuries to the brain, either through trauma or diseases such as brain tumours, can have serious lasting effects on individuals, as well as for their families and carers,” says Professor Peter Hutchinson, Head of Academic Neurosurgery at the University of Cambridge. “Our newly refurbished laboratories will help us to better understand what is happening in response to this damage, putting us in a better position to treat the patients and improve their long-term outcomes.”

    The Neurochemistry Laboratory, led by Dr Keri Carpenter, aims to develop better ways of monitoring and treating brain injury by investigating how the brain responds to injury and how these responses can lead to long-term disabilities. Better treatments are needed to ensure the best outcome for each patient, and to alleviate demands on carers, local authorities and NHS resources. The findings are also potentially relevant to diseases such as dementia and Parkinson’s, which often manifest at a younger age in brain injury survivors.

    The laboratories will form the leading unit in the UK to use microdialysis, which enables doctors to deliver molecules to, and recover them from, the injured brain. This technology can be used to monitor, study and potentially treat specific areas of the brain. Researchers at the University have pioneered the use of non-radioactive ‘labels’ administered by microdialysis to track metabolism. Microdialysis is also used to support clinical trials of drugs given intravenously, to establish how effectively the drug is able to cross the ‘blood-brain barrier’, transiting from the bloodstream into the brain.

    Researchers from the laboratories will work in collaboration with colleagues at the newly refurbished Wolfson Brain Imaging Centre on the development of advanced imaging techniques, as well as with colleagues in departments such as Chemistry, Clinical Neurosciences and Medicine, and across the Cambridge Biomedical Campus.

    The Brain Tumour Imaging Laboratory will be the UK’s first dedicated laboratory for analysing medical imaging of patients with brain tumours. It will use advanced imaging that can be performed on clinical scanners to understand disease-related changes in and around brain tumours – including how far these tumours spread, the effect and impact this spread has on the normal brain, and how treatments such as surgery and radiotherapy affect normal brain function. The laboratory will be led by Mr Stephen Price.

    The Lisa Wiles Neurooncology Laboratory – named after a patient treated at Addenbrooke’s – will also be the first of its kind in the UK, and will be integrated with the operating rooms to collect and process tissue samples taken directly from cancer patients being operated on. Led by Mr Colin Watts, the team will use this new facility as a resource available to the whole of the Cambridge Cancer Centre community to support world-class research to improve our understanding of brain cancer and develop new therapies. This should enable faster, more precise diagnoses to improve the treatment of patients – including tailoring treatment to each individual patient.

    “We’re very grateful to Professor Pickard and to Ms Wiles for helping us make these new laboratories a reality,” says Professor Patrick Maxwell, Regius Professor of Medicine at the University of Cambridge. “These facilities will perform an important role in helping make a real difference to the lives of patients with brain injuries.”

    Professor John Pickard was the first chairman and clinical director of the Wolfson Brain Imaging Centre, a leading biomedical imaging centre housing both MRI and PET scanners. More recently he became the first Honorary Director of the Cambridge Health Technology Co-operative, one of eight national co-operatives that receive funding from the National Institute for Health Research. The Cambridge co-operative is the only one to focus on brain injury.

    A new suite of laboratories aimed at improving outcomes for patients with brain injuries and brain tumours opens today at the University of Cambridge.


    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    Professor Richard Gilbertson from the CRUK Cambridge Institute

    On the children’s ward at Newcastle General Hospital in 1986, medical student Richard Gilbertson got his first taste of life as a paediatric oncologist. He looked around the ward and saw a child in a bed, in a dark corner. “She has a medulloblastoma that has returned,” the consultant said. “What can we do for her?” asked Gilbertson, who had been fascinated by medulloblastomas – one of the commonest malignant brain tumours in children – since his first year of medicine, when he was randomly assigned to do a project on them. “Nothing,” the consultant replied. “The only thing we can do is let her die in peace.”

    “I got so angry,” remembers Gilbertson – now Professor – sitting in his airy office on the first floor of the vast glass-and-steel Cancer Research UK Cambridge Institute (CRUK CI) at the Li Ka Shing Centre. “It was the 1980s and there was nothing we could do for a child with a brain tumour. That was completely unacceptable to me. And I know it sounds contrived, but I made up my mind from that moment that I was going to do something.”

    That ‘something’ was initially a number dreamed up in a Newcastle pub: relaxing with a beer after a long week, Gilbertson and his fellow medical students decided that by the time they retired, they should take personal responsibility for implementing a 15 per cent reduction in mortality from whatever disease they chose.

    Gilbertson never forgot that pledge and, 30 years later, you could argue that he’s fulfilled it. As a result of his insights into how children’s brain tumours behave, more children than ever before are surviving them. However, this, he argues, is because of better patient care and understanding of the condition, not more effective treatments, earlier detection or prevention. These are the things, he says, that will fulfil his latest ambition: a world without cancer.

    When Gilbertson began studying brain tumours, they were all regarded as the same disease, requiring the same treatment. He proved that the main types of tumour not only behave differently from each other but are also entirely different diseases.

    “All tumours generate cells, and these cells speak different languages – express different genes – in the same way that we all speak with an accent,” he explains. “So say we had a French person, an Irish person, and a Chinese person who all spoke English, we could identify their country of origin from their accent. We thought: if we can first identify the ‘accent’ that normal cells in the brain speak with and compare them with the ‘accent’ of the tumour cells, we can trace the origin of those cells back – just like we hear a Cockney speak and we know he’s from London.”

    When Gilbertson and his team studied cancer ‘accents’ in brain tumours, they fell very clearly into four different categories – four different diseases arising from four different cell types. One tumour has blood vessels like sieves, for example. It’s an Achilles heel that doctors can exploit – chemotherapy drugs in the bloodstream can permeate the tumour far more efficiently.

    The World Health Organisation (WHO) has now adopted Gilbertson’s classification, and children around the world now receive treatment matched to their category of tumour.

    All this was achieved while Gilbertson was Cancer Centre and Scientific Director at St Jude Children’s Research Hospital, Memphis, one of the world’s leading children’s cancer hospitals, where he worked for 15 years. Yet his research seemed to be taking him more and more in the direction of not just paediatric cancer but the entirety of cancer. “Of course, they are different diseases. But I firmly believe that cancer needs to be thought of as a continuum. What you’re looking at is how development goes wrong, whether you’re seven or 70.”

    In 2015, Gilbertson was appointed as Li Ka Shing Chair of Oncology in Cambridge and Director of the Cambridge Cancer Centre, with access to both child and adult cancers and, he says, “the best minds in the world”. These minds – physicists, engineers, chemists – are, right now, working out how Gilbertson’s dream of a cancer-free world can become reality.

    There are big ideas to be worked on. “One of the things that’s always puzzled me as a paediatric oncologist is: why don’t children get cancer more?” Gilbertson says. “After all, as children grow, they experience massive cell proliferation. Cancer happens when a cell’s DNA goes wrong. Think of them as accelerators that make cells divide too much, with no brakes. Yet paediatric cancers are quite rare – much more so than adult cancer. People say it’s because children don’t smoke or live for 70 years or do those things which cause mistakes in a cell’s DNA. I don’t buy that. I think it’s partly true. But there must be something that protects children in the design of their cells from actually getting cancer.”

    Gilbertson and his team have just completed a seven-year study identifying the cells that make cancers in children and adults. When his researchers challenged healthy cells with the mutations that drive cancer, they found that children’s stem cells appeared to be intrinsically resistant.

    There was something about them that stopped them making cancer – unlike the adult cells, which weren’t resistant. “That’s terribly exciting, because if I can look into a child’s cells and work out the biology that is protecting that child’s cells from cancer, then maybe we could reproduce that in an adult cell with a drug. And if you do that, you’ve got a preventative for cancer. We are working on that right now, and it’s something I will be pursuing.”

    All possibilities are explored. Gilbertson’s lab is currently screening around 1.2 million compounds, found everywhere from the depths of the Amazonian rainforest to the bottom of the ocean. Out of these, four have possibilities as potential treatments and are being developed. Then there are the drugs that already exist, that have been shown to be effective against other cancers: these are being screened as well. Taking a chemical compound from a tree, crushing it up and putting it on cancerous cells is one avenue of exploration, to be sure. But making that compound into a drug that a patient can actually take is a very long process. It’s far quicker to take a drug that already exists and give it to that child with the brain tumour. “It’s a bit like when your roof is leaking and you put a saucepan on the floor,” says Gilbertson. “The saucepan wasn’t designed to do that. It was designed to cook carrots. But it’s actually useful to catch water in.”

    Innovations around early diagnosis are also being examined. Just 25 per cent of people who are diagnosed with one of the eight most common cancers in the late stages will be alive 10 years later. Diagnose the same cancers just a few months earlier, when the disease is in its early stages, and 80 per cent of those patients will be alive in 10 years. The earlier you diagnose, the better the chance of a cure. And here, again, it’s all about understanding the tumour, working out its strengths and its weaknesses, finding the things it does that can be turned against it.

    Gilbertson points to the work of Professor Rebecca Fitzgerald at the MRC Cancer Unit, a partner of the Cancer Centre, as a perfect example. She developed the cytosponge – the equivalent of a cervical smear for the oesophagus, a notoriously hard area in which to spot pre-cancerous changes. It’s a tiny pill containing an even tinier sponge on a ‘fishing line’. The patient swallows it, the pill hits the stomach and dissolves, leaving the sponge behind. The line is then pulled up, with the sponge scraping a cell sample from the oesophagus on the way up. “What we will see increasingly in cancer is a push towards diagnosing early, and people becoming increasingly used to going through their GPs,” he says. “If you asked my dad’s generation if they had their blood pressure checked regularly, they would say no. Those kind of early diagnostics didn’t exist then but now they do, and are common practice. I think you will see that in cancer.”

    Yesterday, Gilbertson mentions, he was meeting with the inventor of a breath tester to detect lung cancer. Tumour cells have a different way of consuming food than normal cells, he explains, so they produce slightly different waste products. Some of these are volatile, and these tell-tale compounds will appear in the breath – so they can be detected. The team are also developing tests for circulating tumour DNA. It’s now known that DNA isn’t present just in cells: it floats around the bloodstream. Tumours are caused by mutations in that DNA: if you create a blood test sensitive enough to detect those mistakes, you could identify that tumour before the person even starts to show symptoms.

    Imagine, he says, the conversations around cancer in the future. “If people never get cancer as we know it, they’ll be saying: ‘Oh, I’ve been diagnosed with gastric cancer but the doctor’s just fixed it.’ Imagine a world where well-person clinics test accurately for the earliest cancers every year, rather than patients walking around with tumours inside them for years on end – and only when they get ill do we do something about it. Imagine a child going into a clinic for a five-year checkup, and having a blood test which reveals she has cancer. You intervene with a relatively non-toxic treatment – even minor surgery – and that’s it.” A world without cancer, where the dark corners of the ward are banished to the history books. It’s a pledge worth pursuing.

    This article is taken from CAM – the Cambridge Alumni Magazine, edition 79. 

    Thirty years ago, Professor Richard Gilbertson pledged to implement a 15 per cent reduction in mortality from children’s brain cancer. This is the story of what happened next.

    Interview: Lucy Jolin





    Overeating and lack of physical activity worldwide has led to rising levels of obesity and a global epidemic of diseases such as heart disease, stroke and type 2 diabetes. A key process in the development of these diseases is the progressive resistance of the body to the actions of insulin, a hormone that controls the levels of blood sugar. When the body becomes resistant to insulin, levels of blood sugars and lipids rise, increasing the risk of diabetes and heart disease. However, it is not clear in most cases how insulin resistance arises and why some people become resistant, particularly when overweight, while others do not.

    An international team led by researchers at the University of Cambridge studied over two million genetic variants in almost 200,000 people to look for links to insulin resistance. In an article published today in Nature Genetics, they report 53 regions of the genome associated with insulin resistance and higher risk of diabetes and heart disease; only 10 of these regions have previously been linked to insulin resistance.

    The researchers then carried out a follow-up study with over 12,000 participants in the Fenland and EPIC-Norfolk studies, each of whom underwent a body scan that shows fat deposits in different regions of the body. They found that having a greater number of the 53 genetic variants for insulin resistance was associated with having lower amounts of fat under the skin, particularly in the lower half of the body.

    The team also found a link between having a higher number of the 53 genetic risk variants and a severe form of insulin resistance characterized by loss of fat tissue in the arms and legs, known as familial partial lipodystrophy type 1. Patients with lipodystrophy are unable to adequately develop fat tissue when eating too much, and often develop diabetes and heart disease as a result.

    In follow-up experiments in mouse cells, the researchers were also able to show that suppression of several of the identified genes (including CCDC92, DNAH10 and L3MBTL3) results in an impaired ability to develop mature fat cells.

    “Our study provides compelling evidence that a genetically-determined inability to store fat under the skin in the lower half of the body is linked to a higher risk of conditions such as diabetes and heart disease,” says Dr Luca Lotta from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge. “Our results highlight the important biological role of peripheral fat tissue as a deposit of the surplus of energy due to overeating and lack of physical exercise.”

    “We’ve long suspected that problems with fat storage might lead to its accumulation in other organs such as the liver, pancreas and muscles, where it causes insulin resistance and eventually diabetes, but the evidence for this has mostly come from rare forms of human lipodystrophy,” adds Professor Sir Stephen O’Rahilly from the MRC Metabolic Diseases Unit and Metabolic Research Laboratories at the University of Cambridge. “Our study suggests that these processes also take place in the general population.”

    Overeating and physical inactivity lead to excess energy, which is stored as fat tissue. This new study suggests that, among individuals with similar levels of eating and physical exercise, those who are less able to store the surplus energy as fat in peripheral parts of the body, such as the legs, are at a higher risk of developing insulin resistance, diabetes and cardiovascular disease than those who are able to do so.

    “People who carry the genetic risk variants that we’ve identified store less fat in peripheral areas,” says Professor Nick Wareham, also from the MRC Epidemiology Unit. “But this does not mean that they are free from risk of disease, because when their energy intake exceeds expenditure, excess fat is more likely to be stored in unhealthy deposits. The key to avoiding the adverse effects is the maintenance of energy balance by limiting energy intake and maximising expenditure through physical activity.”

    These new findings may lead to future improvements in the way we prevent and treat insulin resistance and its complications. The researchers are now collaborating with other academic as well as industry partners with the aim of finding drugs that may reduce the risk of diabetes and heart attack by targeting the identified pathways.

    The research was mainly funded by the Medical Research Council, with additional support from the Wellcome Trust.

    Lotta, LA et al. Integrative genomic analysis implicates limited peripheral adipose storage capacity in the pathogenesis of human insulin resistance. Nature Genetics; 14 Nov 2016; DOI: 10.1038/ng.3714

    A large-scale genetic study has provided strong evidence that the development of insulin resistance – a risk factor for type 2 diabetes and heart attacks and one of the key adverse consequences of obesity – results from the failure to safely store excess fat in the body.




    In November 2004, Mary McClinton was admitted to Virginia Mason Medical Center in Seattle, USA, to receive treatment for a brain aneurysm, a potentially serious swelling in a blood vessel. What followed was a tragedy, made worse by the fact that it was entirely preventable.

    McClinton was mistakenly injected with the antiseptic chlorhexidine. It happened, the hospital says, because of “confusion over the three identical stainless steel bowls in the procedure room containing clear liquids — chlorhexidine, contrast dye and saline solution”. Doctors tried amputating one of her legs to save her life, but the damage to her organs was too great: McClinton died 19 days later.

    Nine years on, an almost identical accident occurred at Doncaster Royal Infirmary in the UK. Here, the patient, ‘Gina’, survived, but only after having her leg amputated.

    Professor Mary Dixon-Woods is one of Cambridge’s newest recruits, and she is on a mission: to improve patient safety in the National Health Service and in healthcare worldwide. She has recently taken up the role of RAND Professor of Health Services Research, having moved here from the University of Leicester.

    It is, she admits, going to be a challenge. Many different policies and approaches have been tried to date, but few with widespread success, and often with unintended consequences.

    Financial incentives are widely used in the NHS and in the USA, but recent evidence suggests that they have little effect. “There’s a danger that they tend to encourage effort substitution – what people often refer to as ‘teaching to the test’,” explains Dixon-Woods. In other words, people focus on the areas that are being incentivised, but neglect other areas. “It’s not even necessarily conscious neglect. People have only a limited amount of time, so it’s inevitable they focus on areas that are measured and rewarded: it’s an economy of attention as much as anything else.”

    In 2013, Dixon-Woods and colleagues published a study, funded by the Wellcome Trust, evaluating the use of surgical checklists introduced in hospitals to reduce complications and deaths during surgery. The checklists have become the most widely used patient safety intervention in the world and are recommended by the World Health Organization. Yet the evidence shows that checklists may have little impact, and her research found that in some situations – particularly in low-income countries – they might even make things worse.

    “The checklists sometimes introduced new risks. Nurses would use the lists as a box-ticking exercise rather than as a true reflection of events – they would tick the box to say the patient had had their antibiotics when there were no antibiotics in the hospital, for example.” The checklists also reinforced existing hierarchies: nurses had to try to get surgeons to do certain tasks, but the surgeons used this as an opportunity to display their power and refuse.

    Problems are compounded by a lack of standardisation. Dixon-Woods and her team spend time in hospitals to try to understand which systems are in place and how they are used. Not only does she find differences in approaches between hospitals, but also between units and even between shifts. “Standardisation and harmonisation are two of the most urgent issues we have to tackle. Imagine if you have to learn each new system wherever you go or even whenever a new senior doctor is on the ward. This introduces massive risk.”


    Even when an institution manages to make genuine improvements in patient safety, too often these interventions cannot be replicated elsewhere or scaled up, leading to the curse of “worked once”, as she describes it.

    One place that has managed to break this pattern is Northern Ireland, which has overcome the problem of poor labelling of lines such as intravenous lines and urinary catheters. A sick patient may have several different lines attached to them; these were not labelled in any consistent way – if at all – so a nurse might use the wrong line or leave a line in place too long, risking infection. Over 18 months, the health service in Northern Ireland came up with a solution. Soon, whether you are in a hospital, a nursing home or a hospice, every line will be labelled the same way.

    “I’m interested in how they managed to achieve that and what we can learn that can be used in the next place that wants to standardise their lines.”

    Dixon-Woods compares the issue of patient safety to that of climate change, in the sense that it is a “problem of many hands”, with many actors, each making a contribution towards the outcome, and where it is difficult to identify who has responsibility for solving the problem. “Many patient safety issues arise at the level of the system as a whole, but policies treat patient safety as an issue for each individual organisation.”

    Nowhere is this more apparent than the issue of ‘alarm fatigue’. Each bed in an intensive care unit typically generates 160 alarms per day, caused by machinery that is not integrated. “You have to assemble all the kit around an intensive care bed manually,” she explains. “It doesn’t come built as one like an aircraft cockpit. This is not a problem a hospital can solve alone. It needs to be solved at the sector level.”

    Dixon-Woods has turned to Professor John Clarkson in Cambridge’s Engineering Design Centre to help. Clarkson has been interested in patient safety for over a decade; in 2004, his team published a report for the Chief Medical Officer entitled ‘Design for patient safety – a system-wide design-led approach to tackling patient safety in the NHS’.


    “Fundamentally, my work is about asking how can we make it better and what could possibly go wrong,” explains Clarkson. It is not, he says, just about technology, but about the system and the people within the system. When he trains healthcare professionals, he avoids using words like ‘risk’, which mean different things in medicine and engineering, and instead asks questions to get them thinking about the system.

    “We need to look through the eyes of the healthcare providers to see the challenges and to understand where tools and techniques we use in engineering may be of value. I have no doubt that if you were to put a hundred engineers into Addenbrooke’s [Hospital], you could help transform its care.”

    There is a difficulty, he concedes: “There’s no formal language of design in healthcare. Do we understand what the need is? Do we understand what the requirements are? Can we think of a range of concepts we might use and then design a solution and test it before we put it in place? We seldom see this in healthcare, and that’s partly driven by culture and lack of training, but partly by lack of time.”

    Dixon-Woods agrees that healthcare can learn much from how engineers approach problems. “Medical science tends to prioritise trials and particular types of evidence, whereas engineering does rapid tests. Randomised controlled trials do have a vital role, but on their own they’re not the whole solution. There has to be a way of getting our two sides talking.”

    Only then, she says, will we be able to prevent further tragedies such as the death of Mary McClinton.

    Healthcare is a complex beast and too often problems arise that can put patients’ health – and in some cases, lives – at risk. A collaboration between the Cambridge Centre for Health Services Research and the Department of Engineering hopes to get to the bottom of what’s going wrong – and to offer new ways of solving the problems.




    Talk with Your Hands: Communicating across the Sensory Spectrum opens with Hayden Dahmm speaking to camera. He is studying engineering and he’s blind. One of the benefits of being blind, he suggests, is that he is not distracted by physical appearance. The words people use, and how they use them, give him “a genuine impression of the speaker”.

    Louise Stern is a writer and artist. She is deaf and explains that her native tongue is American Sign Language. Speaking with her hands, she says: “The body is eloquent and conveys layers of emotion and meaning.” When she describes how eye contact is, for a deaf person, an especially beautiful thing, she hesitates – and then says “it makes me feel like they see me”.

    In just ten minutes, Talk with Your Hands conveys the richness of verbal and non-verbal languages and explores how our senses overlap and merge. Through interviews with blind and deaf people, interwoven with insights from neuroscientists, the film demonstrates how we communicate with sounds and gestures – and how each mode of communication has its own characteristics.

    Sign language is not a translation of, or substitute for, verbal language. While spoken language is linear (produced through the channels of our mouths one word at a time), sign language is flowing and simultaneous. Similarly, the spoken word is not just the written word spoken out loud. It’s much more than that, explains Hayden, rather as “poetry is the things that cannot be translated”.

    The capacity for language is what sets mankind apart from other animals. Years ago, scientists looking at brain damage identified the parts of the brain responsible for speaking and comprehension, for hearing and seeing. Now we know that this understanding of how the brain works is far too simplistic: language, and the different ways we use it, colonises most of the brain.

    Talk with Your Hands is one of four films made by Cambridge researchers for the 2016 Cambridge Shorts series, funded by Wellcome Trust ISSF. The scheme supports early career researchers to make professional quality short films with local artists and filmmakers. Researchers Craig Pearson (Wellcome Trust-MRC Cambridge Stem Cell Institute) and Julio Chenchen Song (Department of Linguistics) collaborated with filmmaker Toby Smith.

    The capacity for language is what sets us apart from other animals. Talk with Your Hands, the third of four Cambridge Shorts films, explores the richness of sensory perception in interviews with blind and deaf people together with insights from neuroscientists.  

    Actress Nadia Nadarajah recites a poem using British Sign Language

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    In the mid-17th century, a French missionary called Pierre Pelleprat visited several Caribbean islands before travelling to French Guyana and the South American mainland. In an infamous account of his travels, he described the blacks of the Caribbean as “so hideous and misshapen that they fill you with horror”. Tellingly, however, he did not consider this ugliness to be beyond salvation: “I do not know whether my eyes were charmed, but I usually found [the negroes] better shaped and more pleasant after their baptism.”

    Pelleprat was part of a generation of Europeans who began to travel widely, crossing oceans to encounter people whose cultures seemed alien and uncivilised. Arriving on the shores of the tropical islands of the Caribbean after a long and perilous sea voyage, Pelleprat and his compatriots would have met for the first time large native populations of Caribs as well as Amerindians and Africans. Many were enslaved to white settlers making fortunes in commodities such as sugar and tobacco.

    The missionary's words make shocking reading. But, says Dr Mélanie Lamotte (Faculty of History and Newnham College), it is worth exploring the beliefs that underpinned them. Lamotte is a researcher whose work focuses on colour prejudice and interethnic antagonism in the early modern French empire. A French national, educated at the Sorbonne in Paris and at Cambridge, Lamotte has a personal reason to be interested in the experiences of enslaved people. On her mother’s side, she is the descendant of a slave who, in the 18th century, was taken from the coast of Senegal to work on a sugar cane plantation on the Caribbean island of Guadeloupe.  

    A project to trace her own family ancestry took Lamotte on an often frustrating journey into archival materials. Accounts written by slaves are few; for generations, they were denied the opportunities afforded by education and literacy. She says: “I see my work as a form of historical reparation for the inhabitants of France’s former colonies whose history has long been neglected or twisted by analysts – and I hope that a better understanding of France’s colonial past will shed light on the roots of the social tensions apparent in France today.”

    Research into the various ways in which race has been understood has generally taken the 18th century as its starting point. Encounters between people with different cultures have, of course, taken place for millennia – and it is hard to know how such differences were registered by those who experienced them. Lamotte suggests that records of 17th-century encounters between Europeans and inhabitants of distant lands reveal something remarkable. She says: “Rather than focusing on race as something inborn, early European travellers saw difference as something more fluid – and often as something that could be corrected by imposing ‘civilising’ influences.”

    On the basis of her extensive work on underexplored archival material, Lamotte argues that before the 18th century, “race didn’t matter” in the same way that it came to matter when ideas of racial differences became more fixed, most famously by laws that prohibited marriages between different groups. Accounts from travellers in the French empire reveal that, although notions of blood and breeding were powerful in French society (for example in the preservation of the lineages of French nobility), the non-whites encountered by France’s empire-makers were initially not seen in the same terms.

    Lamotte has looked in detail at records relating to the French empire in three contrasting locations: the island of Guadeloupe in the Caribbean, Île Bourbon in the Indian Ocean, and Louisiana in the former French colonies of North America. In making direct comparisons between these widespread colonies, her research extends existing scholarship to a global scale and also reflects the fact that many travellers sailed throughout the French empire. Their ideas circulated too – including the notion that non-Europeans could be schooled out of ‘backward’ customs.

    In arguing that “race didn’t matter”, Lamotte does not suggest that prejudice did not exist (it most certainly did) but rather that it took forms dictated by the preoccupations of a society concerned with behaviour, dress and manners – and, of course, with religion. During an era in which the outward signs of politesse were paramount, newly-encountered people were judged, and categorised, in terms of their level of ‘savagery’ and ‘barbarism’ or (at the other end of the scale) ‘civilisation’.

    A good measure of self-interest fuelled the initially cordial relations between the colonialists and the inhabitants of lands seen to be rich in possibilities. In some cases, by forging alliances with the local population, the incomers were able to tap into the local trade networks that were vital to securing goods for export – fur from North America and spices from the East Indies.

    Many of the early French settlers were men, and marriageable women were in short supply. In 1690 only 16 white women were recorded on Île Bourbon (now La Réunion), an island in the southwest Indian Ocean. Most European men on Île Bourbon married non-white women – despite an ordinance issued in 1674 which forbade “Frenchmen to marry negresses” and “blacks to marry whites” in the colony.

    In New France (Acadia, Canada, the Great Lakes and the Illinois Country), miscegenation (the mixing of groups) was pursued as a deliberate policy to assist integration. In 1603, Samuel de Champlain, French explorer and founder of the Quebec settlement, reportedly assured local communities that “our young men will marry your daughters, and we shall be one people”. From the outset of French colonisation in the Caribbean, the need to maintain a buoyant slave population created a comparatively more prejudiced and segregated ‘plantation society’.

    Father Mongin, a Jesuit missionary who spent time in the French Caribbean in the 1680s, wrote that “some [negroes] do not lack intelligence and are capable of all sorts of arts and sciences, should they receive the right education”. Thousands of miles across the world, a surgeon named Sieur Dellon, who had spent some time on Île Bourbon 20 years earlier, wrote in a similar vein that “among [Malagasies], there are some with common sense, quick witted, and who would be fit for the arts and sciences, if they were educated”.

    Education in this context involved imposing all-important French rules of politesse, conversion from idolatry to Christianity, and the stamping out of “absurd” beliefs and “ridiculous” ceremonies. Non-Europeans were described as “sluggish” and indolent. Outrage was expressed when some French settlers in North America, rather than ‘Frenchifying’ the natives, became ‘Indianised’ themselves. In the early 17th century a French administrator complained that, living among the native people, the French coureurs de bois (woodsmen) behaved “like the savages” and enjoyed “an animal life”, doing little more than hunting and fishing.

    The start of the 18th century witnessed a change in attitude on the part of the colonisers. French administrators in North America began to argue against interracial marriages which “would mix good [French] blood with bad [Native American] blood”. In 1723 the colonial Council of Louisiana issued an edict forbidding “all Frenchmen and any other subjects of the king who are white to marry savage women”. The number of mixed marriages dropped, and in 1738 one governor observed that “the Illinois Indians do not invite the French to marry their daughters any longer, and the French do not think about this anymore”.

    Children of mixed European and indigenous heritage (métis) were, by the middle of the 18th century, frequently considered to be inferior. French colonialists complained that métis children were “extremely swarthy” and “naturally lazy”. Dark complexions were seen as indicators of ‘racial’ inferiority – and the alleged licentiousness and brutishness that had long been attributed to the natives were increasingly believed to be ‘fixed’. Official documents listed colonial populations under headings such as Nègre (negro), Mulâtre (mulatto), Métis (mixed) and Sauvage (indigenous).

     “Ultimately, ‘racial’ discourses developed partly because the French needed to justify discrimination and segregation towards people who were viewed as a threat to French socio-economic and imperialist ambitions. These people included slaves who could claim emancipation, free peoples of colour who presented as economic competitors, and the large Native American population, unreceptive to French policies of ‘Frenchification’ and evangelisation,” says Lamotte.

    “People continue to use language and ideas inherited from colonial times, for example, by using the term ‘nègres’ to designate blacks, and maintaining the image of blacks as lazy or violent. As the result of centuries of prejudice, many blacks in the Antilles consider themselves inferior to whites. A creole phrase often heard in Guadeloupe when a baby is born is ‘ti-moun la bien soti’, meaning ‘your baby looks good as he or she doesn’t have too dark a skin’. Exposing the ways in which such views took hold over the centuries, and telling the tales of those who lived with prejudice, is a powerful way of shaping a more equal world.”

    As Europe expanded its overseas colonies, fixed ideas of racial differences took hold. Historian Dr Mélanie Lamotte, whose forebears include a slave, is researching a brief period when European notions of ethnicity were relatively fluid.  Early French settlers believed that non-white inhabitants of the colonies could be ‘civilised’ and ‘improved’.

    I see my work as a form of historical reparation for the inhabitants of France’s former colonies – and I hope that a better understanding of France’s colonial past will shed light on the roots of the social tensions apparent in France today.
    Mélanie Lamotte
    Image from Archives Nationales d’Outre Mer (ANOM), Jean-Baptiste Labat, Nouveau voyage aux isles
    Mélanie Lamotte: my family history

    I was born in France, and my mother comes from the island of Guadeloupe, an overseas department in the French Caribbean. The history of France’s colonial empire was, until recently, largely absent from my country’s school curriculum. But I wanted to know more about my family. When I was 20 years old, I reconstructed my family tree, tracing it back three centuries to a slave called Anne Rose. My research inspired me to become a historian. From 2006 to 2015, I received grants from the EU and the UK government to study history at the Sorbonne and Cambridge. Today I’m a historian of slavery, ethnic prejudice and early modern French colonialism.

    In the 18th century, Anne Rose was transported from the coast of Senegal to work on a sugar cane plantation in Guadeloupe. White planters were making fortunes in commodities such as sugar, tobacco, coffee and indigo. One of Anne Rose’s children, a man named Quidi, moved to Pointe Noire, Guadeloupe in 1794. My extended family still lives in that same town. Slavery was officially abolished in the 1790s, in the aftermath of the French revolution. Records suggest that Quidi was freed and, remarkably for a former slave, lived for nearly 100 years. He had a daughter, Demoiselle Anne Rose, in 1799. The title ‘Demoiselle’ suggests that she may have been of relatively high status.

    Slavery was permanently abolished in the French colonies in 1848, and blacks in the French Caribbean began to use the names of their slave ancestors as their family names. My family on my mother’s side is still called ‘Annerose’. The grandson of Demoiselle Anne Rose was my granddad’s grandfather. My granddad was told that his grandfather had been homeless, and lived on a beach called ‘Plage Caraïbe’ in Pointe-Noire.

    Slavery is a significant part of French history. The four French overseas departments (former French colonies) are ranked among the poorest regions of the EU. Researchers have shown that this economic distress is in part a consequence of slavery. The French government has been working to raise the profile of Atlantic slavery in French consciousness. The Law of May 2001 declared slavery and the slave trade to be ‘crimes against humanity’. May 10th is now an annual day of commemoration of Atlantic slavery in France.




    A new paper by a group of researchers from the Universities of Oxford and Cambridge, UNEP World Conservation Monitoring Centre, and University College London (UCL) explores whether Pokémon Go's success in getting people out of their homes and interacting with virtual 'animals' could be replicated to redress what is often perceived as a decline in interest in the natural world among the general public.

    Or, could the game's popularity pose more problems than opportunities for conservation?

    Study author Leejiah Dorward, a doctoral candidate in Oxford University's Department of Zoology, said: "When Pokémon Go first came out, one of the most striking things was its similarity with many of the concepts seen in natural history and conservation. The basic facts and information about Pokémon Go make it sound like an incredibly successful citizen science project, rather than a smartphone game.

    "We wanted to explore how the success of Pokémon Go might create opportunities or challenges for the conservation movement."

    Co-author John C Mittermeier, a doctoral candidate in Oxford's School of Geography and the Environment, said: "There is a widespread belief that interest in natural history is waning and that people are less interested in spending time outside and exploring the natural world.

    "Pokémon Go is only one step removed from natural history activities like bird watching or insect collecting: Pokémon exist as 'real' creatures that can be spotted and collected, and the game itself has been getting people outdoors. What’s going on here, and can we as conservationists take advantage of it?"

    In the paper, the researchers explain that Pokémon Go has been shown to inspire high levels of behavioural change among its users, with people making significant adjustments to their daily routines and to the amount of time spent outside in order to increase their chances of encountering target 'species'. There is also evidence that users are discovering non-virtual wildlife while playing Pokémon Go, leading to the Twitter hashtag #Pokeblitz that helps people identify 'real' species found and photographed during play.

    Pokémon Go, the researchers write, exposes users first-hand to basic natural history concepts such as species’ habitat preferences and variations in abundance. ‘Grass Pokémon’, for example, tend to appear in parks, while water-related types are more likely to be found close to bodies of water. There are also four continent-restricted regional species: Tauros in the Americas, Mr Mime in Western Europe, Farfetch’d in Asia, and the marsupial-like Kangaskhan in Australasia. This differentiation captures a fundamental aspect of natural history observation – that exploring new habitats and continents will lead to encounters with different species.

    And hundreds of people congregated near New York’s Central Park one night over the summer to try to find a rare Vaporeon – something that will sound familiar to birdwatchers used to similar gatherings to see a rare species.

    However, the researchers caution that the success of Pokémon Go could also bring challenges: for example, it may be that this type of augmented reality – featuring engaging, brightly coloured fictional creatures – could replace people's desire to interact with real-world nature, or the focus on catching and battling Pokémon may encourage exploitation of wildlife. There has also been controversy in the Netherlands, where Pokémon Go players have been blamed for damage caused to a protected dune system south of The Hague.

    Co-author Dr Chris Sandbrook, a senior lecturer at UNEP World Conservation Monitoring Centre and affiliated lecturer at the Department of Geography, University of Cambridge, said: "Just getting people outside does not guarantee a conservation success from Pokémon Go. It might actually make things worse – for example, if interest in finding a rare Vaporeon replaces concern for real species threatened with extinction. Real nature could be seen as just a mundane backdrop for more exciting virtual wildlife."

    Dorward added: "One of the positive things about Pokémon Go is that there's a very low barrier for entry. As long as you have a smartphone, you can play – and the game itself does a lot of things for you. Finding ways to break down barriers to engagement with real-life nature is a priority for conservation. Pokémon are also relatable 'characters', whereas modern conservation tends to frame itself purely in scientific terms, which may be off-putting to many.

    "There is something called the biophilia hypothesis, which suggests that people have an in-built affinity with nature and a desire to explore the natural world. If that’s one of the reasons Pokémon Go has proved to be so popular – because it’s a natural history proxy – then that could be a huge boost to conservation. It's possible that the desire to connect with nature is there and to get people to engage with conservation we just need to 'sell' it correctly."

    The paper, 'Pokémon Go: benefits, costs, and lessons for the conservation movement', is published in the journal Conservation Letters.

    The augmented reality game, designed for mobile devices, allows users to capture, battle and train virtual creatures called Pokémon that appear on screen as if part of the real-world environment. But can the game's enormous success deliver any lessons to the fields of natural history and conservation?

    The basic facts and information about Pokémon Go make it sound like an incredibly successful citizen science project, rather than a smartphone game
    Leejiah Dorward
    Pokemon outside King's College Cambridge




    The studies are part of BLUEPRINT, a large-scale research project bringing together 42 leading European universities, research institutes and industry entrepreneurs, with close to €30 million of funding from the EU. BLUEPRINT scientists have this week released a collection of 26 publications, part of a package of 41 publications being released by the International Human Epigenome Consortium.

    One of the great mysteries in biology is how the many different cell types that make up our bodies are derived from a single stem cell, and how information encoded in different parts of our genome is made available to be used by different cell types. Scientists have learned a lot from studying the human genome, but have only partially unveiled the processes underlying cell determination. The identity of each cell type is largely defined by an instructive layer of molecular annotations on top of the genome – the epigenome – which acts as a blueprint unique to each cell type and developmental stage.

    Unlike the genome, the epigenome changes as cells develop and in response to changes in the environment. Defects in the proteins that read, write and erase the epigenetic information are involved in many diseases. The comprehensive analysis of the epigenomes of healthy and abnormal cells will facilitate new ways to diagnose and treat various diseases, and ultimately lead to improved health outcomes.

    “This huge release of research papers will help transform our understanding of blood-related and autoimmune diseases,” says Professor Willem H Ouwehand from the Department of Haematology at the University of Cambridge, one of the Principal Investigators of BLUEPRINT. “BLUEPRINT shows the power of collaboration among scientists across Europe in making a difference to our knowledge of how epigenetic changes impact on our health.”

    Among the papers led by Cambridge researchers, Professor Nicole Soranzo and Dr Adam Butterworth have co-led a study analysing the effect of genetic variants in our DNA sequence on our blood cells. Using a genome-wide association analysis, the team identified more than 2,700 variants that affect blood cells, including hundreds of rare genetic variants that have far larger effects on the formation of blood cells than the common ones. Interestingly, they found genetic links between the effects of these variants and autoimmune diseases, schizophrenia and coronary heart disease, thereby providing new insights into the causes of these diseases.

    A second study led by Professor Soranzo looked at the contribution of genetic and epigenetic factors to different immune cell characteristics in the largest cohort of this kind created with blood donors from the NHS Blood and Transplant centre in Cambridge.

    Dr Mattia Frontini and Dr Chris Wallace, together with scientists at the Babraham Institute, have jointly led a third study mapping the regions of the genome that interact with genes in 17 different blood cell types. By creating an atlas of links between genes and the remote regions that regulate them in each cell type, they have been able to uncover thousands of genes affected by DNA modifications, pointing to their roles in diseases such as rheumatoid arthritis and other types of autoimmune disease.

    Dr Frontini has also co-led a study with BLUEPRINT colleagues from the University of Vienna that has developed a reference map of how epigenetic changes to DNA can program haematopoietic stem cells – a particular type of ‘master cell’ – to develop into the different types of blood and immune cells.

    Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, which helped fund the research, said: “Our genes are critical to our health and there’s still a wealth of information hidden in our genetic code. By taking advantage of a large scale international collaboration, involving the combined expertise of dozens of research groups, these unprecedented studies have uncovered potentially crucial knowledge for the development of new life saving treatments for heart disease and many other deadly conditions.

    “Collaborations like this, which rely on funding from the public through charities and governments across the globe, are vital for analysing and understanding the secrets of our genetics. Research of this kind is helping us to beat disease and improve millions of lives.”

    Departmental Affiliations

    • Professor Nicole Soranzo – Department of Haematology
    • Dr Adam Butterworth – Medical Research Council (MRC)/British Heart Foundation (BHF) Cardiovascular Epidemiology Unit
    • Dr Mattia Frontini – Department of Haematology, and Senior Research Fellow for the BHF Cambridge Centre for Research Excellence
    • Dr Chris Wallace – Department of Medicine and MRC Biostatistics Unit


    • Astle, WJ et al. The allelic landscape of human blood cell trait variation. Cell; 17 Nov 2016; DOI: 10.1016/j.cell.2016.10.042
    • Chen, L et al. Genetic drivers of epigenetic and transcriptional variation in human immune cells. Cell; 17 Nov 2016; DOI: 10.1371/journal.pbio.0000051
    • Javierre et al. Lineage-specific genome architecture links enhancers and non-coding disease variants to target gene promoters. Cell; 17 Nov 2016; DOI: 10.1016/j.cell.2016.09.037
    • Farlik, M et al. DNA methylation dynamics of human hematopoietic stem cell differentiation. Cell Stem Cell; 17 Nov 2016; DOI: 10.1016/j.stem.2016.10.019

    Cambridge researchers have played a leading role in several studies released today looking at how variation in and potentially heritable changes to our DNA, known as epigenetic modifications, affect blood and immune cells, and how this can lead to disease. 

    BLUEPRINT shows the power of collaboration among scientists across Europe in making a difference to our knowledge of how epigenetic changes impact on our health
    Willem Ouwehand
    Detail of Epigenome



    Stem cells are the stuff of life – but what’s it like to work with them in the lab? To unlock the secrets of how stem cells diversify into the different parts of our body, and pave the way to medical advances, scientists need to culture dishes of living material.

    This isn’t as easy as it sounds: stem cells flourish only when they are happy. They need lots of food for a start. Because they excrete waste matter, the medium they live in needs replenishing. As they multiply, cells need splitting up so that they have enough space. Before long they’re hungry (and grubby) all over again.

    Caring for stem cells, day in, day out, is a bit like looking after a gang of growing children. Dish Life employs a conversational style and enlists a group of real kids to explain some basic science. The scientists are the endlessly-patient parents and the cells the sometimes-unpredictable kids.

    The film opens a window on to the life of a scientist working with stem cells: life has to be organised around the demands of the cells. If stem cells are round and shiny, rather like Christmas tree decorations, they are healthy. If they are flat or spiky, they might be dying.

    In an engaging and light-hearted way, Dish Life illustrates the high level of commitment required to work successfully with living cells in research that contributes to the development of new treatments for degenerative diseases. Being a scientist is fulfilling – but it’s also a lot of hard work.

    Dish Life is one of four films made by Cambridge researchers for the 2016 Cambridge Shorts series, funded by Wellcome Trust ISSF. The scheme supports early career researchers to make professional quality short films with local artists and filmmakers. Researchers Dr Loriana Vitillo (Stem Cell Institute) and Karen Jent (Department of Social Anthropology) collaborated with filmmaker Chloe Thomas.


    Science is demanding as well as exciting. Dish Life, the final of four Cambridge Shorts films, compares the task of raising stem cells in the lab to the challenge of looking after a gang of unruly kids. In conversation with real-life children, scientists show how tricky it is to work with these ‘super cells’.

    Still from Dish Life, a Cambridge Shorts film




    Latest research on archaeological sites of the ancient Indus Civilisation, which stretched across what is now Pakistan and northwest India during the Bronze Age, has revealed that domesticated rice farming in South Asia began far earlier than previously believed, and may have developed in tandem with - rather than as a result of - rice domestication in China.

    The research also confirms that Indus populations were the earliest people to use complex multi-cropping strategies across both seasons, growing foods during summer (rice, millets and beans) and winter (wheat, barley and pulses), which required different watering regimes. The findings suggest a network of regional farmers supplied assorted produce to the markets of the civilisation's ancient cities.

    Evidence for very early rice use has been known from the site of Lahuradewa in the central Ganges basin, but it has long been thought that domesticated rice agriculture did not reach South Asia until towards the end of the Indus era, when wetland rice arrived from China around 2000 BC. The researchers found evidence of domesticated rice in South Asia as much as 430 years earlier.

    The new research is published today in the journals Antiquity and Journal of Archaeological Science by researchers from the University of Cambridge's Division of Archaeology, in collaboration with colleagues at Banaras Hindu University and the University of Oxford.

    "We found evidence for an entirely separate domestication process in ancient South Asia, likely based around the wild species Oryza nivara. This led to the local development of a mix of 'wetland' and 'dryland' agriculture of local Oryza sativa indica rice before the truly 'wetland' Chinese rice, Oryza sativa japonica, arrived around 2000 BC," says study co-author Dr Jennifer Bates.

    "While wetland rice is more productive, and took over to a large extent when introduced from China, our findings appear to show there was already a long-held and sustainable culture of rice production in India as a widespread summer addition to the winter cropping during the Indus civilisation."

    Co-author Dr Cameron Petrie says that the location of the Indus in a part of the world that received both summer and winter rains may have encouraged the development of seasonal crop rotation before other major civilisations of the time, such as Ancient Egypt and China's Shang Dynasty.

    "Most contemporary civilisations initially utilised either winter crops, such as the Mesopotamian reliance on wheat and barley, or the summer crops of rice and millet in China - producing surplus with the aim of stockpiling," says Petrie.

    "However, the area inhabited by the Indus is at a meteorological crossroads, and we found evidence of year-long farming that predates its appearance in the other ancient river valley civilisations."

    The archaeologists sifted for traces of ancient grains in the remains of several Indus villages within a few kilometres of the site called Rakhigarhi: the most recently excavated of the Indus cities, which may have maintained a population of some 40,000.

    As well as the winter staples of wheat and barley and winter pulses like peas and vetches, they found evidence of summer crops: including domesticated rice, but also millet and the tropical beans urad and horsegram, and used radiocarbon dating to provide the first absolute dates for Indus multi-cropping: 2890-2630 BC for millets and winter pulses, 2580-2460 BC for horsegram, and 2430-2140 BC for rice.

    Millets are a group of small-grained cereals, now most commonly used in birdseed, which Petrie describes as "often being used as something to eat when there isn't much else". Urad beans, however, are a relative of the mung bean, often used in popular types of Indian dhal today.

    In contrast with evidence from elsewhere in the region, the village sites around Rakhigarhi reveal that summer crops appear to have been much more popular than the wheats of winter.

    The researchers say this may be down to environmental variation in this part of the former civilisation: the seasonally flooded Ghaggar-Hakra plains, where different rainfall patterns and vegetation would have lent themselves to crop diversification – potentially creating local food cultures within individual areas.

    This variety of crops may have been transported to the cities. Urban hubs may have served as melting pots for produce from regional growers, as well as for meats and spices, and evidence for spices has been found elsewhere in the region.

    While they don't yet know what crops were being consumed at Rakhigarhi, Jennifer Bates points out that: "It is certainly possible that a sustainable food economy across the Indus zone was achieved through growing a diverse range of crops, with choice being influenced by local conditions.

    "It is also possible that there was trade and exchange in staple crops between populations living in different regions, though this is an idea that remains to be tested."

    "Such a diverse system was probably well suited to mitigating risk from shifts in climate," adds Cameron Petrie. "It may be that some of today's farming monocultures could learn from the local crop diversity of the Indus people 4,000 years ago."

    The findings are the latest from the Land, Water and Settlement Project, which has been conducting research on the ancient Indus Civilisation in northwest India since 2008.

    Rice was thought to have arrived in India from China in around 2000 BC, but the latest research shows that domesticated rice agriculture in India and Pakistan existed centuries earlier, and suggests systems of seasonal crop variation that would have provided a rich and diverse diet for the Bronze Age residents of the Indus valley.

    Our findings appear to show there was already a long-held and sustainable culture of rice production in India as a widespread summer addition to the winter cropping during the Indus civilisation
    Jennifer Bates
    Zebu cattle pulling a wagon beside a pond at the Indus Civilisation site of Rakhigarhi in northwest India

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    Fear-related disorders affect around one in 14 people and place considerable pressure on mental health services. Currently, a common approach is for patients to undergo some form of aversion therapy, in which they confront their fear by being exposed to it in the hope they will learn that the thing they fear isn’t harmful after all. However, this therapy is inherently unpleasant, and many choose not to pursue it. Now a team of neuroscientists from the University of Cambridge, Japan and the USA has found a way of unconsciously removing a fear memory from the brain.

    The team developed a method to read and identify a fear memory using a new technique called ‘Decoded Neurofeedback’. The technique used brain scanning to monitor activity in the brain and identify complex patterns of activity that resembled a specific fear memory. In the experiment, a fear memory was created in 17 healthy volunteers by administering a brief electric shock when they saw a certain computer image. When the pattern was detected, the researchers over-wrote the fear memory by giving their experimental subjects a reward.

    Dr. Ben Seymour, of the University of Cambridge’s Engineering Department, was one of the authors on the study.  He explained the process:

    "The way information is represented in the brain is very complicated, but the use of artificial intelligence (AI) image recognition methods now allow us to identify aspects of the content of that information. When we induced a mild fear memory in the brain, we were able to develop a fast and accurate method of reading it by using AI algorithms. The challenge then was to find a way to reduce or remove the fear memory, without ever consciously evoking it". 

    “We realised that even when the volunteers were simply resting, we could see brief moments when the pattern of fluctuating brain activity had partial features of the specific fear memory, even though the volunteers weren't consciously aware of it. Because we could decode these brain patterns quickly, we decided to give subjects a reward - a small amount of money - every time we picked up these features of the memory.”

    The team repeated the procedure over three days. Volunteers were told that the monetary reward they earned depended on their brain activity, but they didn’t know how. By continuously connecting subtle patterns of brain activity linked to the electric shock with a small reward, the scientists hoped to gradually and unconsciously override the fear memory.

    Scan of brain showing information associated with a fear memory Credit Ai Koizumi

    Dr Ai Koizumi, of the Advanced Telecommunications Research Institute International, Kyoto and Centre of Information and Neural Networks, Osaka, led the research:

    "In effect, the features of the memory that were previously tuned to predict the painful shock were now being re-programmed to predict something positive instead."

    The team then tested what happened when they showed the volunteers the pictures previously associated with the shocks.

    "Remarkably, we could no longer see the typical fear skin-sweating response. Nor could we identify enhanced activity in the amygdala - the brain's fear centre,” she continued. “This meant that we'd been able to reduce the fear memory without the volunteers ever consciously experiencing the fear memory in the process."

    Although the sample size in this initial study was relatively small, the team hopes the technique can be developed into a clinical treatment for patients with PTSD or phobias.

    "To apply this to patients, we need to build a library of the brain information codes for the various things that people might have a pathological fear of, say, spiders,” adds Dr Seymour. “Then, in principle, patients could have regular sessions of Decoded Neurofeedback to gradually remove the fear response these memories trigger."

    Such a treatment could have major benefits over traditional drug-based approaches: patients could avoid the stress associated with exposure therapies, as well as the side-effects of medication.

    Koizumi et al. “Fear reduction without fear through reinforcement of neural activity that bypasses conscious exposure.” Nature Human Behaviour (2017). DOI: 10.1038/s41562-016-0006

    Researchers have discovered a way to remove specific fears from the brain, using a combination of artificial intelligence and brain scanning technology. Their technique, published in the inaugural edition of Nature Human Behaviour, could lead to a new way of treating patients with conditions such as post-traumatic stress disorder (PTSD) and phobias.

    The challenge then was to find a way to reduce or remove the fear memory, without ever consciously evoking it
    Ben Seymour
    Large house spider on kitchen floor



    It is easy to imagine ancient Rome as a society where the emperors, senators and other nobles sat on top of an undifferentiated, static mass of ordinary Romans (who in turn sat above the mass of slaves). But Roman society was, in fact, highly stratified throughout and people of all social levels went to great lengths to better their lot in life and climb the social ladder. Some even succeeded in joining the empire’s richest ranks.

    The traditional view of the Roman people lounging around at the games ignores just how much they had to work. As Pliny the Younger noted when recommending a young man to a friend: “He loves hard work as much as poor people usually do”. Most free men in the country were peasants and in the towns and cities were unskilled labourers, doing such jobs as carrying the goods imported to the docks of Rome at Ostia and working on building the great imperial buildings, such as the Colosseum.

    Manual work was never going to pay well and probably provided little more than a subsistence income. The main way for people to improve their quality of life was to acquire a skill. If a worker could learn a craft then his income as an artisan could comfortably rise to double or treble that of an unskilled worker.

    Average wages, in denarii, in AD 301.

    Get a trade

    The variety of skilled jobs we find in the sources is extraordinary. More than 225 trades are listed on tombstones and other inscriptions. A letter attributed to the emperor Hadrian, for example, gives us an idea of the competitive industry that the urban population of Alexandria showed in their pursuit of making a living:

    No one is idle. Some are blowers of glass, others makers of paper, all at least are weavers of linen or seem to belong to one craft or another … Their only god is money, which everyone adores.

    Women also played an important economic role. That women are listed in only 35 different occupations, however, shows that their opportunities were far more limited. They worked mainly in the service sector, spinning wool, making jewellery, serving in taverns, hairdressing and making and mending clothes.

    Banking and commerce

    If a Roman had some capital, lending money could be very profitable. One source describes commercial moneylenders “rejoicing in the accrual of money which increases day by day”. Their joy was understandable as 12% interest was typically charged for unsecured loans. Interest on short-term loans in crisis periods could reach 50%. And if the borrower failed to make payments on time, creditors held considerable legal powers and could sell all the debtor’s possessions – including his children – into slavery.

    Roman gold coins found in India: trade was a very lucrative profession in ancient Rome. British Museum

    Trade was another profitable business – and the empire’s shipping routes were busy with vessels transporting all manner of goods, such as wine, pottery, olive oil, spices and slaves. The aristocracy looked down on trade as being beneath them but that did not stop them from using front-men to carry out business on their behalf. It seems that former slaves were often used in this role, presumably because they could be more trusted to do what they were told and hand over the bulk of the profits at the end of the deal.

    These freedmen frequently and proudly asserted their prosperous – free – status in inscriptions on their tombs. Some former slaves of emperors became extremely influential and rich, such as Narcissus, a former slave of Emperor Claudius in the first century AD, who went on to amass considerable wealth and influence. The status of freedmen as former slaves, however, meant they were never fully accepted among the social elite.

    Big league

    If a Roman wanted to make it really big then he needed to become a celebrity. Successful gladiators were adored by the crowds. Mosaics featuring them were widespread. They were a common topic of conversation and even a clay baby’s bottle at Pompeii was stamped with a figure of a gladiator – presumably so that the infant could drink in strength and courage along with its milk. The fighters were handsomely paid for their work but, of course, few survived to enjoy a prosperous old age.

    Gladiators from the second century AD depicted in a mosaic in Zliten, modern-day Libya.

    Charioteers actually seem to have earned the most, reflecting the great popularity of the regular chariot racing – the Circus Maximus held 250,000 spectators. The most successful charioteer known was the second-century AD champion Gaius Appeleius Diocles, from Lusitania, now Portugal. In a 24-year career, he competed in 4,257 races, winning 1,462 of them. His career earnings reached 35,863,120 sesterces – estimated at US$15 billion. Given it took only one million sesterces to qualify as a senator, the size of his fortune is clear.

    So it took hard work, patience – and sometimes a great deal of risk – but if it all came good, any Roman could hope to rise up to a position where they owned a villa and amassed a fortune. Those who achieved it, though, were the lucky few.

    Jerry Toner, Director of Studies in Classics, Churchill College, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    Jerry Toner, Director of Studies in Classics, Churchill College, University of Cambridge, discusses the stratification of Roman society.

    Praetorian guards - Louvre Lens, France.




    Researchers have developed a new imaging technique that makes it possible to study why proteins associated with Alzheimer’s and Parkinson’s diseases may go from harmless to toxic. The technique uses a technology called multi-dimensional super-resolution imaging that makes it possible to observe changes in the surfaces of individual protein molecules as they clump together. The tool may allow researchers to pinpoint how proteins misfold and eventually become toxic to nerve cells in the brain, which could aid in the development of treatments for these devastating diseases.

    The researchers, from the University of Cambridge, have studied how a phenomenon called hydrophobicity (lack of affinity for water) in the proteins amyloid-beta and alpha synuclein – which are associated with Alzheimer’s and Parkinson’s respectively – changes as they stick together. It had been hypothesised that there was a link between the hydrophobicity and toxicity of these proteins, but this is the first time it has been possible to image hydrophobicity at such high resolution. Details are reported in the journal Nature Communications.

    “These proteins start out in a relatively harmless form, but when they clump together, something important changes,” said Dr Steven Lee from Cambridge’s Department of Chemistry, the study’s senior author. “But using conventional imaging techniques, it hasn’t been possible to see what’s going on at the molecular level.”

    In neurodegenerative diseases such as Alzheimer’s and Parkinson’s, naturally-occurring proteins fold into the wrong shape and clump together into filament-like structures known as amyloid fibrils, and into smaller, highly toxic clusters known as oligomers, which are thought to damage or kill neurons; the exact mechanism, however, remains unknown.

    For the past two decades, researchers have been attempting to develop treatments which stop the proliferation of these clusters in the brain, but before any such treatment can be developed, there first needs to be a precise understanding of how oligomers form and why.

    “There’s something special about oligomers, and we want to know what it is,” said Lee. “We’ve developed new tools that will help us answer these questions.”

    When using conventional microscopy techniques, physics makes it impossible to zoom in past a certain point. Essentially, there is an innate blurriness to light, so anything below a certain size will appear as a blurry blob when viewed through an optical microscope, simply because light waves spread when they are focused on such a tiny spot. Amyloid fibrils and oligomers are smaller than this limit so it’s very difficult to directly visualise what is going on.
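The blurriness described above has a well-known quantitative form. As a rough guide (the figures below are standard textbook values, not taken from the study itself), the Abbe diffraction limit gives the smallest resolvable feature size $d$ for light of wavelength $\lambda$ focused by an objective of numerical aperture $\mathrm{NA}$:

```latex
d \approx \frac{\lambda}{2\,\mathrm{NA}}
```

For green light ($\lambda \approx 550$ nm) and a high-quality oil-immersion objective ($\mathrm{NA} \approx 1.4$), this gives $d \approx 200$ nm – far larger than an individual oligomer, which is only a few nanometres across.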

    However, new super-resolution techniques, which are 10 to 20 times better than optical microscopes, have allowed researchers to get around these limitations and view biological and chemical processes at the nanoscale.

    Lee and his colleagues have taken super-resolution techniques one step further, and are now able to determine not only the location of a molecule but also, simultaneously, the environmental properties of single molecules.

    Using their technique, known as sPAINT (spectrally-resolved points accumulation for imaging in nanoscale topography), the researchers used a dye molecule to map the hydrophobicity of amyloid fibrils and oligomers implicated in neurodegenerative diseases. The sPAINT technique is easy to implement, only requiring the addition of a single transmission diffraction grating onto a super-resolution microscope. According to the researchers, the ability to map hydrophobicity at the nanoscale could be used to understand other biological processes in future.

    The research was supported by the Medical Research Council, the Engineering and Physical Sciences Research Council, the Royal Society and the Augustus Newman Foundation.

    Marie N. Bongiovanni et al. ‘Multi-dimensional super-resolution imaging enables surface hydrophobicity mapping.’ Nature Communications (2016). DOI: 10.1038/ncomms13544

    A new super-resolution imaging technique allows researchers to track how surface changes in proteins are related to neurodegenerative diseases such as Alzheimer’s and Parkinson’s diseases.

    These proteins start out in a relatively harmless form, but when they clump together, something important changes.
    Steven Lee
    Brain showing hallmarks of Alzheimer's disease (plaques in blue)



    Angela Merkel has finally confirmed that she will stand for re-election as German chancellor in the country’s 2017 parliamentary elections. Many have hoped for this moment, despite the setbacks of the past few years. There is a strong sense that the world needs Merkel now more than ever. She has made some unpopular decisions in her 11 years as chancellor but she is, to many, the antithesis of Donald Trump.

    Tough times

    Chancellorship has been no walk in the park for Merkel of late. In 2015, she upset many supporters of her party, the Christian Democratic Union (CDU), by opening German borders to hundreds of thousands of refugees. To curb the influx, Merkel had to commit to a dirty deal with Turkish president Recep Tayyip Erdoğan, offering generous EU visa terms for his citizens in exchange for stopping millions of refugees from entering Europe.

    The pressure intensified in 2016, when a spate of sexual assaults, apparently committed by migrants, stirred up a significant backlash against the new arrivals.

    Merkel’s CDU went on to suffer bitter setbacks in federal elections. And an Islamic State-inspired axe attack by a young man from Afghanistan in Bavaria in July 2016 was seen as evidence that Merkel’s open door refugee policy had failed.

    In September 2016, Merkel’s popularity reached a five-year low: only 45% of Germans were satisfied with her performance. During a public speech on German Unity Day in Dresden, angry protesters drew on Nazi-era language, calling Merkel a “traitor of the people” and demanding her resignation.

    On the international stage, the Brexit vote was a huge blow to Merkel and her pro-European course. She now needs to negotiate an exit for Britain without also triggering the demise of the entire EU project.

    And as if all of this wasn’t enough, Merkel will have to deal with Donald Trump as president of the United States. After Trump’s election victory, Merkel gave a remarkable speech, offering him close collaboration on the basis that the new American president would respect freedom, democracy and the dignity and worth of all people.

    While most other world leaders gave bland statements of half-hearted hope that the president-elect would not follow through on his more controversial promises, the German leader was sending a strong signal – and even a challenge.

    After the open sexism and racism that characterised Trump’s campaign, it looks like close collaboration is an extremely unlikely scenario. Merkel was effectively saying that standing up to such prejudice was more important to her than relations with the US – although whether she remains true to her principles should she be re-elected is another question.

    A sense of responsibility

    Given the overwhelming number of problems facing whoever wins in 2017, the easiest decision would have been to let someone else do the job of chancellor. But Merkel isn’t one for easy solutions.

    There was little enthusiasm or excitement in her voice as she announced her candidacy, and she openly admitted that standing had been a difficult decision. Although Merkel didn’t mention any names, it was obvious that she wanted to send a message to Trump and right-wing populists in Europe. She emphasised that political decisions need to be based on the fundamental values of freedom, democracy, respect for the law, and the dignity of every human being.

    Merkel responds to Trump’s victory.

    Following her announcement, Merkel appeared on a talk show and left no doubt that she expected difficult times and an “exhausting and challenging” election campaign. Yet, she added that she felt confident that she could defend these values that hold our society together.

    Merkel openly challenges Trump because there is a lot more at stake than Anglo-German relations. Fears grow that in 2017 the right-wing populist Marine Le Pen could become the next French president, and that Europe’s far right will grow further. Against this background, Merkel sees an urgent need to oppose the populism, racism and gender ideology of the extreme right, and this feeling is shared by many Germans.

    Can she win?

    Merkel’s statement was a manifestation of everything that people love and hate about her. She carefully assesses situations before taking decisions, she is stubbornly committed to Christian values and the European project, she seeks consensus rather than victory, and she displays a striking lack of charisma.

    The New York Times has called Merkel “the liberal west’s last defender” and while she is too smart to get excited about such headlines, she knows that her approach and personality traits have become a rare commodity in the post-truth era of global politics.

    Merkel has described herself as a “chancellor for turbulent times” and there is good reason to believe that she could act as an important counterbalance to the charismatic, impulsive, erratic, and polemical President Trump.

    Recent polls suggest Merkel’s popularity scores are slowly recovering. Although it is to be expected that some CDU voters will switch to vote for the right-wing populist Alternative for Germany (AfD), she has a good chance of re-election. She may not win an outright majority, but her party would be able to form a coalition with various other parties, which would leave the CDU in a strong position to push through their candidate for the chancellorship.

    Katharina Karcher, Sutasoma Research Fellow at Lucy Cavendish College, Cambridge, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    In this article, Katharina Karcher from the Department of German and Dutch discusses the election prospects of the self-described “chancellor for turbulent times”. 



    Throughout 2016, politicians and pundits have been caught off guard time and time again by the successes of populist politics – first Brexit, then Trump and the rise of far-right parties across Europe. Casting about for answers, advocates of the centre are trying to account for unexpected shifts in the allegiances of white, working-class and rural voters.

    But instead of trying to understand why particular demographic groups are attracted to anti-immigrant, anti-establishment rhetoric, perhaps it’s better to ask what can be done to bridge the social divides it creates.

    We’ve been carrying out research into local responses to national economic and political crises. Our study compared charitable programmes and services, interfaith collaborations and economic initiatives across four capital cities of Europe: London, Rome, Paris and Berlin. We wanted to understand how local relationships change when people are faced with scarce resources, violence and a sudden influx of refugees.

    Rather than becoming more hostile and divided under these circumstances, we actually found that individuals and groups of different ethnic, national, socio-economic and religious backgrounds build networks and cooperate, in order to protect the local community.

    This can take the form of supporting a charity, or promoting small businesses and employment opportunities. One example is a food bank in London, which depends on donations from both Muslim and Christian supporters. Another is a Protestant refugee project in Berlin, which has reached out to the secular, and often militantly atheist, young population, who are seeking ways to support diversity and multicultural life in their city.

    Catholic care

    Religious leaders are also taking on a significant role in community activism. They are increasingly responsible for uniting individuals of different backgrounds, and espousing values of liberalism, universalism and tolerance.

    For instance, the Italian state has struggled to deal with the arrival of hundreds of thousands of migrants – particularly those who want to travel on to other European countries such as France or the UK. While the national government remains largely uninterested, Catholic charities have intervened to offer protection and basic hospitality.

    A volunteer meets Syrian refugee children in Rome. Telenews/EPA


    The assumption of responsibility has become politicised – yet, rather than seeking the protection of a particular demographic, local activists criticise policies that discriminate against migrants and increase poverty. In Rome, makeshift camps hosting refugees have become spaces for collective political action. We witnessed refugees and volunteers routinely venturing from their temporary shelters into the city centre, to protest and demand representation and protection.

    A broader view

    The initiatives that do work with the state – such as the Paris-based organisation Mozaik RH, which helps young people from minority and low-income backgrounds find jobs – point to failures to live up to national values such as “egalité”. They call for public and private sector employers to hire more applicants from diverse origins to promote equal life opportunities.

    The aim of all of these initiatives is to build and sustain a local infrastructure for collective agency that demonstrates power and control. Faith can inspire individual motivation, but the local community and its initiatives are represented as being inclusive – and progress is associated with cooperation between diverse actors.

    Church and community.


    As one Church of England vicar put it: “We need to broaden our scope” to bring in groups with different affiliations, or none at all – “we need to broaden our view of what religion is”. In other words, religion is about expressing values and public solidarity as much as professing a particular faith.

    But can these very local, urban examples be scaled up into a national political strategy? One problem is that these initiatives are typically “quiet”. They may attract local or even national media attention – such as the Muslim-Jewish café for low-income residents in Nottingham (funded by Near Neighbours). But they succeed because they are about everyday cooperation, rather than about expressing anger or discontent.

    Local lessons

    That said, there are a few lessons that national political parties can draw from local experience. For one thing, local activists trust government officials who enable and appreciate community cooperation – even when they have little influence over its processes and outcomes.

    What’s more, the argument that cooperation can benefit everyone is persuasive and effective. Participants recognise the mutual benefits of sharing assets such as buildings. They also realise that cooperation offers an opportunity to make real changes – ultimately achieving greater control over their own lives.

    Perhaps most importantly, local activism across countries and neighbourhoods evokes republican notions of citizenship and an understanding of a general public good – as opposed to competition for resources based on ethnic, racial, or economic grounds. However, it’s still important for state authorities to safeguard rights such as freedom of expression and access to public benefits. This kind of citizenship is always a work in progress, so people are motivated to continue helping throughout their lives.

    This motivation transcends periods of crisis and changes in government, as well as transformations in the make-up of the local population. By recognising the cooperation across divisions that is already going on, it is possible for political parties – especially those on the left – to start rebuilding themselves for the future. They can argue for effective, collective engagement to confront the short- and long-term challenges facing us all.


    Shana Cohen, Senior Research Associate, University of Cambridge; Jan-Jonathan Bock, Research Fellow, University of Cambridge, and Samuel Everett, Research Fellow, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    Shana Cohen and colleagues from the Woolf Institute argue that the political left in Europe should look to the local cooperation across religious and cultural divisions that is already going on across the continent. 

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    All societies divide people into male or female. There is a biological truth behind this: different sex chromosomes (XY, XX). But could many gender differences be down to social conditioning? If we treated girls and boys the same from birth, what would the consequences be? More equal opportunities? Or a complete breakdown of the concepts of masculinity and femininity? These ideas partly depend on what we understand by “gender identity”.

    Gender identity is not a simple concept. It is usually defined as whether someone thinks of themselves as male or female, though it’s more than that. Even this is not a simple, binary division between all human beings. However, we do know that the hormones the brain is exposed to in early pregnancy have powerful effects on gender identity.

    For example, there’s a condition called androgen insensitivity syndrome. Girls with this condition are born looking just like other girls. Only at puberty do things start to change. This is because they are actually genetic males (they have the male XY chromosomes). They also have testes, hidden in their abdomen, but no uterus or ovaries.

    The condition is caused by a genetic insensitivity to the hormone testosterone, so that while these girls secrete male-type levels of testosterone, it doesn’t have any effect on their brain (or anywhere else). The important point is that their gender identity is female. Does that mean that testosterone is ultimately what makes someone masculine? The experimental evidence suggests as much. Giving little female rats testosterone during early life makes them very male-like, and the opposite occurs if little males are castrated.

    Testosterone seems to be important, but is it the whole story? Is the fact that individuals with androgen insensitivity syndrome look like women responsible for others treating them as female, thus influencing how they see themselves?

    Women with AIS and related DSD conditions who want their condition to be represented by real, proud people.Ksaviano/wikipedia, CC BY

    In the 1960s, John Money, a prominent psychologist, convinced himself that gender identity was independent of early hormones. Put simply, if a parent thought their baby was a boy, and treated him as such, then he developed a male gender identity, and vice versa. This idea was put to the test: after a surgical accident, a one-year-old boy was castrated and given a vagina. He was dressed as a girl and given a female name. But it failed. Eventually, the “girl” reverted to being a boy. You might think that was the end of the “parent” theory of gender identity. But a second case, which started when the baby was two months old, succeeded. The “boy” grew up as a “girl” and accepted her gender identity, though she was bisexual.

    So why the different results? Note that single case reports are unreliable as evidence. But it seems likely that exposure of the brain to testosterone during development does influence various aspects of sexuality, including gender identity. We also know that the brain in early life is very susceptible to external events. So both testosterone and parental behaviour can influence gender identity.

    Beyond hormones

    But gender identity is also how a person expresses themselves in society. In a society that represses expressions of sexuality, this will alter how women and men see themselves. The important point here is that gender identity is both “biological” and “social”. But none of these factors results in a simple binary division.

    So could we abolish differences in gender by altering upbringing? Schemes exist to minimise gender-stereotypical play behaviour, for example in some Scandinavian nurseries. While this may have some impact, research has nevertheless shown that little boys still prefer to play with trains, and little girls with dolls. Giving such toys to children in societies that have never seen them in real life has the same result.

    There are, of course, established gender differences in muscular strength and height that are not controversial. And yet there are women who are stronger or taller than some men: in other words, there is an overlap between the sexes despite the sex difference. Accepting that there may be gender differences in brain function has proved much more controversial. Many studies have shown, for example, that males are better at visuo-spatial tasks and females are better at languages and empathy. These differences are small, and the distributions overlap, so sometimes they are not observed; but we should not discount their influence.

    There are also well-established but very small gender differences in the brain, such as men having a larger hypothalamus. The hypothalamus is responsible for initiating eating, drinking, sex and other behaviours essential for survival. Relating these differences to those in behaviour has not, so far, been very successful: this may reflect our ignorance of how the brain actually works.

    Society’s responsibility

    There are those who decry the small differences that have been recorded, or even consider that they do not exist. But why should we want to abolish them? It seems to me that these differences both reflect identity and contribute to it.

    It’s no secret that sex differences have been used as an excuse for gender inequality. But that just means we need to redress that inequality, not deny that gender differences exist. It’s opportunity that is crucial.

    A man’s job? Alfred T. Palmer

    If this were equal, would we see an even distribution of males and females across all occupations and activities? Not in my opinion. If a job requires physical strength, then it is likely that men will predominate. Also, in the branch of medicine dealing with brain disorders, about 50% of psychiatrists are female, but only about 15-20% of neurologists and a mere 5% of neurosurgeons are. Is this gender-related prejudice, or individual preference? Should we insist on an equal gender distribution? Of course not, provided the choice is unfettered. It may be that males are attracted by the more technical aspects of medicine, and females by the more person-orientated specialities, for reasons that are not just due to upbringing or expectations, but genuine differences in the brain.

    But, of course, social norms also contribute to which professions we choose. So we have to make an effort to ensure that women are not hindered from a free choice of profession by social expectations, burdens of child-rearing or selective education. But ultimately, an unequal gender distribution is no longer controversial if opportunities are the same for all. If gender differences then remain, we should accept them.

    Thankfully we now see an increasing number of women as distinguished scientists, CEOs of major companies and world leaders. We don’t even bat an eyelid when a woman plays King Lear, that most masculine of roles. Gender identities are changing; but let us not muddle the essential distinction between similarity and equality.

    Joe Herbert, Professor Emeritus of Neuroscience, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    Joe Herbert, Emeritus Professor of Neuroscience, explores what we mean by 'gender identity' and asks whether we should insist on an equal gender distribution across occupations and activities.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    Donald Trump’s astonishing rise to the presidency has put racism at the heart of American politics. From the very start of his campaign, Trump called Mexicans “criminals” and “rapists” while pledging to build a wall between the US and its southern neighbour. He shocked the world by promising to ban Muslim visitors from the US, and is now reportedly considering a “Muslim registration system”. He dismissed the concerns of the Black Lives Matter movement and refused to disavow the support he received from white supremacists.

    Among his supporters is David Duke, former Grand Wizard of the Ku Klux Klan, who described Trump’s triumph as a victory for “our people”. A week after Trump’s victory, a white nationalist group met in Washington DC to “hail Trump” with Hitler salutes and decry the mainstream media with the Nazi-era term “Lügenpresse”, or “lying press”.

    After the symbolic breakthrough of Barack Obama’s presidency, this feels like a shocking step backwards on the issue of race. But it’s important not to overstate America’s progress during the Obama years, nor to ignore the ways in which racism extends far beyond the “whitelash” of Trump’s improbable rise.

    Instead, to properly confront America’s racist reality, we need a properly nuanced way to think about it in all its complexity and intractability. Broadly speaking, we can divide racism into three categories: structural, unconscious and unapologetic.

    Structural racism refers to the ways racial inequality endures across generations. Racial gaps in household wealth, homeownership and unemployment rates are still enormous. According to the federal government, America’s schools are more segregated today than they were a decade ago. Unarmed African-Americans are substantially more likely to be physically harassed by the police, and around six times as likely to be incarcerated as whites.

    Playing into these problems is unconscious racism. This term describes the ways people unintentionally discriminate against others on the basis of race. We know from extensive research that many employers treat people of colour differently from whites when they apply for jobs or promotions, even though they insist that they aren’t personally racist. Social scientists call this “unconscious bias”, and many US government agencies, public institutions and companies have only recently begun to tackle it.

    With such stark differences at the root of black and Latino experience, why don’t white people see racial equality as an urgent national imperative? Unconscious racism shapes and sustains structural racism, and leads white people away from the facts of persistent inequality. This interplay helps to explain how American society fails to prioritise racial justice despite the existence of these enormous divides.

    From implicit to explicit

    The third form of racism is what we’ve seen in the Trump campaign: overt efforts to stereotype or rank people on the basis of race, and “dog-whistle” racism which uses coded language to achieve the same effect.

    Overt racism is alarming and dangerous, and has the potential to set back race relations considerably. But to solve the deeper problem of race relations, the US’s leaders must not only condemn the unapologetic racists of the far-right; they must tackle racism’s structural and unconscious dimensions. This can only happen if the past and the present are kept in sharp focus.

    American racism is not just history. EPA/Jim Lo Scalzo

    Black people in America were denied the opportunity to own property for centuries; in fact, they were themselves held as property, and exploited to produce enormous wealth for their white owners and for the nation more broadly. Even after slavery was abolished in 1865, African-Americans were subjected to another century of open discrimination in housing, employment and every other aspect of communal life.

    Like the emergence of a black middle class in the 1960s and 1970s, Barack Obama’s rise to the White House has had a huge and positive effect on American society. But Obama’s election also enabled some (mostly white) commentators to declare that the US had “moved beyond” race – that the debts of slavery and racism had been paid in full, and that anyone still complaining was guilty of “racial entitlement”.

    Armed with this mistaken assumption, many white conservatives have dismissed black complaints of police misconduct as spurious or entitled, insisting that Obama’s victory proved there is no ceiling for people of colour in America. With this cynical twist, they can frame any action against racial inequality as a form of undeserved special treatment.

    Obama himself has addressed the question of race sporadically and cautiously, no doubt reasoning that white conservatives would seize on a full-scale assault on unconscious and structural racism as evidence of “bias” or self-interest. With Donald Trump in the White House, Democrats and progressives have a fresh opportunity to attack the problem of racism in all its guises.

    They won’t have the support of the president or a Congressional majority, until the 2018 midterm elections at least. But racism has always run deeper than the electoral cycle. Solving it demands education, dialogue, protest, activism and energy. These resources will be in limited supply given the sheer number of challenges thrown up by a Trump presidency, but they will be vital to the work of healing the deepest division in American life.

    The Conversation

    Nicholas Guyatt, University Lecturer in American History, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    Racism in the US has always run deeper than the electoral cycle, writes Nicholas Guyatt, University Lecturer in American History. Solving it demands education, dialogue, protest, activism and energy.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    The uncertainty created by the EU referendum left the chancellor with relatively little wiggle room. The Office for Budget Responsibility reduced its forecast of growth in the coming year from 2.3% to 1.4%, with adverse implications for tax revenues. Faced with this, the chancellor has produced a statement that is broadly responsible, prioritising measures that strengthen the resilience of the economy in a turbulent global context.

    This entails a relaxation of fiscal constraints. Hammond has delayed both the return to fiscal balance and the time at which the debt:GDP ratio starts to fall. Some will perceive this as can-kicking, but it seems wise.

    The chancellor has also announced major new infrastructure investment – in R&D, transport, digital resources and housing. This is welcome and should bring a boost to the economy.

    A particularly important provision concerns the reduction in the Universal Credit taper rate from 65% to 63%, increasing the amount of in-work benefits people can receive. This could be seen as a response to the view that the referendum result reflected a call for change by those most adversely affected by austerity. But the adjustment will most help the least needy of those who receive Universal Credit. It is a measure for the “just about managing” more than the “frankly not managing”.

    Michael Kitson, University Senior Lecturer in International Macroeconomics, Cambridge Judge Business School

    A damp squib: economic policy needed a reboot and instead it got a light makeover. The economy is facing two big challenges: productivity growth is stagnant (“shocking” according to the chancellor) and the Brexit iceberg is looming.

    First, Philip Hammond could have recast fiscal policy to allow significantly more public investment in the economy. Instead he merely acknowledged that the government will fail to achieve its fiscal targets. Monetary policy is running out of steam and an expansionary fiscal policy is one of the few effective options to deal with the challenges ahead. A preoccupation with “balancing the books” makes sense for a household, but not for an economy that is suffering from a severe lack of investment.

    Second, there is the need for a coherent industrial policy to promote productivity growth. Instead we get a few piecemeal initiatives which are more symbolic than substantial. There is £23 billion over five years for a new investment fund. Yes, “every little helps”, but this is a paltry figure when you consider that the annual size of the British economy was £1,865 billion in 2015. The devil will be in the detail, but the investment fund is likely to only amount to around 0.2% of GDP. As a result, the UK will continue to lag behind the investment in innovation of most of its major industrialised peers.

    The Autumn Statement was a lost opportunity to implement a major transformation in the economy’s productive potential and capacity for innovation – which are the best ways of ensuring future prosperity for “hard-working families”.


    Investment and productivity


    David Bailey, Professor of Industry, Aston University

    The marked deterioration in growth forecasts and the public finances announced in the Autumn Statement meant that the government didn’t have many options in terms of fiscal giveaways to develop a wide-ranging industrial strategy. But, as Hammond quite rightly stressed, UK productivity lags behind that of other countries.

    To tackle this, he announced a “new national productivity investment fund” worth £23 billion which will focus on innovation and infrastructure. Investment in R&D will also rise by £2 billion a year by 2020. But this is small scale stuff.

    There were some other, good small scale steps like boosting support for exports, more funding for venture capital for growing firms, and £390m for low carbon vehicles and connected cars. But whether Hammond’s pretty modest package stacks up as being able to transform UK productivity is surely questionable.

    Rather than big transformational projects (which Hammond doesn’t anyway have the cash for), he favoured small-scale shovel-ready ones – like an extra £1.1 billion for English transport networks – which can deliver some small but quick economic wins.

    Stephen Roper, Professor of Enterprise and Director of the Enterprise Research Centre, Warwick Business School

    Addressing the UK’s productivity gap moved centre stage today as the chancellor announced a range of investments to make UK businesses more “resilient” and “match fit”.

    The productivity challenge is significant and borrowing to address it is a gamble. As the chancellor indicated, it currently takes an average UK worker five days to generate the same productivity as German employees generate in four. It remains to be seen how effective the chancellor’s package of measures will be in addressing the productivity challenge. Public investment alone is unlikely to close the gap, however. UK firms too will need to significantly increase their investment which is perhaps unlikely given the uncertainties of Brexit.

    Other aspects of the Autumn Statement will be welcomed by business. Confirmation of planned corporation tax reductions to 17%, business rate reductions and rural rate relief are all positive steps. Less welcome in some quarters will be the increase in insurance taxes.

    Reflecting Theresa May’s comments in June on the steps of 10 Downing Street, Hammond also signalled his aim to “build an economy which works for everyone”. Cash for regional transport, city deals that inject cash into regions and £1.8 billion from the Local Growth Fund to boost infrastructure in English regions will help perhaps. But there was little here to directly address issues of inclusivity or boosting growth in the UK’s poorest communities.


    Personal Finances


    Jonquil Lowe, Lecturer in Personal Finance, The Open University

    Just-about-managing households will welcome the further freeze on fuel duty, worth £130 a year to an average car driver. The duty accounts for just under 58p per litre or nearly half of the current pump price. Less welcome is the rise in Insurance Premium Tax (IPT) from the current rate of 10% to 12% from June 2017. IPT is a bit of a stealth tax. Because we buy most insurance only once a year and providers invariably hike up premiums for existing customers at renewal, we tend not to notice this embedded tax. But it’s a great money spinner for the government with each 1 percentage point rise bringing in an extra £400 million a year.

    Given current paltry returns, not-managing-at-all savers will welcome the announcement of a new National Savings & Investments savings bond next year, with an expected return of 2.2% a year (before tax) for a three-year term. However, they must wait until the March 2017 Budget for details. By then, inflation will be on the rise and interest rates may also be picking up – so we wait to see how attractive this bond will really be.




    Peter Taylor-Gooby, Professor of Social Policy, University of Kent

    The chancellor has announced a “cap” on welfare spending. This is an overall limit set for spending on all welfare benefits apart from pensions and those, like Jobseekers’ Allowance, where outside factors such as the state of the jobs market affect demand.

    Comparing the rate of increase allowed for the cap with expectations of economic growth shows how the poor – in work and not in work – will lose out by some 2% a year, with some slackening in the run-up to the next election. This throws into relief the chancellor’s failure to do much to address the continuing impact of the legacy of Osborne’s cuts.

    Expected GDP and Welfare Cap Growth, 2016-2021. Peter Taylor-Gooby

    The 4% increase in the national minimum/living wage is helpful to low-wage workers. So too is the 2% slackening of the taper on the Universal Credit work allowance (which means that the low-paid keep 2p more of any extra pound that they earn). Together, this may halve the likely cumulative cut of about 4% in living standards for this group over the life of the parliament.

    But these changes are incapable of addressing the serious problems of rising inequality and poverty resulting from the benefit freeze, tightening of eligibility for child benefit, removal of the family element and all the other cuts.




    Michael White, Director, Real Estate Economics and Investment Research Group, Nottingham Trent University

    The chancellor has announced a £2.3 billion housing infrastructure fund to build up to 100,000 new homes in high-demand areas. London is to receive funding for 90,000 affordable homes, and 40,000 new affordable homes are planned in England. These measures have to be welcomed, along with the banning of letting agents’ fees.

    But the chancellor’s plans to pilot a right-to-buy scheme for housing association tenants are worrying. This would reduce the supply of more affordable homes.

    It is particularly at the lower end of the market that housing problems are most acute. Hence developing more affordable housing is essential if housing need and housing affordability are to be addressed.

    The chancellor’s announcements are far from enough. Even if the new proposed units are delivered it will not be enough to meet demand for housing, regardless of tenure type. Experts suggest 240,000 new homes are needed to meet demand annually. This total has never been achieved in the past 30 years, leaving a significant shortfall. Nothing has really managed to replace house building on the scale achieved by councils in the 1960s and 1970s.




    Karen Bloor, Professor of Health Economics and Policy, University of York

    This time last year I expressed concern that the promised spending increases for the NHS could well be inadequate. Much has changed in UK politics in 2016, but the financial landscape for health and social care remains extremely rocky. The Autumn Statement did little to alleviate this – in fact there was barely any mention of NHS finances.

    In the past year, NHS finances have worsened, with Trust deficits over £2.5 billion in 2015-16. This is affecting access to and quality of care: waiting time targets and four-hour A&E standards are increasingly missed.

    Next year, NHS spending will increase in real terms, but in later years it will flat-line. Access to the government’s new “Sustainability and Transformation Fund” for hospitals depends on meeting finance and performance targets, and locally drafted plans to reorganise services are likely to meet with considerable resistance. Restructuring services is extremely challenging, even with increased spending – and it may be impossible without it.

    Outside NHS England, finances are even worse. Public health spending continues to fall, limiting capacity to prevent illness. And strains on social care continue to worsen: the number of older people accessing publicly funded social care has fallen by over a quarter since 2010 as a result of local authority funding constraints. We are failing to care for our frail elderly people outside hospital, increasing pressures on the NHS as a consequence.




    David Metz, Honorary Professor of Transport Studies, University College London

    The chancellor has announced additional spending on road infrastructure of £1.3 billion, intended to cut congestion, tackle pinch-points on major roads and upgrade local roads. The aim is to make journeys quicker and easier for millions of commuters.

    Transport infrastructure investment is the current fashion among all the political parties, who hope it will stimulate a sluggish economy and boost long-term economic growth in the regions. But there is too much wishful thinking about the scope for congestion reduction and the benefits of improving connectivity between cities. We know from experience that we can’t build our way out of congestion: any increase in road capacity in populated parts of the country attracts additional commuting traffic.

    The chancellor has backed a recommendation of the National Infrastructure Commission to fund the next stage of development of an Oxford-Cambridge Expressway, worth £27m. But the commission’s main concern was the need to stimulate housing development if the potential economic benefits along the Oxford-Milton Keynes-Cambridge corridor are to be achieved. To this end, his plans to boost house building in areas of high demand are welcome.


    Science and research


    James Wilsdon, Professor of Research Policy, University of Sheffield

    The confirmation of the £2bn of extra funding for R&D in today’s Autumn Statement is hugely welcome. Experience teaches us that it’s sensible to treat headline numbers like this with caution until there’s opportunity to read the fine print but it seems that this is genuinely new money, which is fantastic news. And it’s a real credit to John Kingman, chairman of UK Research and Innovation, and Jo Johnson, as science minister, that they have secured such a substantial boost from HM Treasury.

    Some important details are still to emerge: how much of the new money will go to the research councils, and how much to the new Industrial Strategy Challenge Fund? How will the new challenges and priorities be defined, and with what mix of government, academic, disciplinary and user input? And will this extra investment be enough to offset the damaging uncertainties being created for the research community by Brexit?

    But overall, this is an unexpected and exciting statement of intent. When the Prime Minister said this money was coming in a speech on November 21, I put some champagne on ice. Tonight, after the Autumn Statement, I’ll be cracking open the bottle.




    Anna Vignoles, Professor of Education, Jesus College, University of Cambridge

    The Autumn Statement confirmed capital funding for new grammar schools, despite the fact that the government is still consulting on this issue. Given the difficult financial situation faced by the education sector, this resource could instead be used to shield existing schools from cuts.

    So far the proposals for the expansion of grammar school places have been met with strong opposition from both the education sector and the wider public. The evidence is quite clear about the potential negative impact the expansion of selective schooling would have on social mobility. Of course, if the consultation does not result in a large number of new grammar schools, this capital funding may still remain available.

    The announcement of funds for research and development, which Theresa May outlined earlier this week, is of course to be welcomed. But given Brexit, and the implications of that for EU-funded research grants, higher education still faces considerable uncertainty about research funding going forward. All in all, difficult times lie ahead for education, as for so many public services.

    The Conversation

    Geraint Johnes, Professor of Economics, Lancaster University; Anna Vignoles, Professor of Education, Jesus College, University of Cambridge; David Bailey, Professor of Industry, Aston University; David Metz, Honorary Professor of Transport Studies, UCL; James Wilsdon, Professor of Research Policy, University of Sheffield; Jonquil Lowe, Senior Lecturer in Economics and Personal Finance, The Open University; Karen Bloor, Professor of Health Economics and Policy, University of York; Michael Kitson, University Senior Lecturer in International Macroeconomics, Cambridge Judge Business School; Michael White, Director, Real Estate Economics and Investment Research Group, Nottingham Trent University; Peter Taylor-Gooby, Professor of Social Policy, University of Kent, and Stephen Roper, Professor of Enterprise and Director of the Enterprise Research Centre, Warwick Business School, University of Warwick

    This article was originally published on The Conversation. Read the original article.

    The Chancellor's Autumn Statement has met with a mixed response from expert academics at some of the country's leading universities - including Cambridge.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    Conductive textile

    Wearable, textiles-based electronics present new possibilities for flexible circuits, healthcare and environment monitoring, energy conversion, and many others. Now, researchers at the Cambridge Graphene Centre (CGC) at the University of Cambridge, working in collaboration with scientists at Jiangnan University, China, have devised a method for depositing graphene-based inks onto cotton to produce a conductive textile. The work, published in the journal Carbon, demonstrates a wearable motion sensor based on the conductive cotton.

    Cotton fabric is among the most widespread for use in clothing and textiles, as it is breathable and comfortable to wear, as well as being durable to washing. These properties also make it an excellent choice for textile electronics. A new process, developed by Dr Felice Torrisi at the CGC, and his collaborators, is a low-cost, sustainable and environmentally-friendly method for making conductive cotton textiles by impregnating them with a graphene-based conductive ink.

    Based on Dr Torrisi’s work on the formulation of printable graphene inks for flexible electronics, the team created inks of chemically modified graphene flakes that are more adhesive to cotton fibres than unmodified graphene. Heat treatment after depositing the ink on the fabric improves the conductivity of the modified graphene.  The adhesion of the modified graphene to the cotton fibre is similar to the way cotton holds coloured dyes, and allows the fabric to remain conductive after several washes.

    Although numerous researchers around the world have developed wearable sensors, most of the current wearable technologies rely on rigid electronic components mounted on flexible materials such as plastic films or textiles. These offer limited compatibility with the skin in many circumstances, are damaged when washed and are uncomfortable to wear because they are not breathable.

    “Other conductive inks are made from precious metals such as silver, which makes them very expensive to produce and unsustainable, whereas graphene is cheap, environmentally friendly, and chemically compatible with cotton,” explains Dr Torrisi.

    Co-author Professor Chaoxia Wang of Jiangnan University adds: “This method will allow us to put electronic systems directly into clothes. It’s an incredible enabling technology for smart textiles.”

    Electron microscopy image of a conductive graphene/cotton fabric. Credit: Jiesheng Ren

    The work done by Dr Torrisi and Prof Wang, together with students Tian Carey and Jiesheng Ren, opens a number of commercial opportunities for graphene-based inks, ranging from personal health technology and high-performance sportswear to military garments, wearable technology and computing, and fashion.

    “Turning cotton fibres into functional electronic components can open up an entirely new set of applications, from healthcare and wellbeing to the Internet of Things,” says Dr Torrisi. “Thanks to nanotechnology, in the future our clothes could incorporate these textile-based electronics and become interactive.”

    Graphene is carbon in the form of single-atom-thick membranes, and is highly conductive. The group’s work is based on dispersing tiny graphene sheets, each less than one nanometre thick, in water. The individual graphene sheets in suspension are chemically modified to adhere well to the cotton fibres during printing and deposition on the fabric, leading to a thin and uniform conducting network of many graphene sheets. This network of nanometre-scale flakes is the secret to the high sensitivity to strain induced by motion. A simple graphene-coated smart cotton textile used as a wearable strain sensor has been shown to reliably detect up to 500 motion cycles, even after more than 10 washing cycles in a normal washing machine.

    The use of graphene and other related 2D materials (GRMs) inks to create electronic components and devices integrated into fabrics and innovative textiles is at the centre of new technical advances in the smart textiles industry. Dr Torrisi and colleagues at the CGC are also involved in the Graphene Flagship, an EC-funded, pan-European project dedicated to bringing graphene and GRM technologies to commercial applications.

    Graphene and GRMs are changing the science and technology landscape with attractive physical properties for electronics, photonics, sensing, catalysis and energy storage. Graphene’s atomic thickness and excellent electrical and mechanical properties give it significant advantages, allowing deposition of extremely thin, flexible and conductive films on surfaces and, with this new method, also on textiles. This, combined with the environmental compatibility of graphene and its strong adhesion to cotton, makes the graphene-cotton strain sensor ideal for wearable applications.

    The research was supported by a European Research Council Synergy Grant, the International Research Fellowship of the National Natural Science Foundation of China and the Ministry of Science and Technology of China. The technology is being commercialised by Cambridge Enterprise, the University’s commercialisation arm.

    Ren, J. et al. Environmentally-friendly conductive cotton fabric as flexible strain sensor based on hot press reduced graphene oxide. Carbon; 19 Oct 2016; DOI: 10.1016/j.carbon.2016.10.045

    A new method for producing conductive cotton fabrics using graphene-based inks opens up new possibilities for flexible and wearable electronics, without the use of expensive and toxic processing steps.

    Turning cotton fibres into functional electronic components can open up an entirely new set of applications, from healthcare and wellbeing to the Internet of Things
    Felice Torrisi

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    Graphene is a sheet form of carbon that is a single atom thick, which can be produced by successively peeling thin layers off graphite using tape until an individual atomic layer is left. In the ink produced here, powdered graphite is mixed with alcohol then forced at high pressure through micrometre-scale capillaries made of diamond. 

    This was the first time that Macleod, a 32-year-old technician at the Department, had entered the competition. His is one of more than 140 images that showcase the breadth of research taking place there.

    The competition, sponsored by ZEISS, international leaders in the fields of optics and optoelectronics, has been held annually for the last 12 years. The panel of judges included Roberto Cipolla, Professor of Information Engineering, Dr Allan McRobie, Reader in Engineering, Professor David Cardwell, Head of Department, Dr Kenneth Png, Senior Applications Engineer at Carl Zeiss Microscopy and Philip Guildford, Director of Research.

    Second prize went to Toby Call for his photo showing bacteria on a graphene-coated carbon foam anodic surface. The bacteria (shown in red) produce conductive nanowires to connect to the surface. Also captured in the top left of the image is a ciliate (a tiny protozoan), which either feeds on the abundant electricity-producing bacteria or competes with them for resources.

    Simon Stent was awarded third prize for an image showing a 2-km map of a power tunnel network in London. Due for completion in 2018, the network will channel up to six 400 kV electricity cables underground, doubling power capacity to the city. The image was captured and processed by a low-cost robotic device. Each of the 12 columns in the image spans a distance of 170 metres and shows the full 360 degree tunnel circumference unwrapped.
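    The stated figures are consistent: twelve columns, each spanning 170 metres of tunnel, cover the roughly 2-km network. A quick arithmetic check (the numbers are taken from the article above; no pixel resolution is assumed):

    ```python
    # Sanity-check the tunnel-map geometry described in the article:
    # 12 image columns, each spanning 170 m of unwrapped tunnel.
    columns = 12
    span_per_column_m = 170.0

    total_length_m = columns * span_per_column_m
    print(total_length_m)  # → 2040.0, i.e. roughly the 2-km network
    ```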

    Mr Guildford says: "This year’s entries form yet another collection of incredible images that offer us an insight into the varied world of engineering. These photos show how some scientific applications and processes can convey stark beauty. From tiny particles and microscopic images, to sections of tunnel on the Crossrail project in London, these photos represent the full spectrum of engineering."

    Some of the images submitted to the competition are tiny, and can only be viewed properly through a microscope, while others are on a much grander scale. Behind them all lies a passion for the subject matter being studied by the photographer.

    The winning images can be viewed online via the Department's Flickr pages, where they can be accessed alongside dozens of other entries.



    It could be a crystal ball from a mythical age showing the swirling mists of time, but James Macleod’s image, which has won this year’s Department of Engineering Photography Competition, actually shows graphene being processed in alcohol to produce conductive ink.

    These photos show how some scientific applications and processes can convey stark beauty
    Philip Guildford
    Graphene being processed in alcohol to produce conductive ink


