Channel: University of Cambridge - Latest news

Combating cybercrime when there's plenty of phish in the sea


We’ve all received the emails, hundreds, maybe thousands of them. Warnings that our bank account will be closed tomorrow, and we’ve only got to click a link and send credit card information to stop it from happening. Promises of untold riches, and it will only cost a tiny fee to access them. Stories of people in desperate circumstances, who only need some kind soul to go to the nearest Western Union and send a money transfer to save them.

Tricking people into handing over sensitive information such as credit card details – known as ‘phishing’ – is one of the ways criminals scam people online. Most of us think we’re smarter than these scams. Most of us think that we could probably con the con artist if we tried. But we would be wrong.

Across the world, cybercrime is booming. When the UK government included cybercrime in the national crime statistics for the first time in 2015, it doubled the crime rate overnight. Millions of people worldwide are victimised by online scams, whether it’s blocking access to a website, stealing personal or credit card information, or attempting to extort money by remotely holding the contents of a personal computer hostage.

“Since 2005, the police have largely ignored cybercrime,” says Professor Ross Anderson of Cambridge’s Computer Laboratory. “Reported crime fell by as much as a half in some categories. Yet, now that online and electronic fraud are included, the number of reported crimes has more than doubled. Crime was not falling; it was just moving online.”

In 2015, computer scientists, criminologists and legal academics joined forces to form the Cambridge Cybercrime Centre, with funding from the Engineering and Physical Sciences Research Council. Their aim is to help governments, businesses and ordinary users to construct better defences.

To understand how the criminals operate, researchers use machine learning and other techniques to recognise bad websites; to understand which kinds of brands tend to be attacked, and how often; to determine how many criminals are behind an attack by looking at patterns in the creation of fake sites; and to measure how effective the various defence systems are at getting those sites taken down.
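As a toy illustration only (this is not the Centre's actual system, and the features, thresholds and example URLs are all invented for the sketch), phishing classifiers typically start from simple lexical features of a URL before any trained model is applied:

```python
# Toy sketch of lexical phishing-URL features (illustrative only;
# real detection systems use far richer features and trained models).

def url_features(url: str) -> dict:
    """Extract a few lexical features commonly used in phishing detection."""
    host = url.split("//")[-1].split("/")[0]
    return {
        "length": len(url),                       # phishing URLs tend to be long
        "num_dots": host.count("."),              # many subdomains look suspicious
        "has_ip": host.replace(".", "").isdigit(),  # raw IP instead of a domain
        "has_at": "@" in url,                     # '@' hides the real host
        "suspicious_words": sum(
            w in url.lower() for w in ("login", "verify", "secure", "account")
        ),
    }

def phishing_score(url: str) -> int:
    """Crude rule-based score; higher means more suspicious."""
    f = url_features(url)
    score = 0
    score += f["length"] > 75
    score += f["num_dots"] > 3
    score += 2 * f["has_ip"]
    score += 2 * f["has_at"]
    score += f["suspicious_words"]
    return score

print(phishing_score("https://www.barclays.co.uk/"))                              # 0
print(phishing_score("http://192.168.0.1/barclays.verify-login.account/update"))  # 5
```

In practice such hand-written rules are only a starting point; a trained classifier learns weights for features like these from labelled examples of good and bad sites.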

One way in which studying cybercrime differs from many other areas of research is that the datasets are difficult to come by. Most belong to private companies, and researchers need to work hard to negotiate access. This is generally done through nondisclosure agreements, even if the data is out of date. And once researchers complete their work, they cannot make the data public, since it would reduce the competitive advantage of corporate players, and it may also make it possible for criminals to reverse engineer what was detected (and what wasn’t) and stay one step ahead of law enforcement.

One of the goals of the Cambridge Cybercrime Centre is to make it easier for cybercrime researchers from around the world to get access to data and share their results with colleagues.

To open up cybercrime research to colleagues across the globe, the team will leverage their existing relationships to collect and store cybercrime datasets. Any bona fide researcher can then sign a licence with the Centre and get to work, without the complexity of identifying and approaching the data holders themselves.

“Right now, getting access to data in this area is incredibly complicated,” says Dr Richard Clayton of Cambridge’s Computer Laboratory, who is also Director of the Centre. “But we think the framework we’ve set up will create a step change in the amount of work in cybercrime that uses real data. More people will be able to do research, and by allowing others to work on the same datasets more people will be able to do reproducible research and compare techniques, which is done extremely rarely at the moment.”

One of the team helping to make this work is Dr Julia Powles, a legal researcher cross-appointed between the Computer Laboratory and Faculty of Law. “There are several hurdles to data sharing,” says Powles. “Part of my job is to identify which ones are legitimate – for example, when there are genuine data protection and privacy concerns, or risks to commercial interests – and to work out when we are just dealing with paper tigers. We are striving to be as clear, principled and creative as possible in ratcheting up research in this essential field.”

Better research will make for better defences for governments, businesses and ordinary users. Today, there are a lot more tools to help users defend themselves against cybercrime – browsers are getting better at recognising bad URLs, for example – but, at the same time, criminals are becoming ever more effective, and more and more people are getting caught in their traps.

“You don’t actually have to be as clever as people once thought in order to fool a user,” says Clayton when explaining how fake bank websites are used to ‘phish’ for user credentials. “It used to be that cybercriminals would register a new domain name, like Barclays with two Ls, for instance. But they generally don’t do that for phishing attacks anymore, as end users aren’t looking at the address bar, they’re looking at whether the page looks right, whether the logos look right.”

The Centre is also looking at issues around what motivates someone to commit cybercrime, and what makes them stop.

According to Dr Alice Hutchings, a criminologist specialising in cybercrime, cybercriminals tend to fall into two main categories. The first category is the opportunistic offender, who may be motivated by a major strain in their lives, such as financial pressures or problems with gambling or addiction, and who uses cybercrime as a way to meet their goals. The second type of offender typically comes from a more stable background, and is gradually exposed to techniques for committing cybercrime through associations with others.

Both groups will usually keep offending as long as cybercrime meets their particular needs, whether it’s financial gratification, or supporting a drug habit, or giving them recognition within their community. What often makes offenders stop is the point at which the costs of continuing outweigh the benefits: for instance, when it takes a toll on their employment, other outside interests or personal relationships.

“Most offenders never get caught, so there’s no reason to think that they won’t go back to cybercrime,” says Hutchings. “They can always start again if circumstances in their lives change.

“There is so much cybercrime happening out there. You can educate potential victims, but there will always be other potential victims, and new ways that criminals can come up with to social engineer somebody’s details, for example. Proactive prevention against potential offenders is a good place to start.”

Criminologist Professor Lawrence Sherman believes the collaboration between security engineering and criminology is long overdue, both at Cambridge and globally: “Cybercrime is the crime of this century, a challenge we are just beginning to understand and challenge with science.”

“We’re extremely grateful to the people giving us this data, who are doing it because they think academic research will make a difference,” says Clayton.  “Our key contribution is realising that there was a roadblock in terms of being able to distribute the data. It’s not that other people couldn’t get the data before, but it was very time-consuming, so only a limited number of people were doing research in this area – we want to change that.”

“Our Cybercrime Centre will not only provide detailed technical information about what’s going on, so that firms can construct better defences,” says Anderson. “It will also provide strategic information, as a basis for making better policy.”

As more and more crime moves online, computer scientists, criminologists and legal academics have joined forces in Cambridge to improve our understanding and responses to cybercrime, helping governments, businesses and ordinary users construct better defences.

You don’t actually have to be as clever as people once thought in order to fool a user
Richard Clayton
TeQi's Graffitti Phish

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: Thirty years on as 'new Cold War' looms, US and Russia should remember the Reykjavik summit


In what looks very like a tit-for-tat downgrading of bilateral relations, Russia and America have traded diplomatic insults in recent weeks over nuclear weapons, geopolitics and economics, prompting speculation about “a new Cold War”.

Moscow acted first, announcing on October 3 that it had suspended its agreement with Washington on the disposal of surplus weapons-grade plutonium. Russian President Vladimir Putin accused the United States of “creating a threat to strategic stability as a result of unfriendly actions towards Russia”. He cited the recent build-up of American forces in Eastern Europe, especially the Baltic states.

For its part, the US suspended talks with Russia over the war in Syria, on top of its existing sanctions against Moscow over Russia’s 2014 military actions in Ukraine.

How to escape from this standoff? Are there any lessons to be learned from the era of détente and the end of the Cold War in the 1970s and 1980s? In particular, about the role of international statecraft and personal dialogue between leaders?

Icelandic freeze

October 2016 marks the 30th anniversary of the summit between Ronald Reagan and Mikhail Gorbachev in Reykjavik, Iceland, which aimed for an agreement on bilateral nuclear arms reductions. At the time the meeting was depicted in the media as a total failure, particularly over Star Wars, the US plan for a sophisticated anti-ballistic missile defence system. “No Deal. Star Wars Sinks the Summit,” Time magazine trumpeted on its cover with a photo of two drained and dejected men, unable to look each other in the eye.

How Time magazine reported the summit failure. (TIME Magazine)

The last session ended in total deadlock between the two leaders – maybe a fateful missed opportunity. “I don’t know when we’ll ever have another chance like this,” Reagan lamented. “I don’t either”, replied Gorbachev. They wondered when – or even if – they would meet again.

This familiar, negative narrative was – and is – shortsighted. In reality, both leaders soon came to a more positive view of the summit. Far from being a “failure”, Gorbachev judged Reykjavik to be “a step in a complicated dialogue, in a search for solutions”. Reagan told the American people: “We are closer than ever before to agreements that could lead to a safer world without nuclear weapons.”

Reagan and Gorbachev had both learned how open discussion between those at the top could cut through much of the red tape and political misunderstanding that ties up international relations. At Reykjavik, even though Star Wars proved a (temporary) stumbling block, both sides agreed that they could and should radically reduce their nuclear arsenals without detriment to national security. And this actually happened, for the first time ever, just a year later when they signed away all their intermediate-range nuclear forces – Soviet SS-20s and US Cruise and Pershing II missiles – in Washington in December 1987.

The treaty testifies to the value of summit meetings that can be part of a process of dialogue that deepens trust on both sides and promotes effective cooperation. Reagan and Gorbachev clicked as human beings at Geneva in 1985, they spoke the unspeakable at Reykjavik in 1986 with talk of a nuclear-free world – and they did the unprecedented in Washington in 1987 by eliminating a whole category of nuclear weapons. All this helped to defuse the Cold War.

It’s good to talk

Today, however, the world seems in turmoil and trust is once again in short supply. We seem to be back to political posturing, megaphone diplomacy and military brinkmanship. Is there any place for summitry in a situation of near-total alienation? This question was, of course, at the heart of the easing of hostilities in the 1970s, when East and West tried to thaw relations and find ways of living together peacefully.

Helmut Schmidt, West Germany’s “global chancellor” of the 1970s, was a great practitioner of what he called “Dialogpolitik”. He argued that leaders must always try to put themselves in the other person’s shoes in order to understand their perspective on the world, especially at times of tension. He favoured informal summit meetings as a way to exchange views privately and candidly, rather than feeding the insatiable media craving to spill secrets and proclaim achievements.

Rapport matters. (EPA/Stephane Mahe)

In the early 1980s, when superpower relations were stuck in a deep freeze, Schmidt conducted shuttle diplomacy as the self-styled “double-interpreter” between Washington and Moscow. Even when no real deals were in the offing, he believed it particularly vital to keep talking.

The German chancellor, Angela Merkel, recently revived Schmidt’s approach, emphasising the need to maintain lines of communication with the Kremlin at a time of renewed East-West tension. Equally, however, she has insisted on the importance of a strong defence capability. Merkel is surely right. There is always a delicate balance to be struck between the diplomacy of dialogue and the politics of deterrence – making up your mind when to reach out and when to stand firm. Three decades on from Reykjavik, that remains the perennial challenge for those who have the vision, skill and nerve to venture to the summit.

David Reynolds, Professor of International History, Fellow of Christ's College, University of Cambridge and Kristina Spohr, Associate Professor of History, London School of Economics and Political Science

This article was originally published on The Conversation. Read the original article.

David Reynolds (Faculty of History) and Kristina Spohr (London School of Economics and Political Science) discuss current relations between the US & Russia, and whether there are any lessons to be learned from the era of détente and the end of the Cold War in the 1970s and 1980s. 

Reagan Bids Gorbachev Farewell


Kettle’s Yard on the move to celebrate 50th anniversary


Portraits of Place: Works from Kettle’s Yard and Richard Long opens on November 5 at Downing College and brings together paintings, sculptures, collages and works on paper by leading artists who have been inspired to respond to the places in which they have lived and worked. The exhibition runs until January 15, 2017.

The exhibition includes an array of intimate depictions of British landscapes from the Kettle’s Yard Collection, ranging from early paintings of Cumberland and Cornwall by Nicholson and Wallis, to evocative textual and material compositions by Ian Hamilton Finlay. Selected collages, paintings and photographs of rural and urban sites by non-British artists such as Italo Valenti and Frankenthaler are also on display.

The Kettle’s Yard Collection particularly highlights how artists have found new ways to represent their emotional, visual and physical connections to places, just as Kettle’s Yard House has become a poignant site within Cambridge. Portraits of Place also extends beyond Kettle’s Yard’s Collection to include works by British artist Richard Long (b. 1945), lent by the artist, whose works respond to the feelings, scale and textures of places and journeys.

The title Portraits of Place is inspired by John Constable’s East Bergholt, a painting from Downing College which is not usually on public display but is included in this exhibition. The landscape is significant because it portrays Constable’s home village. Other landscapes in the exhibition depict places of significance to the artists.

This is one of ten exhibitions around Cambridge celebrating the 50th anniversary of the gift of Kettle’s Yard to the University of Cambridge; for a full list, see kettlesyard.co.uk/fifty

The exhibition is being held at Downing’s Heong Gallery as Kettle’s Yard is currently closed to carry out a major building project (http://www.kettlesyard.co.uk/about/development-plans/). The Heong Gallery opened in February 2016 as a new gallery for exhibitions of modern and contemporary art. Its design was influenced by the feel and ambience of Kettle’s Yard.

During its closure, Kettle’s Yard is working with a number of partner galleries across the UK to present works from its permanent collection in new contexts. Displays are taking place in Cambridge, Wakefield, Nottingham and other locations.

Opening times for Portraits of Place: Works from Kettle’s Yard and Richard Long are Wednesdays 10am–8pm, weekends and bank holidays 10am–6pm.

Works by some of the leading artists of the 20th and 21st centuries – including Ben Nicholson, Alfred Wallis, LS Lowry and Helen Frankenthaler – are to go on display in Cambridge as Kettle’s Yard celebrates 50 years as part of the University of Cambridge.

Portraits of Place brings together paintings, sculptures, collages and works on paper by leading artists.
Ben Nicholson, 1928 (Banks Head – Cumbrian Landscape).


Cambridge extends world leading role for medical imaging with powerful new brain and body scanners


The equipment, funded by the Medical Research Council (MRC), Wellcome Trust and Cancer Research UK, sits within the newly-refurbished Wolfson Brain Imaging Centre (WBIC), which today celebrates two decades at the forefront of medical imaging.

At the heart of the refurbishment are three cutting-edge scanners, of which only a handful exist at institutions outside Cambridge – and no institution other than the University of Cambridge has all three. These are:

  • a Siemens 7T Terra Magnetic Resonance Imaging (MRI) scanner, which will allow researchers to see detail in the brain as tiny as a grain of sand
  • a GE Healthcare PET/MR scanner that will enable researchers to collect critical data to help understand how cancers grow, spread and respond to treatment, and how dementia progresses
  • a GE Healthcare hyperpolarizer that enables researchers to study real-time metabolism of cancers and other body tissues, including whether a cancer therapy is effective or not

These scanners, together with refurbished PRISMA and Skyra 3T MRI scanners at the WBIC and at the Medical Research Council Cognition and Brain Sciences Unit, will make the Cambridge Biomedical Campus the best-equipped medical imaging centre in Europe.

Professor Ed Bullmore, Co-Chair of Cambridge Neuroscience and Scientific Director of the WBIC, says: “This is an exciting day for us as these new scanners will hopefully provide answers to questions that we have been asking for some time, as well as opening up new areas for us to explore in neuroscience, mental health research and cancer medicine.

“By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them. This will be a powerful research tool and represents a big step in the direction of personalised treatments.”

Dr Rob Buckle, Director of Science Programmes at the MRC, adds: “The MRC is proud to sponsor this exciting suite of new technologies at the University of Cambridge. They will play an important role in advancing our strategy in stratified medicine, ultimately ensuring that the right patient gets the right treatment at the right time.”

 

Slide show: Click on images to expand

7T Magnetic Resonance Imaging (MRI) scanner

The Siemens 7T Terra scanner – which refers to the ultrahigh strength of its magnetic field at 7 Tesla – will allow researchers to study at unprecedented levels of detail the workings of the brain and how it encodes information such as individual memories. Current 3T MRI scanners can image structures 2-3mm in size, whereas the new scanner has a resolution of just 0.5mm, the size of a coarse grain of sand.

“Often, the early stages of diseases of the brain, such as Alzheimer’s and Parkinson’s, occur in very small structures – until now too small for us to see,” explains Professor James Rowe, who will be leading research using the new 7T scanner. “The early seeds of dementia for example, which are often sown in middle age, have until now been hidden to less powerful MRI scanners.”

The scanner will also be able to pick up unique signatures of neurotransmitters in the brain, the chemicals that allow its cells to communicate with each other. Changes in the amount of these neurotransmitters affect how the brain functions and can underpin mental health disorders such as depression and schizophrenia.

“How a patient responds to a particular drug may depend on how much of a particular neurotransmitter is currently present,” says Professor Rowe. “We will be looking at whether this new scanner can help provide this information and so help us tailor treatments to individual patients.”

The scanner will begin operating at the start of December, with research projects lined up to look at dementias caused by changes to the brain almost undetectable by conventional scanners, and to look at how visual and sound information is converted to mental representations in the brain.

PET/MR scanner

The new GE Healthcare PET/MR scanner brings together two existing technologies: positron emission tomography (PET), which enables researchers to visualise cellular activity and metabolism, and magnetic resonance (MR), which is used to image soft tissue for structural and functional details.

Purchased as part of the Dementias Platform UK, a network of imaging centres across the UK, the scanner will enable researchers to simultaneously collect information on physiological and disease-related processes in the body, reducing the need for patients to return for multiple scans. This will be particularly important for dementia patients.

Professor Fiona Gilbert, who will lead research on the PET/MR scanner, explains: “Dementia patients are often frail, which can present challenges when they need separate PET and MR scanners. So, not only will this new scanner provide us with valuable information to help improve understanding and diagnosis of dementia, it will also be much more patient-friendly.”

PET/MR will allow researchers to see early molecular changes in the brain, accurately map them onto structural brain images and follow their progression as disease develops or worsens. This could enable researchers to diagnose dementia before any symptoms have arisen and to understand which treatments may best halt or slow the disease.

As well as being used for dementia research, the scanner will also be applied to cancer research, says Professor Gilbert.

“At the moment, we have to make lots of assumptions about what’s going on in tumour cells. We can take biopsies and look at the different cell types, how aggressive they are, their genetic structure and so on, but we can only guess what’s happening to a tumour at a functional level. Functional information is important for helping us determine how best to treat the cancer – and hence how we can personalise treatment for a particular patient. Using PET/MR, we can get real-time information for that patient’s specific tumour and not have to assume it is behaving in the same way as the last hundred tumours we’ve seen.”

The PET/MR scanner will begin operation at the start of November, when it will initially be used to study oxygen levels and blood flow in the tumours of breast cancer patients and in studies of brain inflammation in patients with Alzheimer’s disease and depression.

Hyperpolarizer

The third new piece of imaging equipment to be installed is a GE Healthcare hyperpolarizer, which is already up and running at the facility.

MRI relies on the interaction of strong magnetic fields with a property of atomic nuclei known as ‘spin’. By looking at how these spins differ in the presence of magnetic field gradients applied across the body, scientists are able to build up three-dimensional images of tissues. The hyperpolarizer boosts the ‘spin’ signal from tracers injected into the tissue, making the MRI measurement much more sensitive and allowing imaging of the biochemistry of the tissue as well as its anatomy.
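The spin-and-field relationship described above can be stated compactly. This is standard physics background rather than material from the article:

```latex
% Larmor relation: nuclear spins precess at an angular frequency
% proportional to the applied magnetic field strength B_0.
\[
  \omega_0 = \gamma B_0
\]
% For protons, \gamma / 2\pi \approx 42.58~\mathrm{MHz/T}, so a 7\,T
% scanner detects proton signals near 42.58 \times 7 \approx 298~\mathrm{MHz}.
% Applying a known field gradient across the body makes this frequency
% vary with position, which is what allows a 3D image to be reconstructed.
```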

“Because of underlying genetic changes in a tumour, not all patients respond in the same way to the same treatment,” explains Professor Kevin Brindle, who leads research using the hyperpolarizer. “Using hyperpolarisation and MRI, we can potentially tell whether a drug is working, from changes in the tumour’s biochemistry, within a few hours of starting treatment. If it’s working you continue, if not you change the treatment.”

The next generation of imaging technology, newly installed at the University of Cambridge, will give researchers an unprecedented view of the human body – in particular of the myriad connections within our brains and of tumours as they grow and respond to treatment – and could pave the way for development of treatments personalised for individual patients.

By bringing together these scanners, the research expertise in Cambridge, and the latest in ‘big data’ informatics, we will be able to do sophisticated analyses that could revolutionise our understanding of the brain – and how mental health disorders and dementias arise – as well as of cancers and how we treat them
Ed Bullmore


Elvis is alive and the Moon landings were faked: the (conspiracy) theory of everything


Elvis is alive, the Moon landings were faked and members of the British Royal Family are shapeshifting lizards.

Not only that: 9/11 was an inside job, governments are deliberately concealing evidence of alien contact, and we are all being controlled by a sinister, shadowy cartel of political, financial and media elites who together form a New World Order.

As a global population we are awash with conspiracy theories. They have permeated every major event, across every level of society; from the French Revolution to the War on Terror. In doing so, they have attracted devotees in their millions; from lone survivalists to presidential nominees such as Donald Trump – who claimed Ted Cruz’s father had links to Lee Harvey Oswald and, by inference, to the murder of President John F. Kennedy.

But what effects do conspiracy theories really have on the public as we go about our day-to-day lives? Are they merely harmless flights of fancy propagated by those existing on the margins of society, or is their reach altogether more sinister? Do runaway conspiracy theories influence politicians, decision-makers and, by extension, the public at large? And what effect has the advent of the internet and mass, instant communication across social media platforms had on the spread of conspiracy theories around the world?

Since 2013, a team of Cambridge researchers and visiting fellows has been examining the theories and beliefs about conspiracies that have become such an enduring feature of modern society. Conspiracy and Democracy: History, Political Theory and Internet Research is a five-year, interdisciplinary research project based at CRASSH (Centre for Research in the Arts, Social Sciences and Humanities) and funded by the Leverhulme Trust.

The project brings together historians, political theorists, philosophers, anthropologists and internet engineers as it seeks to understand what additional factors must be at work for conspiracy theories to enjoy such prevalence in the 21st century.

Professor John Naughton who, along with Professor Sir Richard Evans and Professor David Runciman, is one of the three project directors, explains: “Studying conspiracy theories provides opportunities for understanding how people make sense of the world and how societies function, as well as calling into question our basic trust in democratic societies.

“Our project examines how conspiracies and conspiracy theorising have changed over the centuries and what, if any, is the relationship between them? Have conspiracy theories appeared at particular moments in history, and why?

“We wanted to counter the standard academic narrative that conspiracy theories are beneath contempt. We were anxious to undertake a natural history of theorising, to study it seriously from a 21st-century context.”

Despite the onset of the digital age, Naughton and his colleagues do not believe that the internet has necessarily increased the influence of conspiracy theories on society as a whole. Indeed, research suggests that although the spread of conspiracy theories is often instantaneous in the digital world, so too is the evidence to debunk them.

Likewise, the team’s work so far suggests that online, as in life, we largely surround ourselves with people of like-minded views and opinions, effectively partitioning ourselves from a diversity of world views.

“The internet doesn’t make conspiracy theories more persuasive, it actually seems to compartmentalise people,” adds Naughton. “We more efficiently come into contact with those who hold similar views, but we also mostly end up working in echo chambers. That’s the way the internet works at the moment – especially in social media: you end up somewhere where everyone has the same views.

“The effect is a more concentrated grouping of opinions, and that’s the same for everything else, not just conspiracy theories. I follow 800 people on Twitter. Not one of them celebrated Brexit. I was in an echo chamber.”

Dr Alfred Moore, a postdoctoral researcher on the project, adds: “The question of the effect of the internet is a really interesting one. How far can the emergence and success of today’s populist movements be explained in terms of technological changes and especially social media? My first instinct is to say a little bit, but probably not much.

“Technologies have made it less costly to communicate, which means it’s easier to find, talk to and organise supporters without the financial and organisational resources of political parties. Both Corbyn and Trump make heavy use of social media as an alternative to a supposedly biased ‘mainstream’ media and the influence of their parties. It also demonstrates how the internet can promote polarisation by making it easy for people to find information they agree with and to filter out everything else.”

For those reasons, Naughton and Moore believe that some of the most famous conspiracy theories – such as David Icke’s theories about shapeshifting reptiles or feverish claims about the death of Princess Diana – are not particularly dangerous as they don’t appear to generate tangible actions or outcomes in the real world. In fact, the Conspiracy and Democracy team question whether these silos effectively disable the capacity for many conspiracy theories to take a firm hold in the public consciousness or threaten our democratic processes.

“A lot remains to be done in researching the history, structure and dynamics of conspiracy theories, their relationships with real conspiracies, and the changes they have undergone through time,” adds Evans. “You might think that conspiracy theories cause anxiety and depression among ordinary people, and undermine trust in our political institutions and the people who run them, but there are plenty of other reasons for this lack of trust apart from conspiracy theories.

“The debate goes on, but it’s not a case of conspiracy theories threatening democracies. By themselves, such theories may reinforce political suspicion and prejudice but they’re not the origin of it. On the whole, I think it’s fair to conclude that the scale of the threat is pretty limited.

“Some varieties, like antisemitism, can cause huge damage, but others are pretty harmless. Does it really matter that some people think the moon landings were faked? In the end, few people believe we are ruled by alien lizards.”

As a global population we are awash with conspiracy theories. But what effect do these really have on the public as we go about our day-to-day lives, asks a team of Cambridge researchers.

The internet doesn’t make conspiracy theories more persuasive, it actually seems to compartmentalise people
John Naughton

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: What men would do to fix the workplace equality gap


It is still true that far more men than women have leading roles in many organisations. If you ask women to explain this, as many researchers have, they point to workplace culture as a prime culprit.

Many readers will be familiar with the kinds of experiences that frustrate women’s career progress; things like being interrupted or talked over in meetings. But that is old news. Less well known are more subtle trends such as the tendency of men to be promoted on their potential to take on a new role, while women need experience of the role to get the job. This is often called “ability bias” and often happens simply because men’s networks are stronger. Men tend to gravitate to other men in the informal chats that happen in corridors, coffee shops and after five-a-side football games.

These conversations are where connections are made that open the door to opportunities. Men can also kill careers with misplaced “kindness”. In their concern not to upset a woman, men often give less direct and less useful feedback than they do to men. Over time, the consequence of these behaviours for women is that they feel they do not fit in, that they are not being promoted on merit and, as a result, ambition and confidence can melt away.

Men: the missing ingredient

Lack of gender parity is influenced just as much by what men are doing as what women are doing. And yet male voices have been quiet, or not listened to, while initiatives focus on things like mentoring female employees to be more assertive – in effect “fixing the women” to fit in with how things are done.

Closed networks need to be opened. (maxuser/Shutterstock)

The better news for women is that many men support gender parity at the top of organisations. The business case is well accepted and, for men in early or mid career in particular, gender parity is a moral issue too. Men tell us that it’s not right that their daughters, female partners and friends should have a different experience of the workplace from the men they work alongside.

So it's surprising that very little research has been done to find out what men think of the problems with workplace culture that women report, or to discover what men think can be done about it. This is why Murray Edwards College started the Collaborating with Men research project, working with 40 men in early career, middle management and senior leadership roles in both small businesses and large organisations – and in both the public and private sector.

If individuals are not prepared to change something themselves, then no amount of company policy is going to make a tangible difference. However, given that most men haven’t done much thinking about how their behaviour may negatively affect women’s careers, it follows that it’s not obvious what they can do to help. So the first thing we did was to share a summary of the research from women’s perspective. Many of the participants then discussed this research with their female colleagues. The effect was to make unconscious behaviours visible and to prompt many men to suggest practical changes which could help redress the balance. Let’s pick out some highlights here:

Solutions to improve workplace culture

Just talk – A first step would bring teams together in a facilitated, neutral meeting where evidence on workplace culture can be aired and women can describe issues they think they experience because of their gender. Teams can then discuss solutions with the help of the ideas that came from this research.

Making visible how things get done in practice – Mixed-gender teams should carry out “power audits” after a project is finished, to make visible how and where decisions were made. This will improve how gender-diverse teams work together.

Finding the power source. (Bo Insogna, TheLightningMan.com/Flickr, CC BY-NC-ND)

Building close relationships – A key insight from the research stresses the importance of extending mixed gender networks to make it more likely that a woman comes to mind when an opportunity arises. This might include networking with a social agenda: you might have “Walkabout Wednesdays” when everyone is expected to have coffee with someone new. Networking can also happen as a by-product of something useful: for example, “Take Two” buddy ups when you need cover for two hours while you do something outside work. Or it can be directly related to work, perhaps mixed-gender mentoring to share skills and perspectives on a project.

Bystanders amplify – Not having their comments and ideas heard in meetings is one of women’s biggest complaints. Men can help by amplifying a woman’s contribution when they notice it has not been heard; repeating their colleague’s idea and giving her credit.

Actions from leaders – Individual actions need to be authorised by leaders taking a clear stance. Male role models are needed to transform workplace culture – yet the men who take on this role often face backlash. Leaders can help by rewarding and supporting men who make changes to support gender parity.

Men and women in early career and middle management have a lot to get done. The best solutions will adapt easily into the normal working day. Many of these ideas may seem like small things but, by increments, we think that these small changes could make a big difference to how women feel about work, their access to sponsorship that comes with stronger networks and progress into leadership positions.

Individual action from men to improve workplace culture will work in tandem with other initiatives such as training to recognise unconscious bias, designing systems to remove gender bias from hiring and promotion and introducing flexible working policies that all aim to close the gender equality gap.

Jill Armstrong, Research Associate at Murray Edwards College working on gender equality in careers, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Jill Armstrong (Murray Edwards College) discusses her research on the behaviours and perceptions of men regarding women's workplace experiences. 




Cambridge has waived application fee for graduate students from most African countries


Applause greeted the statement by Professor David Dunne, Director of the Cambridge-Africa Programme, as he confirmed that Cambridge has waived the usual application fee for nationals of many of the world’s least developed countries.

Welcoming participants to the third edition of the Cambridge-Africa Day, he expressed his aspiration that this policy will allow Cambridge to attract increasing numbers of talented graduate students to the various scholarship schemes available for Sub-Saharan students.

“The purpose of the Cambridge-Africa Programme is to make Cambridge’s support available to African researchers working on African problems, allowing them to build capacity in their home universities.” The fee waiver, he added, is a “major contribution” to that effort.

In his opening remarks, Professor Sir Leszek Borysiewicz, Vice-Chancellor of the University of Cambridge, asked: “Why should academic institutions get involved in tackling some of the world’s most insoluble problems?”

“Because we have to decide what our values are, and how they allow us to succeed in our mission. Cambridge’s mission is to contribute to society through education, learning and research. The definition of society has changed over the past 800 years – but today that society is global.”

He added: “Being high on league tables does not make a university global. The real challenge is: what are you giving up? How are you sharing your influence to support other institutions? It’s not about aggrandising oneself, but about aggrandising others.”

The Vice-Chancellor described the capacity building Cambridge-Africa Programme, set up in 2008, as an ambitious, long-term project with sustainability at its heart. “Cambridge-Africa is about planting seeds that allow partner institutions to thrive.”

Addressing a packed auditorium in Emmanuel College’s Queen’s Building, the Vice-Chancellor mentioned the Programme’s partnerships with over 50 African institutions across 23 countries, including its two regional hubs at the University of Ghana, Legon, and at Makerere University, in Uganda.

The Programme has enlisted the expertise and mentorship of a network of over 200 Cambridge collaborators, and to date has supported 70 African post-doctoral researchers and 35 African PhD students.

“The University of Cambridge has invested £4 million in the Cambridge-Africa Programme, and has leveraged that to attract £6.9 million to the University for its mentorship and collaboration initiatives,” he said. “Crucially, this has led to almost £21.9 million in external funding being allocated to our African partners.”

In the day’s first keynote address, Kenyan palaeontologist Professor Richard Leakey said that internationally renowned universities like Cambridge should do more to educate civil servants and policymakers in Africa about the importance of research and education:

“A University like Cambridge can play a much bigger role in Africa by interacting with African government at the highest level.”

He assured the audience of students, academics, administrators, NGO representatives and philanthropists that the University of Cambridge is likely to be identified with the next major breakthrough in our understanding of African history.

“The story of Africa is important for Africa, and Africans. There is a gap of self awareness in terms of who we are. We have an opportunity to see some real cooperation between Cambridge and Kenya.”

The afternoon’s keynote speaker, Her Excellency Mrs Toyin Saraki, made an eloquent plea for closer collaboration between universities and Africa’s maternal and neo-natal health specialists.

Mrs Saraki, founder of the Wellbeing Foundation Africa, a non-profit organisation working with governments and NGOs across Africa to ensure better maternal, new-born and child health, remarked on the difficulties faced by women in her own country, Nigeria – where 14% of women are likely to die from complications of pregnancy and childbirth.

“I started the Wellbeing Foundation out of personal suffering. We’ve moved beyond the suffering to providing the solutions.”

She challenged the audience to consider how academic research can cascade down to impact individuals at a community level: “What we need from our partnership with universities is the evidence that will allow us to advocate for the necessary support to improve maternal health and those providing it.”

The day’s final keynote speaker, Professor Ebenezer Owusu, Vice-Chancellor of the University of Ghana, recalled the collaborative links between his institution and the University of Cambridge going back to 1948, when Cambridge academics helped to found what was then called the University of the Gold Coast.

“Cambridge has long offered opportunities to train Ghana’s human capital and help meet Ghana’s developmental needs,” he said, reflecting on how the Cambridge-Africa Partnership for Research Excellence (CAPREx) has helped build capacity in research management at the University of Ghana.

“There is a shift in the place of Africa in the world today,” he said. “It is amplified by the ‘Africa Rising’ narrative, and by the resoluteness of African growth in the face of an economic downturn. Yet the continent’s higher education institutions have not changed. There is a need for more inclusive partnerships with universities like Cambridge.”

He added: “Africa must lead research initiatives in solving African problems. There is a need for a new type of partnership: equal partnership in the generation of knowledge and creative solutions, not just for Africa but for the world.”

At the end of a day that included discussions on collaborations in African archaeology, conservation, maternal health, plant science, pharmacology, social enterprise and student-led Africa-focussed initiatives, Professor Eilís Ferran, Pro-Vice-Chancellor for International Affairs, summed up the key ideas:

“We’ve learned about the need to study a problem in the population most affected by it. We’ve heard about the multiplying effect of training the key people in any discipline. We’ve considered the challenges of mentoring, and raised the question of whether we are doing enough to equip people to operate in challenging environments. And we are clear about the challenges of new partnerships, and the role of African universities in leading those partnerships.”

Regarding the sustainability of the University of Cambridge’s engagement with Africa, she concluded: “Frankly, it’s here to stay”.

 

Details of the University of Cambridge's Graduate Admissions policy for applicants from Least Developed and Low-Income Countries can be found here.

The University’s policy on graduate admissions was reiterated at the opening of the third Cambridge-Africa Day

The Cambridge-Africa Programme is about planting seeds that allow partner institutions to thrive.
Professor Sir Leszek Borysiewicz



Next-generation smartphone battery inspired by the gut


Researchers have developed a prototype of a next-generation lithium-sulphur battery which takes its inspiration in part from the cells lining the human intestine. The batteries, if commercially developed, would have five times the energy density of the lithium-ion batteries used in smartphones and other electronics.

The new design, by researchers from the University of Cambridge, overcomes one of the key technical problems hindering the commercial development of lithium-sulphur batteries, by preventing the degradation of the battery caused by the loss of material within it. The results are reported in the journal Advanced Functional Materials.

Working with collaborators at the Beijing Institute of Technology, the Cambridge researchers based in Dr Vasant Kumar’s team in the Department of Materials Science and Metallurgy developed and tested a lightweight nanostructured material which resembles villi, the finger-like protrusions which line the small intestine. In the human body, villi are used to absorb the products of digestion and increase the surface area over which this process can take place.

In the new lithium-sulphur battery, a layer of material with a villi-like structure, made from tiny zinc oxide wires, is placed on the surface of one of the battery’s electrodes. This can trap fragments of the active material when they break off, keeping them electrochemically accessible and allowing the material to be reused.

“It’s a tiny thing, this layer, but it’s important,” said study co-author Dr Paul Coxon from Cambridge’s Department of Materials Science and Metallurgy. “This gets us a long way through the bottleneck which is preventing the development of better batteries.”

A typical lithium-ion battery is made of three separate components: an anode (negative electrode), a cathode (positive electrode) and an electrolyte in the middle. The most common materials for the anode and cathode are graphite and lithium cobalt oxide respectively, which both have layered structures. Positively-charged lithium ions move back and forth from the cathode, through the electrolyte and into the anode.

The crystal structure of the electrode materials determines how much energy can be squeezed into the battery. For example, because of the way carbon atoms are arranged, it takes six carbon atoms to hold a single lithium ion, limiting the maximum capacity of the battery.

Sulphur and lithium react differently, via a multi-electron transfer mechanism, meaning that elemental sulphur can offer a much higher theoretical capacity, resulting in a lithium-sulphur battery with much higher energy density. However, when the battery discharges, the lithium and sulphur interact and the ring-like sulphur molecules transform into chain-like structures, known as poly-sulphides. As the battery undergoes several charge-discharge cycles, bits of the poly-sulphides can escape into the electrolyte, so that over time the battery gradually loses active material.
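The capacity gap between the two chemistries can be sketched with a back-of-the-envelope calculation (not from the article): theoretical specific capacity is Q = nF / (3.6 M) mAh/g, where n is the number of electrons transferred per formula unit, F the Faraday constant and M the molar mass in g/mol.

```python
# Rough theoretical-capacity comparison of a graphite anode (LiC6,
# one electron per six carbons) and a sulphur cathode (S + 2Li+ + 2e- -> Li2S,
# two electrons per sulphur atom). Illustrative only.
F = 96485.0  # Faraday constant, C/mol


def capacity_mAh_per_g(n_electrons: float, molar_mass_g: float) -> float:
    """Theoretical specific capacity in mAh per gram (1 mAh = 3.6 C)."""
    return n_electrons * F / (3.6 * molar_mass_g)


graphite = capacity_mAh_per_g(1, 6 * 12.011)  # ~372 mAh/g
sulphur = capacity_mAh_per_g(2, 32.06)        # ~1672 mAh/g
print(f"graphite: {graphite:.0f} mAh/g, sulphur: {sulphur:.0f} mAh/g")
print(f"ratio: {sulphur / graphite:.1f}x")
```

The roughly 4.5-fold gap in electrode capacity is what underlies the "five times the energy density" figure quoted for the cell as a whole.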

The Cambridge researchers have created a functional layer which lies on top of the cathode and fixes the active material to a conductive framework so the active material can be reused. The layer is made up of tiny, one-dimensional zinc oxide nanowires grown on a scaffold. The concept was trialled using commercially-available nickel foam for support. After successful results, the foam was replaced by a lightweight carbon fibre mat to reduce the battery’s overall weight.

“Changing from stiff nickel foam to flexible carbon fibre mat makes the layer mimic the way the small intestine works even further,” said study co-author Dr Yingjun Liu.

This functional layer, like the intestinal villi it resembles, has a very high surface area. The material has a very strong chemical bond with the poly-sulphides, allowing the active material to be used for longer, greatly increasing the lifespan of the battery.

“This is the first time a chemically functional layer with a well-organised nano-architecture has been proposed to trap and reuse the dissolved active materials during battery charging and discharging,” said the study’s lead author Teng Zhao, a PhD student from the Department of Materials Science & Metallurgy. “By taking our inspiration from the natural world, we were able to come up with a solution that we hope will accelerate the development of next-generation batteries.”

For the time being, the device is a proof of principle, so commercially-available lithium-sulphur batteries are still some years away. Additionally, while the number of times the battery can be charged and discharged has been improved, it is still not able to go through as many charge cycles as a lithium-ion battery. However, since a lithium-sulphur battery does not need to be charged as often as a lithium-ion battery, it may be the case that the increase in energy density cancels out the lower total number of charge-discharge cycles.

“This is a way of getting around one of those awkward little problems that affects all of us,” said Coxon. “We’re all tied in to our electronic devices – ultimately, we’re just trying to make those devices work better, hopefully making our lives a little bit nicer.”

Reference:
Teng Zhao et al. ‘Advanced Lithium-Sulfur Batteries Enabled by a Bio-Inspired Polysulfide Adsorptive Brush.’ Advanced Functional Materials (2016). DOI: 10.1002/adfm.201604069

A new prototype of a lithium-sulphur battery – which could have five times the energy density of a typical lithium-ion battery – overcomes one of the key hurdles preventing their commercial development by mimicking the structure of the cells which allow us to absorb nutrients. 

This gets us a long way through the bottleneck which is preventing the development of better batteries.
Paul Coxon
Computer visualisation of villi-like battery material



Research reveals accidental making of ‘Patient Zero’ myth during 1980s AIDS crisis


A new study proves that a flight attendant who became notorious as the human epicentre of the US AIDS crisis of the 1980s – and the first person to be labelled the ‘Patient Zero’ of any epidemic – was simply one of many thousands infected in the years before HIV was recognised.

Research by a historian from the University of Cambridge and the genetic testing of decades-old blood samples by a team of US scientists has demonstrated that Gaétan Dugas, a French-Canadian gay man posthumously blamed by the media for spreading HIV across North America, was not the epidemic’s ‘Patient Zero’.

In fact, work by Dr Richard McKay, a Wellcome Trust Research Fellow from Cambridge’s Department of History and Philosophy of Science, reveals how the very term ‘Patient Zero’ – still used today in press coverage of outbreaks from Ebola to swine flu to describe the first known case – was created inadvertently in the earliest years of investigating AIDS.

Before he died, Dugas provided investigators with a significant amount of personal information to assist with studies into whether AIDS was caused by a sexually transmitted agent. McKay’s research suggests that this, combined with confusion between a letter and a number, contributed to the invention of ‘Patient Zero’ and the global defamation of Dugas.     

Dr McKay’s work has added important contextual information to the latest study, led by Dr Michael Worobey from the University of Arizona and published today in the journal Nature, which has compared a new analysis of Dugas’s blood with eight other archived serum samples dating back to the late 1970s.      

“Gaétan Dugas is one of the most demonised patients in history, and one of a long line of individuals and groups vilified in the belief that they somehow fuelled epidemics with malicious intent,” says McKay.  

While his wider research traces this impulse to blame back several centuries, for the Nature paper McKay located the immediate roots of the term “Patient Zero” in an early ‘cluster study’ of US AIDS patients.

Mistaken for zero

Reports emerged in early 1982 of historical sexual links between several gay men with AIDS in Los Angeles, and investigators from the Centers for Disease Control (CDC) undertook a study to interview these men for the names of their sexual contacts.

They uncovered more links across southern California, but one connection was named several times despite not residing in the state: Case 057, a widely travelled airline employee. Investigators found that his sexual contacts included men in New York City, and some of his sexual partners developed symptoms of AIDS after he did.

CDC investigators employed a coding system to identify the study’s patients, numbering each city’s cases linked to the cluster in the sequence their symptoms appeared (LA 1, LA 2, NY 1, NY 2, etc.). However, within the CDC, Case 057 became known as ‘Out(side)-of-California’ – his new nickname abbreviated with the letter ‘O.’

Because other cases were numbered, it was here that the accidental coining of a new term took place. “Some researchers discussing the investigation began interpreting the ambiguous oval as a digit, and referring to Patient O as Patient 0,” says McKay. “‘Zero’ is a capacious word. It can mean nothing. But it can also mean the absolute beginning.”
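The coding scheme, and how easily the letter slipped into a digit, can be sketched in a few lines (the case list here is invented for illustration):

```python
# Illustrative sketch of the CDC-style cluster coding described above:
# each city's cases are numbered in the order their symptoms appeared,
# while the out-of-California case is tagged with the letter "O".
cluster = ["LA 1", "LA 2", "LA 3", "NY 1", "NY 2", "O"]

# Read in a column of otherwise numeric codes, the oval "O" invites
# misreading as a digit -- turning "Patient O" into "Patient 0".
misread = ["0" if code == "O" else code for code in cluster]
print(misread)  # ['LA 1', 'LA 2', 'LA 3', 'NY 1', 'NY 2', '0']
```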

The LA study expanded, due in no small part to information provided by Case 057. Over 65% of men in the cluster reported more than 1,000 partners in their lifetimes, and over 75% reported more than 50 in the previous year. But most could offer only a handful of their partners’ names.

As well as donating plasma for analysis, Case 057 managed to provide 72 names of the roughly 750 partners he’d had in the previous three years. Also, his distinctive name may have been easier for other men to remember, says McKay. “The fact that Dugas provided the most names, and had a more memorable name himself, likely contributed to his perceived centrality in this sexual network.”

By the time the expanded study was published in 1984, the same year Dugas died of his illness, the cluster showed dozens of cases connecting several North American cities. Near the very centre of an accompanying diagram is a floating case that links both coasts, the itinerant Dugas. Case 057, the ‘Out-of-California’ case, had been rechristened simply as “Patient 0” – causing much speculation in the media.

‘Casting’ an epidemic

The journalist Randy Shilts would use the LA cluster study as an important thread in his bestselling book on the AIDS crisis, And the Band Played On. During the book’s research, he became fascinated by the study’s ‘Patient 0’.

Motivated to find out more about this man, Shilts eventually learned his name in 1986. The journalist tracked down his friends and colleagues for interviews, and, as “Patient Zero,” made him one of the more memorable villains in his book.

To call attention to the crisis, Shilts set out to “humanise the disease”, says McKay, who discovered that an early outline for the book actually listed ‘The Epidemic’ itself among the cast of characters. “To Shilts, Dugas as Patient Zero came to represent the disease itself.”

The 1982 study had initially suggested to investigators that the period between infection and the appearance of AIDS symptoms might be several months.

By the time Shilts’s book was published in 1987, however, it was known that an infected individual might not display symptoms for several years, and that the study was unlikely to have revealed a network of infection. Yet Shilts uncritically resurrected the story of the Los Angeles cluster study and its ‘Patient 0,’ with long-standing consequences.

The journalist’s decision provoked immediate criticism from AIDS activists in lesbian and gay communities across North America and the UK. Some of their works of protest are cited in the Nature study, and explored in greater detail in McKay’s own forthcoming book and in a 2014 article he published in the Bulletin of the History of Medicine.

“In many ways, the historical evidence has been pointing to the fallacy of Patient Zero for decades,” explains McKay. “We now have additional phylogenetic evidence that helps to consolidate this position.”

McKay describes the very phrase ‘Patient Zero’ as “infectious.” “Long before the AIDS epidemic there was interest in locating the earliest known cases of disease outbreaks. Yet the phrases ‘first case,’ ‘primary case,’ and ‘index case’ didn’t carry the same punch.

“With the CDC’s accidental coining of this term, and Shilts’s well-honed storytelling instincts, you can see the consolidation of an ‘infectious’ formula that would become central to the way many would make sense of later epidemics.”

Blaming ‘others’

Now, almost 30 years since Shilts’s book, analysis of the HIV-1 genome taken from Dugas’s 1983 blood sample, contextualised through McKay’s historical research, has shown that he was not even a base case for HIV strains at the time, and that a trail of error and hype led to his condemnation as the so-called Patient Zero.

The researchers say it may be naïve to expect Patient Zero’s legendary status, or the popular impulse to attribute blame for disease outbreaks, to ever disappear.

“Blaming ‘others’ – whether the foreign, the poor, or the wicked – has often served to establish a notional safe distance between the majority and groups or individuals identified as threats,” says McKay.

“In many ways, the US AIDS crisis was no different – as the vilification of Patient Zero shows. It is important to remember that, in the 1970s, as now, the epidemic was driven by individuals going about their lives unaware they were contracting, and sometimes transmitting, a deadly infection.

“We hope this research will give researchers, journalists and the public pause before using the term Patient Zero. The phrase carries many meanings and a freighted history, and has seldom pointed to what its users have intended.”

A combination of historical and genetic research reveals the error and hype that led to the coining of the term ‘Patient Zero’ and the blaming of one man for the spread of HIV across North America.

We hope this research will give researchers, journalists and the public pause before using the term Patient Zero
Richard McKay
Harry Reasoner introduces the 60 Minutes program featuring ‘Patient Zero’ and the American AIDS crisis, broadcast on CBS in November 1987.



Self-renewable killer cells could be key to making cancer immunotherapy work


In order to protect us from invading viruses and bacteria, and from internal threats such as malignant tumour cells, our immune system employs an army of specialist immune cells. Just as a conventional army will be made up of different types of soldiers, each with a particular role, so each of these immune cells has a particular function.

Among these cells are cytotoxic T-cells – ‘killer T-cells’, whose primary function is to patrol our bodies, programmed to identify and destroy infected or cancerous cells. Scientists are now trying to harness these cells as a way to fight cancer, by growing T-cells programmed to recognise cancer cells in the laboratory in large numbers and then reintroducing them into the body to destroy the tumour – an approach known as adoptive T-cell immunotherapy.

However, this approach has been hindered by the fact that killer T-cells are short-lived – most are gone within three days of transfer – so the army may die out before it has managed to rid the body of the tumour.

Now, an international team led by researchers at the University of Cambridge has identified a way of increasing the life-span of these T-cells, a discovery that could help scientists overcome one of the key hurdles preventing progress in immunotherapy.

In a paper published today in the journal Nature, the researchers have identified a new role for a molecule known as 2-hydroxyglutarate, or 2-HG, which is known to trigger abnormal growth in tumour cells. In fact, the team has shown that a slightly different form of the molecule also plays a normal, but critical, role in T-cell function: it can influence T-cells to reside in a ‘memory state’. This is a state where the cells can renew themselves, persist for a very long period of time, and re-activate to combat infection or cancer.

The researchers found that by increasing the levels of 2-HG in the T-cells, they could generate cells that destroyed tumours much more effectively. Rather than expiring shortly after reintroduction, the memory-state T-cells were able to persist for much longer.

“In a sense, this means that rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells,” says Professor Randall Johnson, Wellcome Trust Principal Research Fellow at the Department of Physiology, Development & Neuroscience, University of Cambridge.

“So, with a fairly trivial treatment of T-cells, we’re able to change a moderate response to tumour growth to a much stronger response, potentially giving people a more permanent immunity to the tumours they are carrying. This could make immunotherapy for cancer much more effective.”

The research was largely funded by the Wellcome Trust.

Reference
Tyrakis, PA et al. The immunometabolite S-2-hydroxyglutarate regulates CD8+ T-lymphocyte fate; Nature; 26 Oct 2016; DOI: 10.1038/nature2016

A small molecule that can turn short-lived ‘killer T-cells’ into long-lived, renewable cells that can last in the body for a longer period of time, activating when necessary to destroy tumour cells, could help make cell-based immunotherapy a realistic prospect to treat cancer.

Rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells
Randall Johnson
T lymphocyte

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Cause of phantom limb pain in amputees, and potential treatment, identified


Researchers have discovered that a ‘reorganisation’ of the wiring of the brain is the underlying cause of phantom limb pain, which occurs in the vast majority of individuals who have had limbs amputated, and have identified a potential treatment that uses artificial intelligence techniques.

The researchers, led by a group from Osaka University in Japan in collaboration with the University of Cambridge, used a brain-machine interface to train a group of ten individuals to control a robotic arm with their brains. They found that if a patient tried to control the prosthetic by associating the movement with their missing arm, it increased their pain, but training them to associate the movement of the prosthetic with the unaffected hand decreased their pain.

Their results, reported in the journal Nature Communications, demonstrate that in patients with chronic pain associated with amputation or nerve injury, there are ‘crossed wires’ in the part of the brain associated with sensation and movement, and that by mending that disruption, the pain can be treated. The findings could also be applied to those with other forms of chronic pain, including pain due to arthritis.

Approximately 5,000 amputations are carried out in the UK every year, and those with type 1 or type 2 diabetes are at particular risk of needing an amputation. In most cases, individuals who have had a hand or arm amputated, or who have had severe nerve injuries which result in a loss of sensation in their hand, continue to feel the existence of the affected hand as if it were still there. Between 50 and 80 percent of these patients suffer from chronic pain in the ‘phantom’ hand, known as phantom limb pain.

“Even though the hand is gone, people with phantom limb pain still feel like there’s a hand there – it basically feels painful, like a burning or hypersensitive type of pain, and conventional painkillers are ineffective in treating it,” said study co-author Dr Ben Seymour, a neuroscientist based in Cambridge’s Department of Engineering. “We wanted to see if we could come up with an engineering-based treatment as opposed to a drug-based treatment.”

A popular theory of the cause of phantom limb pain is faulty ‘wiring’ of the sensorimotor cortex, the part of the brain that is responsible for processing sensory inputs and executing movements. In other words, there is a mismatch between a movement and the perception of that movement.

In the study, Seymour and his colleagues, led by Takufumi Yanagisawa from Osaka University, used a brain-machine interface to decode the neural activity of the mental action needed for a patient to move their ‘phantom’ hand, and then converted the decoded phantom hand movement into that of a robotic neuroprosthetic using artificial intelligence techniques.
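The decoding step can be sketched, in greatly simplified form, as a classifier that maps a vector of neural features to an intended movement. The features, movement classes and nearest-centroid method below are illustrative assumptions, not the study's actual decoder:

```python
import math

def train_centroids(samples):
    """samples: {movement_class: [feature_vectors]} -> mean vector per class."""
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        centroids[label] = [sum(col) / n for col in zip(*vectors)]
    return centroids

def decode(centroids, features):
    """Return the movement class whose centroid lies nearest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Toy 'recordings': two invented neural features per trial.
training = {
    "grasp": [[0.9, 0.1], [1.1, 0.2]],
    "open":  [[0.1, 0.8], [0.2, 1.0]],
}
centroids = train_centroids(training)
print(decode(centroids, [1.0, 0.15]))  # grasp
```

In the actual study the decoded movement then drove the robotic neuroprosthetic; here the classifier output simply stands in for that command.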

“We found that the better their affected side of the brain got at using the robotic arm, the worse their pain got,” said Yanagisawa. “The movement part of the brain is working fine, but they are not getting sensory feedback – there’s a discrepancy there.”

The researchers then altered their technique to train the ‘wrong’ side of the brain: for example, a patient who was missing their left arm was trained to move the prosthetic arm by decoding movements associated with their right arm, or vice versa. When they were trained in this counter-intuitive technique, the patients found that their pain significantly decreased. Learning to control the arm in this way takes advantage of the plasticity of the sensorimotor cortex – the brain’s ability to restructure and learn new things – demonstrating a clear link between plasticity and pain.

Although the results are promising, Seymour warns that the effects are temporary, and require a large, expensive piece of medical equipment to be effective. However, he believes that a treatment based on their technique could be available within five to ten years. “Ideally, we’d like to see something that people could have at home, or that they could incorporate with physio treatments,” he said. “But the results demonstrate that combining AI techniques with new technologies is a promising avenue for treating pain, and an important area for future UK-Japan research collaboration.”

Reference:
Takufumi Yanagisawa et al. ‘Induced sensorimotor brain plasticity controls pain in phantom limb patients.’ Nature Communications (2016). DOI: 10.1038/ncomms13209

Researchers have identified the cause of chronic, and currently untreatable, pain in those with amputations and severe nerve damage, as well as a potential treatment which relies on engineering instead of drugs.

We wanted to see if we could come up with an engineering-based treatment as opposed to a drug-based treatment.
Ben Seymour
Measurement of brain activity in a patient with phantom limb pain


Top ten universities conduct a third of all UK animal research


The top ten institutions together conduct more than two thirds of all UK university animal research, completing a combined total of 1.37 million procedures in 2015. Over 99% of these procedures were carried out on rodents or fish, and in line with national data they were roughly evenly split between experiments and the breeding of genetically modified animals.

The ten universities are listed below alongside the total number of procedures that they carried out in 2015. Each institution’s name links to a breakdown of their individual animal research statistics.

University of Oxford:             226,214
University of Edinburgh:        212,695
UCL:                                     202,554
University of Cambridge:       181,080
King’s College London:         175,296
University of Manchester:      145,457
Imperial College London:       101,179
University of Glasgow:           49,082
University of Birmingham:      47,657
University of Nottingham:       31,689

The universities employ more than 90,000 staff between them, and as you would expect, the larger institutions tend to conduct the most animal research. All universities are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment and minimising suffering to improve animal welfare. However, as universities expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

“The fact that we perform a significant proportion of the UK’s leading biomedical research is something to be proud of,” says Professor Michael Arthur, UCL President & Provost. “It’s no surprise that the universities who conduct the most world-leading research also use the most animals; despite advances in non-animal techniques, animals offer answers to many research questions that alternative methods cannot yet provide."

All ten universities are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. 107 organisations have signed the concordat including UK universities, charities, research funders and commercial research organisations.

Animal research has played a key role in the development of virtually every medicine that we take for granted today. However, despite decades of dedicated research, many widespread and debilitating conditions are still untreatable. Medical research is a slow process with no easy answers, but animal research helps to take us incrementally closer to treatments for cancer, dementia, stroke and countless other conditions.

While many animal studies do not lead directly to treatments for diseases, ‘basic science’ research helps scientists to understand different processes in the body and how they can go wrong, underpinning future efforts to diagnose and treat various conditions. Additionally, many studies will show that a line of research is not worth pursuing. Although this can be disappointing, such research is incredibly valuable as scientists need to know which methods do not work and why so that they can develop new ones. Animal studies can also help to answer a wide range of research questions that are not directly related to diseases, such as exploring how genes determine traits or how brain functions develop.

About animal research at the University of Cambridge

The ten UK universities that do the most world-leading biomedical research have announced their animal research statistics, revealing that they collectively conducted a third of all UK animal research in 2015.


Potential new treatment for haemophilia developed by Cambridge researchers


Around 400,000 individuals around the world are affected by haemophilia, a genetic disorder that causes uncontrolled bleeding. Haemophilia is the result of a deficiency in proteins required for normal blood clotting – factor VIII for haemophilia A and factor IX for haemophilia B. Currently, the standard treatment is administration of the missing clotting factor. However, this requires regular intravenous injections, is not fully effective, and in about a third of patients results in the development of inhibitory antibodies. Nearly three-quarters of haemophilia sufferers have no access to treatment and have a life expectancy of only 10 years.

In a study published online today in Blood, the Journal of the American Society of Hematology, researchers report on a novel approach that gives the clotting process more time to produce thrombin, the enzyme that forms blood clots.  They suggest this treatment could one day help all patients with haemophilia, including those who develop antibodies against standard therapy. The therapy is based on observations relating to a disorder associated with excessive clotting, known as factor V Leiden.

“We know that patients who have haemophilia and also have mutations that increase clotting, such as factor V Leiden, experience less-severe bleeding,” says study co-author Dr Trevor Baglin, Consultant Haematologist at Addenbrooke’s Hospital, Cambridge University Hospitals.

Dr Baglin and colleagues therefore pursued a strategy of reducing the activity of an anticoagulant enzyme, known as activated protein C (APC). The principal function of APC is to break down the complex that makes thrombin, and the factor V Leiden mutation slows this process. The team, led by Professor Jim Huntington, exploited this insight by developing a specific inhibitor of APC based on a particular type of molecule known as a serpin.

“We hypothesized that if we targeted the protein C pathway we could prolong thrombin production and thereby induce clotting in people with clotting defects, such as haemophilia sufferers,” says Professor Huntington, from the Cambridge Institute for Medical Research at the University of Cambridge. “So, we engineered a serpin that could selectively prevent APC from shutting down thrombin production before the formation of a stable clot.”

To test their theory, the team administered the serpin to mice with haemophilia B and clipped their tails. The researchers found that the amount of blood loss decreased as the dose increased, with the highest dose reducing bleeding to the level found in normal mice. Further studies confirmed that the serpin helped haemophilia mice form stable clots, with higher doses resulting in faster clot formation. The serpin was also able to increase thrombin production and accelerate clot formation when added to blood samples from haemophilia A patients.

“It’s our understanding that because we are targeting a general anti-clotting process, our serpin could effectively treat patients with either haemophilia A or B, including those who develop resistance to more traditional therapy,” adds Professor Huntington. “Additionally, we have focused on engineering the serpin to be long-acting and to be delivered by injection under the skin instead of directly into veins. This will free patients from the inconvenience of having to receive infusions three times a week, as is the case with current treatments.”

The research team hopes that the discovery can be rapidly developed into an approved medicine to provide improved care to haemophilia sufferers around the world.

“Within three years, we hope to be conducting our first-in-man trials of a subcutaneously-administered form of our serpin,” says Dr Baglin. “It is important to remember that the majority of people in the world with haemophilia have no access to therapy. A stable, easily-administered, long-acting, effective drug could bring treatment to a great many more haemophilia sufferers.”

This study forms part of a patent application, filed in the name of Cambridge Enterprise, and the modified serpin is being developed by a start-up company, ApcinteX, with funding from Medicxi.

Adapted from a press release by American Society of Hematology.

Reference
Polderdijk, SGI et al. Design and characterization of an APC-specific serpin for the treatment of haemophilia. Blood; 27 Oct 2016.

A new treatment that might one day help all patients with haemophilia, including those who become resistant to existing therapies, has been developed by researchers at the University of Cambridge.

Within three years, we hope to be conducting our first-in-man trials
Trevor Baglin


Fossilised dinosaur brain tissue identified for the first time


An unassuming brown pebble, found more than a decade ago by a fossil hunter in Sussex, has been confirmed as the first example of fossilised brain tissue from a dinosaur.

The fossil, most likely from a species closely related to Iguanodon, displays distinct similarities to the brains of modern-day crocodiles and birds. Meninges – the tough tissues surrounding the actual brain – as well as tiny capillaries and portions of adjacent cortical tissues have been preserved as mineralised ‘ghosts’.

The results are reported in a Special Publication of the Geological Society of London, published in tribute to Professor Martin Brasier of the University of Oxford, who died in 2014. Brasier and Dr David Norman from the University of Cambridge co-ordinated the research into this particular fossil during the years prior to Brasier’s untimely death in a road traffic accident.

The fossilised brain, found by fossil hunter Jamie Hiscocks near Bexhill in Sussex in 2004, is most likely from a species similar to Iguanodon: a large herbivorous dinosaur that lived during the Early Cretaceous Period, about 133 million years ago.

Finding fossilised soft tissue, especially brain tissue, is very rare, which makes understanding the evolutionary history of such tissue difficult. “The chances of preserving brain tissue are incredibly small, so the discovery of this specimen is astonishing,” said co-author Dr Alex Liu of Cambridge’s Department of Earth Sciences, who was one of Brasier’s PhD students in Oxford at the time that studies of the fossil began.

According to the researchers, the reason this particular piece of brain tissue has been so well-preserved is that the dinosaur’s brain was essentially ‘pickled’ in a highly acidic and low-oxygen body of water – similar to a bog or swamp – shortly after its death. This allowed the soft tissues to become mineralised before they decayed away completely, so that they could be preserved.

“What we think happened is that this particular dinosaur died in or near a body of water, and its head ended up partially buried in the sediment at the bottom,” said Norman. “Since the water had little oxygen and was very acidic, the soft tissues of the brain were likely preserved and cast before the rest of its body was buried in the sediment.”

Working with colleagues from the University of Western Australia, the researchers used scanning electron microscope (SEM) techniques in order to identify the tough membranes, or meninges, that surrounded the brain itself, as well as strands of collagen and blood vessels. Structures that could represent tissues from the brain cortex (its outer layer of neural tissue), interwoven with delicate capillaries, also appear to be present. The structure of the fossilised brain, and in particular that of the meninges, shows similarities with the brains of modern-day descendants of dinosaurs, namely birds and crocodiles.

In typical reptiles, the brain has the shape of a sausage, surrounded by a dense region of blood vessels and thin-walled vascular chambers (sinuses) that serve as a blood drainage system. The brain itself only takes up about half of the space within the cranial cavity.

In contrast, the tissue in the fossilised brain appears to have been pressed directly against the skull, raising the possibility that some dinosaurs had large brains which filled much more of the cranial cavity. However, the researchers caution against drawing any conclusions about the intelligence of dinosaurs from this particular fossil, and say that it is most likely that during death and burial the head of this dinosaur became overturned, so that as the brain decayed, gravity caused it to collapse and become pressed against the bony roof of the cavity.

“As we can’t see the lobes of the brain itself, we can’t say for sure how big this dinosaur’s brain was,” said Norman. “Of course, it’s entirely possible that dinosaurs had bigger brains than we give them credit for, but we can’t tell from this specimen alone. What’s truly remarkable is that conditions were just right in order to allow preservation of the brain tissue – hopefully this is the first of many such discoveries.”

“I have always believed I had something special. I noticed there was something odd about the preservation, and soft tissue preservation did go through my mind. Martin realised its potential significance right at the beginning, but it wasn’t until years later that its true significance came to be realised,” said paper co-author Jamie Hiscocks, the man who discovered the specimen. “In his initial email to me, Martin asked if I’d ever heard of dinosaur brain cells being preserved in the fossil record. I knew exactly what he was getting at. I was amazed to hear this coming from a world renowned expert like him.”

The research was funded in part by the Natural Environment Research Council (NERC) and Christ’s College, Cambridge. 

Reference:
Martin D. Brasier et al. ‘Remarkable preservation of brain tissues in an Early Cretaceous iguanodontian dinosaur.’ Earth System Evolution and Early Life: a Celebration of the Work of Martin Brasier. Geological Society, London, Special Publications, 448 (2016). DOI: 10.1144/SP448.3

Researchers have identified the first known example of fossilised brain tissue in a dinosaur from Sussex. The tissues resemble those seen in modern crocodiles and birds. 

The chances of preserving brain tissue are incredibly small, so the discovery of this specimen is astonishing.
Alex Liu
Image of specimen


Plant ‘thermometer’ triggers springtime budding by measuring night-time heat


An international team of scientists led by the University of Cambridge has discovered the ‘thermometer’ molecule that enables plants to develop according to seasonal temperature changes.

Researchers have revealed that molecules called phytochromes – used by plants to detect light during the day – actually change their function in darkness to become cellular temperature gauges that measure the heat of the night.

The new findings, published today in the journal Science, show that phytochromes control genetic switches in response to temperature as well as light to dictate plant development.    

At night, these molecules change states, and the pace at which they change is “directly proportional to temperature” say scientists, who compare phytochromes to mercury in a thermometer. The warmer it is, the faster the molecular change – stimulating plant growth.

Farmers and gardeners have known for hundreds of years how responsive plants are to temperature: warm winters cause many trees and flowers to bud early, something humans have long used to predict weather and harvest times for the coming year.

The latest research pinpoints for the first time a molecular mechanism in plants that reacts to temperature – often triggering the buds of spring we long to see at the end of winter.   

With weather and temperatures set to become ever more unpredictable due to climate change, researchers say the discovery that this light-sensing molecule moonlights as the internal thermometer in plant cells could help us breed tougher crops.  

“It is estimated that agricultural yields will need to double by 2050, but climate change is a major threat to such targets. Key crops such as wheat and rice are sensitive to high temperatures. Thermal stress reduces crop yields by around 10% for every one degree increase in temperature,” says lead researcher Dr Philip Wigge from Cambridge’s Sainsbury Laboratory.  

“Discovering the molecules that allow plants to sense temperature has the potential to accelerate the breeding of crops resilient to thermal stress and climate change.”
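As a back-of-envelope illustration of the yield figure Wigge quotes, and assuming the roughly 10% loss per degree compounds (a modelling choice of ours, not stated in the article):

```python
# Hypothetical compounding of the ~10% yield loss per degree of warming.
def yield_after_warming(degrees, loss_per_degree=0.10):
    """Fraction of baseline yield remaining after `degrees` of warming."""
    return (1 - loss_per_degree) ** degrees

for d in (1, 2, 3):
    print(f"+{d} degree(s): {yield_after_warming(d):.0%} of baseline yield")
```

Under this assumption, three degrees of warming would leave about 73% of baseline yield; a linear model would instead give 70%.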

In their active state, phytochrome molecules bind themselves to DNA to restrict plant growth. During the day, sunlight activates the molecules, slowing down growth.

If a plant finds itself in shade, phytochromes are quickly inactivated – enabling it to grow faster to find sunlight again. This is how plants compete to escape each other’s shade. “Light driven changes to phytochrome activity occur very fast, in less than a second,” says Wigge.

At night, however, it’s a different story. Instead of a rapid deactivation following sundown, the molecules gradually change from their active to inactive state. This is called “dark reversion”.  

“Just as mercury rises in a thermometer, the rate at which phytochromes revert to their inactive state during the night is a direct measure of temperature,” says Wigge.

“The lower the temperature, the slower phytochromes revert to inactivity, so the molecules spend more time in their active, growth-suppressing state. This is why plants are slower to grow in winter.

“Warm temperatures accelerate dark reversion, so that phytochromes rapidly reach an inactive state and detach themselves from DNA – allowing genes to be expressed and plant growth to resume.”
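The mechanism Wigge describes can be sketched as first-order decay of the active phytochrome pool, with a decay rate that rises with temperature. All parameter values below are invented for illustration and are not from the paper:

```python
import math

def active_fraction(temp_c, hours, k_ref=0.1, temp_ref=10.0, q10=2.0):
    """Fraction of phytochrome still active after `hours` of darkness.

    k_ref : decay rate (per hour) at the reference temperature (invented)
    q10   : factor by which the rate increases per 10 degrees C rise (invented)
    """
    rate = k_ref * q10 ** ((temp_c - temp_ref) / 10.0)
    return math.exp(-rate * hours)

# A warm night leaves less growth-suppressing active phytochrome
# than a cold night of the same length, so growth resumes sooner.
cold = active_fraction(5.0, 8.0)
warm = active_fraction(20.0, 8.0)
print(f"cold night: {cold:.2f} active, warm night: {warm:.2f} active")
```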

Wigge believes phytochrome thermo-sensing evolved later than light sensing, co-opting the biological network already used for light-based growth during the downtime of night.

Some plants mainly use day-length as an indicator of the season. Other species, such as daffodils, have considerable temperature sensitivity, and can flower months in advance during a warm winter.

In fact, the discovery of the dual role of phytochromes provides the science behind a well-known rhyme long used to predict the coming season: ‘Oak before Ash, we’ll have a splash; Ash before Oak, we’re in for a soak’.

Wigge explains: “Oak trees rely much more on temperature, likely using phytochromes as thermometers to dictate development, whereas Ash trees rely on measuring day length to determine their seasonal timing.

“A warmer spring, and consequently a higher likeliness of a hot summer, will result in Oak leafing before Ash. A cold spring will see the opposite. As the British know only too well, a colder summer is likely to be a rain-soaked one.”

The new findings are the culmination of twelve years of research involving scientists from Germany, Argentina and the US, as well as the Cambridge team. The work was done in a model system, a mustard plant called Arabidopsis, but Wigge says the phytochrome genes necessary for temperature sensing are found in crop plants as well.

“Recent advances in plant genetics now mean that scientists are able to rapidly identify the genes controlling these processes in crop plants, and even alter their activity using precise molecular ‘scalpels’,” adds Wigge.

“Cambridge is uniquely well-positioned to do this kind of research as we have outstanding collaborators nearby who work on more applied aspects of plant biology, and can help us transfer this new knowledge into the field.” 

A photoreceptor molecule in plant cells has been found to moonlight as a thermometer after dark – allowing plants to read seasonal temperature changes. Scientists say the discovery could help breed crops that are more resilient to the temperatures expected to result from climate change.

Discovering the molecules that allow plants to sense temperature has the potential to accelerate the breeding of crops resilient to thermal stress and climate change
Philip Wigge
Daffodil


Facebook updates could provide a window to understanding – and treating – mental health disorders


Over a billion people worldwide use Facebook daily – one in seven of the global population – and social media use is increasing at three times the rate of other internet use. Evidence suggests that 92% of adolescents use the site daily and disclose considerably more about themselves online than offline.

Writing in today’s edition of Lancet Psychiatry, researchers from the University of Cambridge discuss how social networking sites might be harnessed to provide data to help further our understanding of the onset and early years of mental illness.

“Facebook is hugely popular and could provide us with a wealth of data to improve our knowledge of mental health disorders such as depression and schizophrenia,” says Dr Becky Inkster, the study’s lead-author, from the Department of Psychiatry. “Its reach is particularly broad, too, stretching across the digital divide to traditionally hard-to-reach groups including homeless youth, immigrants, people with mental health problems, and seniors.”

Dr Inkster and her colleagues argue that Facebook might be used to help improve the detection of mental health factors. Dr Michal Kosinski, co-author from the Stanford Graduate School of Business, adds that Facebook data tends to be more reliable than offline self-reported information, while still reflecting an individual’s offline behaviours. It also enables researchers to measure content that is difficult to assess offline, such as conversation intensity, and to reach sample sizes previously unobtainable.

Status updates, shares and likes can provide a wealth of information about users, they say. A previous study of 200 US college students over the age of 18 years found that one in four posted status updates showing depressive-like symptoms. By analysing the language, emotions and topics used in status updates, the researchers say that it may be possible to look for symptoms or early signs of mental illness. Even photographs might provide new insights; Facebook is the world’s largest photo sharing website, with some 350 million photos uploaded daily, and automated picture analysis of emotional facial expressions might offer unique representations of offline behaviours.
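As a purely illustrative sketch of the kind of language analysis described above, status updates could be scored against a small keyword lexicon. Real studies use far richer linguistic and emotional features; the lexicon and scoring rule here are invented:

```python
# Invented, minimal keyword lexicons for illustration only.
NEGATIVE = {"hopeless", "worthless", "alone", "exhausted", "empty"}
POSITIVE = {"happy", "excited", "grateful", "great", "fun"}

def mood_score(status):
    """Crude per-update score: positive minus negative keyword hits."""
    words = {w.strip(".,!?").lower() for w in status.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

updates = [
    "Feeling hopeless and alone again",
    "Great day out with friends, so grateful!",
]
print([mood_score(u) for u in updates])  # [-2, 2]
```

A consistently negative trend in such scores is the sort of signal that, with proper ethical safeguards, researchers might investigate as an early indicator.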

Studies have shown that social networks can have both positive and negative effects on users’ emotions. Being ‘unfriended’ can elicit negative emotions, but even an individual’s News Feed, which reports what their friends are up to, can affect their mood: one study found that a reduction in the amount of positive content displayed by friends led to an increase in negative status updates by users, and vice-versa. Other research has shown that some people with mental health disorders report positive experiences of social media, suggesting that Facebook might be harnessed to offer people support. People with schizophrenia and psychosis, for example, have reported that social networking sites helped them socialise and did not worsen their symptoms.

The researchers suggest that the use of therapies based on users’ Facebook pictures and timelines could be trialled as possible ways to use online social networks to support individuals. This might assist with accessing autobiographical memories, which can be impaired in conditions such as depression, and for improving cognition and mood with older patients, similar to offline therapies for early dementia.

“Facebook relationships may help those with reduced self-esteem and provide companionship for individuals who are socially isolated,” says Dr Becky Inkster. “We know that socially isolated adolescents are more likely to suffer from depression and suicidal thoughts, so these online stepping stones could encourage patients to reform offline social connections.”

These online – potentially leading to offline – social connections can provide support for vulnerable individuals such as homeless youth, a population at increased risk of mental health problems. Research has shown that this support is associated with a reduction in their alcohol intake and a decrease in depression-like symptoms. An advantage of social networking sites such as Facebook over virtual patient communities is that people use them naturally in their daily lives, which addresses concerns about the limited duration of participation in virtual communities.

Early detection of digital warning signs could enhance mental health service contact and improve service provision, the researchers say. Facebook already allows users who are worried about a friend’s risk of suicide to report the post, for example. However, the use of social networking sites in the context of mental health and young people raises potential ethical issues. Vulnerable individuals will need to fully understand what participation in psychiatry research and mental health-care practice involves and that consent is monitored throughout the various stages of their illness.

“People are uneasy at the idea of having their social media monitored and their privacy infringed upon, so this is something that will need to be handled carefully,” says co-author Dr David Stillwell from the Cambridge Judge Business School. “To see this, we only have to look at the recent furore that led to the abrupt suspension of the Samaritans’ Radar Twitter app, which with the best of intentions enabled users to monitor their friends’ Twitter activity for suicidal messages.”

Much of this research is still in its infancy and evidence is often anecdotal or insufficient, argue the team. Several issues need addressing, such as whether using social media might interfere with certain illnesses or symptoms more than others – such as digital surveillance-based paranoid themes – and to ensure confidentiality and data protection rights for vulnerable people. But they are optimistic about its potential uses.

“Although it isn’t clear yet how social networking sites might best be used to improve mental health care, they hold considerable promise for having profound implications that could revolutionise mental healthcare,” says Dr Inkster.

Reference
Becky Inkster, David Stillwell, Michal Kosinski, Peter Jones. A decade into Facebook: where is psychiatry in the digital age? Lancet Psychiatry; 27 Oct 2016; DOI: 10.1016/S2215-0366(16)30041-4

Our Facebook status updates, ‘likes’ and even photos could help researchers better understand mental health disorders with the right ethical safeguards, argue researchers from the University of Cambridge, who suggest that social networks may even be used in future to provide support and interventions, particularly among young people.

Facebook is hugely popular and could provide us with a wealth of data to improve our knowledge of mental health disorders such as depression and schizophrenia
Becky Inkster
Facebook Like Button

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Opinion: Economics has a serious gender problem – it needs more women


On the eve of the 2008 financial crisis, economists were feeling optimistic. The two-headed beast that had blighted the economy throughout the 1970s and 1980s – inflation combined with unemployment – had been tamed, and the business cycle seemed to be a thing of the past. Economists believed they had developed such a good understanding of the economy that they could keep it on an even keel. The Nobel Prize-winning economist and president of the American Economic Association, Robert Lucas, went as far as to announce that the Great Depression would never happen again.

When the unthinkable happened in 2008, no one was therefore more shocked than economists themselves – and economics has been trying to rebuild itself ever since. Along the way, it has been having to wrestle with two other not entirely unrelated problems: rising inequality and a slowdown in economic growth. If economics is to change for the better and not for the worse, economists need to draw on new ideas and new voices. That must include women.

Economics has a serious sex problem – this is, in my view, one of the prime reasons why it went “off piste” in the first place. Hence my call for a sexual revolution in economics. The presence of leading ladies such as Janet Yellen at the Fed or Christine Lagarde at the IMF masks a deep underlying problem in economics, one which is apparent from the fact that there has only ever been one female Nobel Prize-winning economist.

Whether we are looking at policymakers, academics or economics students, there are many more men than there are women at the helm of the economy. In the UK and US, there are almost three times as many male home students majoring in economics at university as there are female home students. In the UK, the proportion of girls studying for an economics degree has been trending downward, not upward.

Whether an economist is male or female should not, in principle, matter. But given that our society has been one in which the male experience is very different to that of the female, how can a subject dominated by men not implicitly and unknowingly provide us with only half of the story?

While economists like to think of their discipline as being gender neutral, the reality is that economists have looked at the world around them through male eyes – and rather privileged male eyes at that. This male experience has traditionally been one of business and paid work, an experience that leaves family and community to the opposite sex. The interactions between society and the economy are ignored, and the vital role of reproduction, care and nurture – something which is just as important as investment in capital stock – is downplayed. It is, effectively, taken for granted.

Men, after all, have far more experience of investment in plant and machinery than they do of investment in the next generation – or of caring for the previous generation of “producers”. And since traditionally “rationality” has been seen as a male trait and “emotion” as female, economists have long taken the attitude that to incorporate real human characteristics into their way of thinking about the economy would be to make it less rigorous.

False dichotomy

While the economy affects everyone – male or female – the questions economists seek to answer, the tools they use to find an answer, the assumptions they make along the way and the economic phenomena they choose to measure are all dictated by the fact that economics is a discipline dominated by men. In turn, so are the economic policies that affect our daily lives.

Unsurprisingly, economists have placed markets on a pedestal, leaving life outside in the cold – including vital activities without which the economy and society could not function. The “upsides” of state interventions, many of which have a powerful effect on women’s lives, have received little attention relative to the much-trumpeted “downsides”. The welfare state has been demonised and women have suffered the consequences.

With this neglect of our wider lives, economists have typically divided the economy into twin spheres: the state and the market. Any expansion of the former is therefore seen as coming at the cost of the latter. Only by recognising a third sphere, involving life outside of the market and beyond the whims of the state, will we stop seeing the state and the market as if they are in a permanent zero-sum game. By supporting women’s labour force participation through social and welfare policy, the state can, for example, work in support of market activity rather than crowding it out.

His story must include her

In addition to the bias contained within economists’ models of the world, their interpretation of the past – of what has made the Western economy successful – also leaves something to be desired. The story we are typically told is supposedly gender neutral but, when you think about it, it’s very much a male tale – one involving the largely male engineers, inventors, industrialists and scientists of the Industrial Revolution. But history suggests that women’s choices about work, fertility and home were just as important for the rise of the West.

In Britain, women had already begun to enter the workforce hundreds of years before the Industrial Revolution and did not marry until their mid-20s – very different to the situation in many emerging economies today. The result was smaller families – meaning less downward pressure on wages, a greater ability for parents to educate the children they did have and spare resources for families to save for the future. By affecting wages, skills and savings, women’s choices about work and family sowed the longer-term seeds of economic growth.

By ignoring the relevance of gender to economic growth, economists have been blinkered to the potential which female empowerment provides to help resolve today’s pressing economic problems – including in the West. Whether it is a slowdown in growth, deflation, negative interest rates, poor productivity performance, stagnant wages, inequality or political battles about immigration, the problems we currently face are rooted in what I have recently termed for Bloomberg “a global sex problem”.

A lack of female empowerment in poorer countries has resulted in high fertility rates and rapid population growth over the past century. With the onset of globalisation, as rich and poor economies have come into greater contact, this has created significant downward pressure on wage growth in the West. Rising inequality and slow growth have been the inevitable result – as has animosity towards foreigners and to the forces of globalisation.

To my mind, it is not globalisation that is the underlying cause of our problems: it is the lack of freedom for women in poorer countries across the world – including their lack of freedom to take charge of their bodies. Our economic suffering reflects their own sufferings: excessive population growth abroad resulting from women’s lack of freedom hurts wage growth in the West, particularly of less skilled workers. This affects inequality and lowers incentives for businesses to invest.

Unfortunately, the gender problem in economics has meant that the connection between women’s empowerment and current-day economic problems has remained unexplored. Take what is perhaps the most respected book on the challenges facing the Western economy – Secular Stagnation: facts, causes and cures, edited by the economists Coen Teulings and Richard Baldwin. None of the 20 or so contributors was female – gender did not receive a mention. And take Thomas Piketty’s Capital in the Twenty-First Century. Gender hardly features at all. I counted only one mention of it in the text.

In the process of the economy remaking itself, economists need to admit that their discipline has a serious sex problem – one that desperately needs to be addressed if we are to get to grips with the major challenges we face: slow growth, inequality and recurrent crises. By ignoring the problem, or by presuming that it is women who need to change, not the discipline itself, we will be destined to repeat past mistakes. And that will hurt everyone – male or female.

The Conversation

Victoria Bateman, Lecturer and Fellow in Economics, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Victoria Bateman (Lecturer and Fellow in Economics) calls for fresh thinking to prevent the exclusion of women's voices.

The Prime Minister takes part in the opening of the Bank of Latvia and International Monetary Fund conference “Defying the Odds: The Baltic States’ Experience of Economic Recovery”


Protecting our data and identity: how should the law respond?


The freedom of expression and the need for privacy may be strange bedfellows today – but could full-blown estrangement beckon in a digital future that makes the leap from user-controlled content to unfiltered, online sharing of, well, everything?

A future where streaming your life online becomes the norm is not unthinkable, according to Dr David Erdos, whose research in the Faculty of Law explores the nature of data protection. “Take something like Snapchat Spectacles or Google Glass,” he says. “Such technology could very quickly take off, and all of a sudden it becomes ‘normal’ that everyone is recording everything, both audibly and visually, and the data is going everywhere and being used for all sorts of purposes – some individual, some organisational.”

This makes questions about what control we have over our digital footprint rather urgent.

“You can see that we need to get some grip on how the right to privacy can be enforced as technologies continue to develop that can pose serious threats to individuals’ sense of dignity, reputation, privacy and safety,” he adds.

One example of such enforcement Erdos refers to is Google Spain, a 2014 ruling by the Court of Justice of the European Union (CJEU) that examined search engines’ responsibilities when sharing content about us on the world wide web.

The CJEU ruled that people across all 28 EU Member States have a ‘right to be forgotten’ online, giving them the ability to prevent search engines from indexing inadequate, irrelevant or otherwise illegal information about them against their name. This right to be forgotten is based on Europe’s data protection laws and applies to all online information about a living person.

Google responded by publishing a form you can submit to have such links to content (not the actual content) removed. I put it to the test – Google refuses on the basis that web links to my long-closed business are “justified” as they “may be of interest to potential or current consumers”.

Erdos explains that data protection doesn’t always work as it was originally intended to. “On paper, the law is in favour of privacy and the protection of individuals – there are stringent rules around data export, data transparency and sensitive data, for example.

“But that law was in essence developed in the 1970s, when there were few computers. Now we have billions of computers, and the ease of connectivity of smartphones and the internet.  Also, sharing online is not practically constrained by EU boundaries.

“That means the framework is profoundly challenged. There needs to be a more contextual legal approach, where the duties and possibly also the scope take into account risk as well as the other rights and interests that are engaged.  That law must then be effectively enforced.”

In fact, the EU data protection law currently extends surprisingly far. “By default, the law regulates anyone who alone, or jointly with others, does anything with computerised information that mentions a living person,” Erdos explains. “That could include many individuals on social networking sites. If you’re disseminating information about a third party to an indeterminate number of people, you’re (in theory at least) responsible for adherence to this law.”

Tweeters, for instance, may have to respond to requests for data (Tweets) to be rectified for inaccuracy or even removed entirely, and field ‘subject access requests’ for full lists of everything they’ve Tweeted about someone. And under the new General Data Protection Regulation that comes into effect in 2018, the maximum penalty for an infringement is €20 million (or, in the case of companies, up to 4% of annual global turnover).
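The fine ceiling described above is a simple maximum of two quantities: €20 million or 4% of annual worldwide turnover, whichever is greater. A minimal sketch of that rule follows – the function name is illustrative, not from any official source, and the figures are the Regulation’s headline numbers rather than anything specific to the cases discussed here:

```python
# Illustrative sketch of the GDPR administrative-fine ceiling for the most
# serious infringements: the greater of EUR 20 million or 4% of a company's
# annual worldwide turnover. Function name and example figures are mine.

def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound, in euros, of a fine for a serious GDPR infringement."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A company turning over EUR 10 billion faces a ceiling of EUR 400 million,
# which is how fines of hundreds of millions of euros become possible.
print(max_gdpr_fine(10_000_000_000))  # prints 400000000.0
```

For smaller organisations the flat €20 million figure dominates; the 4% rule only bites once turnover exceeds €500 million.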

When it comes to search engines or social media, Erdos admits that a strict application of the law is “not very realistic”. He adds: “There’s a systemic problem in the gap between the law on the books and the law in reality, and the restrictions are not desperately enforced.”

Erdos believes inconsistencies in the law could be exploited online by the ruthless. “The very danger of all-encompassing, stringent laws is that it seems as if responsible organisations and individuals who take them seriously are hamstrung while the irresponsible do whatever they want.”

This also applies to ‘derogations’ – areas where the law instructs a balance must be struck between data protection and the rights to freedom of journalistic, literary and artistic expression.

“Member states have done radically different things in their formal law here – from nothing at all through to providing a blanket exception – neither of which was the intention of the EU scheme.”

As the new law in 2018 will empower regulators to hand out fines of up to hundreds of millions of euros to large multinational companies, Erdos is passionate about the urgency of Europe getting a coordinated and clear approach on how its citizens can exercise their data protection rights.

“We are giving these regulators quite enormous powers to enforce these rules and yet do we have a good understanding of what we want the outcome to be and what we’re expecting individuals and organisations to do?” Erdos ponders.

“To me, this means that the enforcement will become more and more important. Data protection is not just a technical phrase – people really do need protection. The substance of the law needs to be hauled into something that’s more reasonable. That protection needs to be made real.”

Erdos’ research also explores the nature of data protection and academic freedom, and he successfully argued for academic expression to be added to the list of free speech derogations in the 2018 legislation. “I have come across the most egregious examples of research guidance stipulating alleged data protection requirements, including claims that published research can’t include any identifiable personal data at all,” says Erdos.

“In a survey of EU data protection authorities, I asked whether a journalist’s undercover investigation into extremist political beliefs and tactics and an academic’s undercover research into widespread claims of police racism could be legal under data protection. Not one regulator said the activity of the journalist would in principle be illegal, but almost half said the academic’s activity would be unlawful.

 “Academics aim to write something of public importance, and make it rigorous. The old law was seen to prioritise even tittle-tattle in a newspaper over academic research; one hopes this will largely be removed by the new law.”

For many, the greatest concern remains the potential threats to their privacy. In order for consumers to feel safe with emerging technology, law makers may have to legislate for potential breaches now, rather than react after the damage is done.

“We don’t want to respond in a panic of registering or documenting everything, but the alternative of collapse into an ‘anything goes’ situation is equally dangerous.

“Apps like Snapchat show many people value being able to upload certain pictures and information that soon disappear. We don’t want people forgetting what they’re sharing today, and then worrying far too late how third parties are using that information.”

Would Erdos himself ever use Snapchat Spectacles or Google Glass (he does own a smartphone)? He laughs. “Let’s face it, email, the internet, Google search… people ended up having to use them. So, never say never!”

Many of us see our privacy as a basic right. But in the digital world of app-addiction, geolocation tracking and social oversharing, some may have cause to wonder if that right is steadily and sometimes willingly being eroded away.

You can see that we need to get some grip on how the right to privacy can be enforced as technologies continue to develop that can pose serious threats to individuals’ sense of dignity, reputation, privacy and safety
David Erdos
Banksy stencil


Does your empathy predict if you would stop and help an injured person?


A team of psychologists at the University of Cambridge has conducted a social psychology experiment to test the theory that an individual’s level of empathy influences their behaviour. The results of their preliminary study, dubbed “The Trumpington Road Study” and published in the journal Social Neuroscience, suggest that this theory is correct.

In the experiment, one of the team posed as an injured person, sitting on the grass on Trumpington Road, one of the main roads running through Cambridge, next to the Cambridge University Botanic Garden. Next to the ‘injured’ person was his upturned bicycle. Another member of the team was standing innocently across the road, watching to see if anyone was approaching from the side road of Brooklands Avenue.

As soon as a member of the public approached the street corner, alone, and was about to turn into Trumpington Road, he gave a quiet signal to the ‘injured’ person to start rubbing his ankle. The experiment had begun. The researcher across the street then noted if the passer-by stopped to ask the ‘injured’ man if he was OK.

Irrespective of whether passers-by stopped or not, once they had walked further up Trumpington Road, they were intercepted by a third researcher who told them she was conducting a ‘memory’ experiment, inviting them to describe what they had seen along the road in the last few minutes. Various items had been left on the pavement (such as a scarf) to make this a plausible cover story. Those who agreed to take part were also asked to visit a website in their own time, and complete the Empathy Quotient (EQ) and Autism Spectrum Quotient (AQ) questionnaires, and were told they would receive a token payment of £6 for taking part.

As the team predicted, EQ scores were higher in those who had stopped to help the injured cyclist, than in those who walked past him, presumably focused on their own agenda.

The study was led by Richard Bethlehem, a Cambridge PhD student, and Professor Simon Baron-Cohen, Director of the Autism Research Centre at the University of Cambridge. Thirty-seven participants (19 males, 18 females) completed both the EQ and the AQ. They ranged in age from 18 to 77 years old.

Interestingly, the number of autistic traits a person reported was not related to whether they stopped to help, suggesting that empathy, rather than autistic traits, is the key factor. Nor did age predict who stopped. Of those who stopped to help, 80% were female.

Richard Bethlehem said: “Experimental studies are often confined to the lab, which means they lack ‘ecological validity’. In this novel study we tested if empathy scores predict if people will act altruistically in a real-world setting. Our results support the theory that people who do good are, at least partially, driven by empathy.”

Dr Carrie Allison, a member of the team, commented: “How much empathy one has is itself a complex outcome of both biological factors and early upbringing and is a skill that can improve with development, learning, and practice.”

Professor Baron-Cohen, author of Zero Degrees of Empathy and the Chair of Trustees of the Canadian-based charity “Empathy for Peace”, said: “This research is a first step towards understanding why some people may or may not stop to help a person in distress. Studies conducted ‘in the wild’ are notoriously difficult to undertake, and even this small sample was derived from over 1,000 passers-by. We will need to await a larger-scale replication. These results suggest that one factor that predicts which individuals will not stand idly by is how many degrees of empathy they have.”

The study was supported by the Autism Research Trust, the Medical Research Council, the Pinsent Darwin Trust, and the Cambridge Trust, and was conducted in association with the NIHR CLAHRC for Cambridgeshire and Peterborough NHS Foundation Trust.

Reference
Bethlehem, RAI et al. Does empathy predict altruism in the wild? Social Neuroscience; 19 Oct 2016; DOI: 10.1080/17470919.2016.1249944

 

 

If you see an injured person by the side of the road, would you stop and help them, or are you more likely to walk on by? What motivates people to do good in such a situation? 

How much empathy one has is itself a complex outcome of both biological factors and early upbringing and is a skill that can improve with development, learning, and practice
Carrie Allison
'Injured' cyclist on Trumpington Road


Opinion: Brexit and the importance of languages for Britain #5

The Elizabethan teacher and translator John Florio wasn’t the sort of person who sugar-coated his opinions. In 1578, he complained about the Englishmen he saw in the company of foreigners, ‘who can neither speak, nor understand with them, but stands as one mute’ – this poor monoglot Englishman is ‘mocked of them, and despised of all’. ‘What a shame is that?’ asks Florio – ‘what a reproach to his parents? what a loss to him? and what heart’s grief to think thereon?’ Florio’s England is not ours, but his exasperation might sound familiar.
 
When I tell people that I study the history of foreign-language learning in England, they often ask whether the English ever learnt other languages at all. Our idea of the English as a monoglot nation, though, is a modern one – Florio lived in an era long before English was an international lingua franca, when anyone who wanted to trade or to travel had to become a language-learner. Even English merchants, he wrote, had no use for English when they were out of the country: ‘it liketh them not, and they do not speak it’. 
 
Do we, in post-Referendum Britain, have anything to learn from Florio’s England? Well, I've just read a report about sixth-form colleges dropping modern foreign languages – like Florio’s monoglot Englishmen, we risk not stepping up to the challenges of a multilingual world. If we think only English is enough, and we allow foreign languages to wither and die in all but the wealthiest and most exclusive schools, we impoverish ourselves – literally, by restricting our ability to speak and do business with the wider world, and more broadly, by closing our ears and our minds to the languages and literatures of other nations.  
 
The arguments for language-learning after Brexit aren’t simply economic. A few years after Florio’s words were written, two men – the glassmaker Marcantonio Bassano and the weaver Valentine Wood – were walking near Aldgate in London. They passed by a group of soldiers who heard Marcantonio utter some words in Italian, and responded threateningly; they would later claim that they had taken his language for Spanish, and thought that his words were the insults of a Catholic enemy of the English state. A brawl broke out between the parties, and Marcantonio only narrowly escaped with his life.
 
Reading back over the documents of the investigation that followed, it’s hard for me to shake the thought of how people have been assaulted in the street – here, in Britain, in 2016 – for speaking foreign languages. Migrants living in Britain have reported being unwilling to speak their native language when out and about, for fear of drawing the wrong kind of attention. 
 
When, seven years ago, I started to research the history of language-learning in early modern England, I never thought of my work as politically relevant. Today, looking at how the teaching and learning of languages is under threat, how multilingualism can be stigmatised, and how Britain’s place in the world is changing radically – and not necessarily for the better – I’m coming to see how the questions addressed by Florio and his contemporaries were central to a debate about what kind of country their England was. They challenge us, too: as Britain heads into the unknown, do English-speakers want to ‘stand as one mute’, tongue-tied and embarrassed, or to become language-learners again?
 
To say ‘This is England, we speak English’ has always been historically ignorant. What better way to show openness to the world, hope for the future, and solidarity with the people of an international Britain, than to learn to speak, listen, and communicate with a wider world? 
 
 

In the fifth of a new series of comment pieces written by linguists at Cambridge, Dr John Gallagher, historian of early modern Europe, argues that Britain should look to its past to rediscover the importance of language learning.

To say ‘This is England, we speak English’ has always been historically ignorant. What better way to show hope for the future than to learn to communicate with a wider world?
Dr John Gallagher
Who wants to talk?
Studying languages at Cambridge

 

 
Inspiring events for prospective students for these subjects are run by the University and the Cambridge Colleges throughout the year:
 
More information and advice for prospective students and teachers of Modern Languages and Asian and Middle Eastern Studies
 
Upcoming events organised by The University of Cambridge Language Centre are listed here
 
More information about Cambridge's Widening Participation programmes is available here
