Channel: University of Cambridge - Latest news

Co-offenders likely to violently turn on one another, UK crime gang study shows


The first study to take a 'network analysis' approach to patterns of violence within UK organised crime gangs (OCGs) has shown that OCG members who previously offended together are likely to end up attacking one another.

The research also reveals cycles of escalating violence within the criminal milieu of Thames Valley. For example, OCG members who harass other members are far more likely to become victims of violence, primarily from those they harassed. 

Researchers found these 'relational effects' – whether one OCG member has worked with or fallen out with another – to be much stronger predictors of violent crime than traditional ‘rap sheets’: an individual's list of prior offences.

The study, led by the University of Cambridge and using 16 years of data from Thames Valley Police, is published in the Journal of Quantitative Criminology. It marks an initial foray into 'networks of violence' research for the UK.

While network analyses have previously been used to help police some of the most violent cities in the USA, such as Chicago and Boston, this is the first time the technique has been deployed in a less violent European context.   

“Our work shows the importance of taking relationships into account when developing policing risk factors and ‘red flags’,” said Dr Paolo Campana from Cambridge’s Institute of Criminology. “These techniques could help police identify at an earlier stage the social networks set to spiral into violence.”

Within the wider milieu of hardened OCG members and all their known current and former associates, having co-offended – or been suspected of co-offending – with an OCG member increased the odds of becoming a victim of OCG violence 56-fold, typically at the hands of the former partner-in-crime.

Having harassed an OCG member or associate increased the odds of violent victimisation by a factor of 243, while those who had attacked someone in the network were 479 times more likely to become victims of violence themselves.
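As a purely illustrative sketch of what such odds ratios mean: they compare the odds of victimisation between those with and without a given tie. The counts below are invented, and the study's actual estimates come from a statistical network model fitted to the police data, not a simple contingency table.

```python
# Illustrative only: invented counts showing how an odds ratio is computed.
# The study's real figures come from a network model fitted to Thames Valley
# Police data, not from a 2x2 table like this.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """(a/b) / (c/d) for a standard 2x2 contingency table."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical: victimisation among those with vs without a prior
# co-offending tie to an OCG member.
or_cooffending = odds_ratio(28, 50, 10, 1000)
print(f"{or_cooffending:.0f}x higher odds")  # prints "56x higher odds"
```

An odds ratio of 1 would mean the tie makes no difference; the study's headline figures (56, 243, 479) indicate very strong relational effects, with the caveat the researchers themselves note about limited data.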

However, simply having a record of criminal violence, or of hard drugs offences, was found to have no significant effect on the potential for future violence. 

Researchers say that such high odds ratios are due in part to limited data in this early study, but expect to see similarly strong correlations in future research. Campana is working with Cambridgeshire and Merseyside police to build bigger datasets.  

“It often comes down to tit-for-tat retaliation that generates circuits of violence,” said Campana.

“In the Thames Valley data we can see how prior co-offending relationships turn sour and become a mechanism for further violence. Harassment within criminal networks also dramatically increases the potential for violence.”

“Violence is like a virus: it spreads through proximity and familiarity. Those within certain social bubbles are most at risk. In some US cities, co-offending bubbles account for over 80% of the violence,” he said.  

“As we collect more data, we can expect to identify more of the chains and feedback loops that sustain violence and render it endemic within groups and locations.”

The study used anonymised records from Thames Valley Police between 2000 and 2016 to build a network model for organised crime across a population of just over two million, including cities such as Oxford and Reading.

An OCG member is defined as someone working with others to 'commit serious crime on a continuing basis', with elements of planning, structure and coordination.

Campana and his colleague Dr Nynke Niezink from Carnegie Mellon University in the USA analysed a criminal environment of 6,234 individuals, of which 833 were longstanding OCG members: active as part of a gang for two years both before and after their first and last recorded offences.

Overall, belonging to an OCG carried a slightly lower risk of becoming a victim of violence than being part of the wider criminal network, but it increased the risk of being attacked by fellow gang members.

Researchers whittled over 23,000 events down to 156 OCG-instigated violent acts with sufficient data on the connections and criminal histories of the gang members involved.  

Acts included murder and attempted murder, manslaughter, assault, and actual and grievous bodily harm with and without intent. Related incidents of threats and harassment were added to data models in addition to core acts of violence.

The hardened OCG members were overwhelmingly male (93%), and most had been active in drug dealing. Half (51%) had been involved in a violent act, while a quarter (26%) had been a victim of violence.

The few female OCG members were twice as likely as male members to be victims of violence. This was despite researchers removing incidents related to domestic violence. 

Police initially supplied records on all events involving at least one OCG member as offender or victim, along with information on all others connected to the event.

Over the data period, the average size of a crime gang in Thames Valley’s jurisdiction was 5-6 members, with the largest composed of 21 members.

Researchers use over a decade of data from Thames Valley Police to reveal 'mechanisms' that generate and sustain violence within networks of organised crime.

Violence is like a virus, it spreads through proximity and familiarity
Paolo Campana
Arrest warrant executed in West Bromwich, UK

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Cambridge spin-out aiming to make it easier to find and apply regulations


Regulation is critical to the global economy but keeping track of it all has become a major challenge – for both the regulated and the regulators.

RegGenome’s vision is to transform the way the world consumes regulatory information. The company provides structured machine-readable regulatory content that is dynamic, granular, and interoperable – all powered by AI-based textual information extraction techniques. This enables regulatory authorities to share their regulatory information more effectively and empowers organisations to deepen their regulatory intelligence and digitise their compliance and risk management processes.

Robert Wardrop, Management Practice Professor of Finance at Cambridge Judge Business School and Executive Chairman of RegGenome, said: “We are thrilled to be working with a group of investors that share our view that the world is rapidly entering into a period of regulatory uncertainty, requiring interoperable content to power the next generation of regulatory applications for the digital economy.”

Marcio Siqueira, Head of Physical Sciences, Cambridge Enterprise, said: “RegGenome is a superb example of how the University’s transformational IP can be applied for global impact. The company is primed to deliver a quantum leap in the way regulatory content is shared and harnessed. Cambridge Enterprise has been an integral part of RegGenome from the outset, from enabling access to the required technology to investing in its funding. We look forward to seeing RegGenome’s vision come to life.”

The funding round is led by Evolution Equity Partners, with participation from AlbionVC, Cambridge Enterprise, and Mastercard.

Adapted from a news release by Cambridge Enterprise

Photo credit: Joshua Sortino on Unsplash

RegGenome, a commercial spin-out from the University of Cambridge, has announced the completion of a $6 million seed funding round.


Women are ‘running with leaded shoes’ when promoted at work, says study

Businesswoman interacting with colleagues sitting at conference table during meeting in board room

Women and men feel differently at work: moving up the ranks alleviates negative feelings such as frustration less for women than for men, according to a sweeping new study on gender differences in emotion at work.

The study, led by researchers at Yale University and co-authored by Jochen Menges at Cambridge Judge Business School, finds that rank is associated with greater emotional benefits for men than for women, and that women reported greater negative feelings than men across all ranks. 

Because emotions are important for leadership, this puts women at a disadvantage akin to running with ‘leaded shoes’, according to the study, which is based on nearly 15,000 workers in the US.

The results, published in Sex Roles: A Journal of Research, tie the different ways women and men experience emotions at work to underrepresentation at every level of workplace leadership.

Little previous research on gender and workplace emotions 

The study notes that, while the glass ceiling for women has been extensively documented, there has been surprisingly little research on gender differences in emotions at work. Understanding this is particularly important as emotions influence job performance, decision-making, creativity, absence, conflict resolution and leadership effectiveness.

The practical implications of the study are that organisations must provide support to women as they advance, including formal mentoring relationships and networking groups that can provide opportunities to deal with emotions effectively while supporting women as they rise within organisational ranks.

“It would be hard for anyone to break through a glass ceiling when they feel overwhelmed, stressed, less respected and less confident,” said Menges, who teaches at both the University of Zurich and Cambridge Judge Business School.

“This emotional burden may not only hamper promotion opportunities for women, but also prevent them from contributing to an organisation to the best of their ability. More needs to be done to level the playing field when it comes to emotional burdens at work,” said Menges, whose research often focuses on leadership, motivation and other workplace issues.

Women feel more ‘overwhelmed, stressed, frustrated’ at work 

The study finds gender does make a difference for the emotions that employees experience at work. Compared to men, women reported feeling more overwhelmed, stressed, frustrated, tense, and discouraged, and less respected and confident.

Women reported greater negative feelings than men across all ranks. Although these feelings decreased for both men and women as they moved up in rank, the extent to which rank diminished negative feelings differed between the sexes. For instance, moving up rank did alleviate frustration and discouragement in both men and women, but it did so more for men than for women.

The study says that because women experience more negative and fewer positive feelings in climbing the organisational ladder, this puts women at a disadvantage in attaining leadership roles. 

At the lowest levels of employment, women reported feeling significantly more respected than men, yet this reverses as people climb within an organisation, resulting in men feeling significantly more respected than women at higher levels.

The research used data from 14,618 adult US workers (50.7% male, 49.3% female) reflecting a diversity of race, ethnicity and industries, to test the following factors: 

--Differences in the emotions that men and women experience at work. 

--If gender interacts with rank to predict emotions. 

--Whether the association between gender and emotions is mediated by emotional labour demands. 

--If this relationship differs as a function of the proportion of women in an industry or organisational rank. 

Feelings ranging from ‘inspired’ to ‘stressed’ 

Emotions were assessed using two different methods. Participants used a sliding scale to indicate how often they had experienced 23 feelings at work in the previous three months. The items included ten positive emotions such as “interested”, “proud” and “inspired”, and 13 negative responses including “bored”, “stressed” and “envious”. Participants were also asked to report their typical feelings about work in open-ended responses about how their job had made them feel over the past six months.  

In addition, to assess positional power, participants were asked to place themselves on a ladder with ten steps representing where people stand in their organisation.  
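A rough sketch of how such survey measures are typically scored. The item lists and responses below are invented for illustration; the study used 23 items (ten positive, 13 negative) on a sliding scale, plus the ten-step ladder for rank.

```python
# Hypothetical scoring sketch: item names and responses are invented.
# The study's real instrument had 10 positive and 13 negative items
# rated on a sliding scale, plus a 10-step organisational ladder.
from statistics import mean

POSITIVE = ["interested", "proud", "inspired"]  # 3 of the 10 positive items
NEGATIVE = ["bored", "stressed", "envious"]     # 3 of the 13 negative items

def affect_scores(responses):
    """Mean positive and negative affect from 0-100 slider responses."""
    pos = mean(responses[item] for item in POSITIVE)
    neg = mean(responses[item] for item in NEGATIVE)
    return pos, neg

resp = {"interested": 70, "proud": 55, "inspired": 60,
        "bored": 20, "stressed": 65, "envious": 10}
pos, neg = affect_scores(resp)
print(round(pos, 1), round(neg, 1))
```

Averaging items into positive and negative affect scores is what allows the researchers to compare levels across gender and rank.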

Inhibiting negative emotion is not the answer 

The study concludes that simply smothering emotion in the workplace isn’t the answer: inhibiting negative emotions for a prolonged time increases burnout, and negatively impacts performance and personal well-being.

It recognises there are areas of future research which include how gender interacts with other categories of identity, such as race and ethnicity, social class, and sexuality. Women of colour face stronger glass ceiling effects than white women and have to simultaneously navigate bias and discrimination based on their gender and race.

The authors also suggest further investigation to establish whether women’s negative experiences can impose an emotional glass ceiling, because obstacles such as unequal treatment at work cause emotions such as feeling disrespected, which in turn can become an additional barrier to advancement.

Reference:
Christa L. Taylor et al. ‘Gender and Emotions at Work: Organizational Rank Has Greater Emotional Benefits for Men than Women.’ Sex Roles (2022). DOI: 10.1007/s11199-021-01256-z

Adapted from a story on the Cambridge Judge Business School website.

Promotion at work has greater emotional benefit for men than women, says a new study on gender and workplace emotion.

Colleagues sitting at conference table


'Threatening' faces and beefy bodies do not bias criminal suspect identification, study finds


We’re all familiar with the classic “look” of a movie bad guy: peering through narrowing eyes with a sinister sneer (like countless James Bond villains, including Christopher Walken’s memorable Max Zorin in A View to a Kill) or pumped up to cartoon-like dimensions (like the Soviet boxer Drago who growls “I must break you” to Rocky Balboa in Rocky IV). 

Yet a detailed new study of identifying criminal suspects finds, to the authors’ surprise, no bias toward selecting people with threatening facial characteristics or muscular bodies. The study does find, however, that suspects with highly muscled, “threatening” bodies are most accurately identified by eyewitnesses in line-ups. 

‘No systematic bias’ 

“These findings suggest that while no systematic bias exists in the recall of criminal bodies, the nature of the body itself and the context in which it is presented can significantly impact identification accuracy,” says the research published in the journal Memory & Cognition. “Participant identification accuracy was highest for the most threatening body stimuli high in musculature.”  

Eyewitness testimony and the identification of suspects lie at the heart of the criminal justice system. In the absence of incriminating physical evidence, an eyewitness can be crucial in convincing a court of the defendant’s guilt. Previous studies have revealed that identification errors may be due to people finding it hard to recognise unfamiliar faces, as well as height and weight frequently being underestimated.

Computer-generated images varying in levels of threat 

“Misidentification of innocent defendants plays a significant role in most cases of prisoners later exonerated through DNA evidence,” says study co-author Magda Osman, Head of Research and Analysis at the Centre for Science and Policy, University of Cambridge, which is affiliated with Cambridge Judge Business School.

“Having a stereotypically ‘criminal’ or threatening appearance has long been established to be a disadvantage in the judicial system, both in terms of the likelihood of initially being arrested and in terms of courtroom sentencing,” adds co-author Terence J. McElvaney of the Department of Biological and Experimental Psychology, Queen Mary University of London. “What we wanted to establish through this new research was whether some people are also more likely to be falsely identified as a criminal because they naturally have a more threatening appearance – and, contrary to our expectations, we found that this was not the case.”

In three separate experiments, participants were first presented with either the outline of a violent crime, neutral information, or no background information. They were then shown a realistic computer-generated image of the male suspect (target) and asked to identify him from a selection of images (foils) that varied in facial threat or body muscle. 

“Although this does not match the procedural experience of real eyewitnesses, this allowed us to explore the potential biasing effects of criminal context while maintaining tight control over the stimuli,” the study explains. In some experiments a delay between witnessing the crime and trying to identify the suspect was simulated.  All faces in the dataset were Caucasian and converted to greyscale.  

Three experiments form basis of study 

Around 200 adults living in the UK took part in each of the three experiments: 

Experiment 1 

Participants were divided into two groups, with one group told the person they were about to see was involved in an armed robbery. The other group was told the aim of the experiment was to see how accurately they could identify unfamiliar people. The groups completed 20 trials in total, identifying a different suspect each time from a selection of faces and body shapes with blurred heads. In each case, the target image was shown for one second, followed by a blank screen for one second, followed by the line-up.

Experiment 2 

This experiment introduced a distractor task adding a five-minute delay between participants seeing the target image for 30 seconds and trying to identify it. Participants were divided into three groups. In the crime and neutral groups, they were presented with background information such as a shop robbery resulting in a murder, or someone purchasing a winning lottery ticket. The final group was told to study the person for later identification. Fixation dots and a random noise mark were also added to the start of each trial to break concentration. This time, faces or bodies were shown individually with those taking part responding Yes or No to the question: “Did that face/body EXACTLY match the one you previously studied?”

Experiment 3 

Participants were again provided with a criminal context, neutral context, or no additional information. They were given 30 seconds to study the target, then following a distractor task lasting ten minutes, were asked to identify him from a line-up of bodies only, from which the perpetrator was missing.    

Impact of stereotypes on memory 

The authors expected that if no background context was provided, participants would not show any bias in recalling a body or a face. They hypothesised that more threatening faces and larger bodies would be selected when the perpetrator was presented in a criminal context, rather than in a neutral context, but this did not turn up in the findings. 

Previous research suggests associating someone with a crime can distort their appearance in memory by automatically activating racial stereotypes linked to the crime being committed, such as a Caucasian stereotype being activated for crimes such as identity theft or embezzlement. 

This new research found giving criminal background information about the suspects did not significantly influence participants’ memory. “Participants viewing images of alleged violent criminals were no more likely to overestimate the facial threat or musculature of the target stimuli than those who studied the targets in empty or neutral contexts,” the study says.  

“These results suggest that, although errors of eyewitness identification can or do occur, they may not be driven by systematic biases related to how threatening a criminal is later recalled.” 

The authors identified several limitations in their study. These included the use of computer-generated still images rather than video footage. Although a delay was introduced in the process, it does not reflect the days or weeks experienced by real eyewitnesses, or difficulties presented by lighting or distance.

Crucially, due to the images used, all the conclusions are restricted to Caucasian defendants.

“Although it’s possible participants didn’t perceive the images to be of a particular race because they’re computer generated, further research could use morphing software to produce photo-realistic facial images of different races that vary in perceived threat”, says co-author Isabelle Mareschal, also of the Department of Biological and Experimental Psychology, Queen Mary University of London. 

The study in Memory & Cognition – entitled “Identifying criminals: No biasing effect of criminal context on recalled threat” – is co-authored by Terence J. McElvaney and Isabelle Mareschal, both of the Department of Biological and Experimental Psychology, Queen Mary University of London; and Magda Osman of the Centre for Science and Policy, Cambridge Judge Business School.

Research shows that there is no bias toward selecting people with muscular bodies or facial characteristics perceived as threatening when identifying criminal suspects in line-ups. 

Misidentification of innocent defendants plays a significant role in most cases of prisoners later exonerated through DNA evidence
Magda Osman
Various levels of musculature in Experiment 1. Left-to-right: 0%, 25%, 50%, 75%, 100%. Target stimulus (e.g., 50% musculature) shown in the centre.


Largest study of whole genome sequencing data reveals new clues to causes of cancer

Merkel Cell Carcinoma

In the biggest study of its kind, a team of scientists led by Professor Serena Nik-Zainal from Cambridge University Hospitals (CUH) and the University of Cambridge analysed the complete genetic make-up, or whole-genome sequences (WGS), of more than 12,000 NHS cancer patients.

Because of the vast amount of data provided by whole genome sequencing, the researchers were able to detect patterns in the DNA of cancer, known as ‘mutational signatures’, that provide clues about whether a patient has had a past exposure to environmental causes of cancer such as smoking or UV light, or has internal, cellular malfunctions.

The team were also able to spot 58 new mutational signatures, suggesting that there are additional causes of cancer that we don't yet fully understand. The results are reported in the journal Science.

The genomic data were provided by the 100,000 Genomes Project: an England-wide clinical research initiative to sequence 100,000 whole genomes from around 85,000 patients affected by rare disease or cancer.

“WGS gives us a total picture of all the mutations that have contributed to each person’s cancer,” said first author Dr Andrea Degasperi, from Cambridge’s Department of Oncology. “With thousands of mutations per cancer, we have unprecedented power to look for commonalities and differences across NHS patients, and in doing so we uncovered 58 new mutational signatures and broadened our knowledge of cancer.”

“The reason it is important to identify mutational signatures is because they are like fingerprints at a crime scene - they help to pinpoint cancer culprits,” said Serena Nik-Zainal, from the Department of Medical Genetics and an honorary consultant in clinical genetics at CUH. “Some mutational signatures have clinical or treatment implications – they can highlight abnormalities that may be targeted with specific drugs or may indicate a potential ‘Achilles heel’ in individual cancers.

“We were able to perform a forensic analysis of over 12,000 NHS cancer genomes thanks to the generous contribution of samples from patients and clinicians throughout England.  We have also created FitMS, a computer-based tool to help scientists and clinicians identify old and new mutational signatures in cancer patients, to potentially inform cancer management more effectively.”
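The 'fingerprint' idea can be sketched numerically: a tumour's mutation catalogue is modelled as a weighted mixture of signature profiles, and fitting recovers the weights. The toy below is purely illustrative, with invented four-channel 'signatures' and a hand-solved two-signature least-squares fit; real tools such as FitMS work over 96 trinucleotide substitution contexts and many candidate signatures.

```python
# Illustrative only: invented 4-channel "signatures" and catalogue.
# Real mutational-signature fitting uses 96 trinucleotide contexts.

def fit_two_signatures(catalogue, sig_a, sig_b):
    """Least-squares mixture weights for two signatures (2x2 normal equations)."""
    aa = sum(x * x for x in sig_a)
    bb = sum(x * x for x in sig_b)
    ab = sum(x * y for x, y in zip(sig_a, sig_b))
    ca = sum(x * y for x, y in zip(catalogue, sig_a))
    cb = sum(x * y for x, y in zip(catalogue, sig_b))
    det = aa * bb - ab * ab
    w_a = (ca * bb - cb * ab) / det
    w_b = (cb * aa - ca * ab) / det
    return w_a, w_b

# Invented profiles loosely standing in for "UV-like" and "smoking-like"
# processes, each summing to 1 over four substitution channels.
sig_uv = [0.7, 0.1, 0.1, 0.1]
sig_smoke = [0.1, 0.6, 0.2, 0.1]
catalogue = [150, 130, 60, 40]  # observed mutation counts per channel

w_uv, w_smoke = fit_two_signatures(catalogue, sig_uv, sig_smoke)
print(round(w_uv), round(w_smoke))
```

The recovered weights estimate how many mutations each process contributed, which is what lets a signature act as evidence of a past exposure such as UV light or smoking.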

Michelle Mitchell, chief executive of Cancer Research UK, which funded the research, said:

“This study shows how powerful whole genome sequencing tests can be in giving clues into how the cancer may have developed, how it will behave and what treatment options would work best.  It is fantastic that insight gained through the NHS 100,000 Genomes Project can potentially be used within the NHS to improve the treatment and care for people with cancer.”

Professor Matt Brown, chief scientific officer of Genomics England said:

“Mutational signatures are an example of using the full potential of WGS.  We hope to use the mutational clues seen in this study and apply them back into our patient population, with the ultimate aim of improving diagnosis and management of cancer patients.”

Professor Dame Sue Hill, chief scientific officer for England and Senior Responsible Officer for Genomics in the NHS said:

“The NHS contribution to the 100,000 Genomes Project was vital to this research and highlights how data can transform the care we deliver to patients, which is a cornerstone of the NHS Genomic Medicine Service.”

Reference:
Andrea Degasperi et al. ‘Substitution mutational signatures in whole-genome–sequenced cancers in the UK population.’ Science (2022). DOI: 10.1126/science.abl9283

Adapted from a CUH press release.

DNA analysis of thousands of tumours from NHS patients has found a ‘treasure trove’ of clues about the causes of cancer, with genetic mutations providing a personal history of the damage and repair processes each patient has been through.

The reason it is important to identify mutational signatures is because they are like fingerprints at a crime scene - they help to pinpoint cancer culprits
Serena Nik-Zainal
Merkel Cell Carcinoma


Remote working is a ‘mixed bag’ for employee wellbeing and productivity, study finds

Woman using laptop for team meeting

The shift to remote working for many office-based workers at the start of the pandemic initially led to an increase in productivity, especially by reducing commute times. But a new large-scale study outlines the many ways in which remote working has affected wellbeing and productivity over the past two years, both positively and negatively.

One of the big changes for remote workers was the number and quality of meetings. As outlined in a new article in MIT Sloan Management Review, the study from Cambridge Judge Business School and the Vitality Research Institute, part of the wellness and financial services group Vitality, found that the average number of meetings increased by 7.4% from June 2020 to December 2021.

The study, based on more than 1,000 Vitality employees, also found that people in most departments spent more hours in low-quality meetings – defined as meetings in which participants multitask, are double-booked into competing meetings or tasks, or are accompanied by another person with a similar role.
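That three-part definition translates naturally into a simple per-meeting flag. A minimal sketch, with invented field names (the study derived its indicators from Microsoft Workplace Analytics data, not from records shaped like these):

```python
# Hypothetical sketch of the low-quality-meeting definition quoted above.
# Field names are invented; the study used Microsoft Workplace Analytics.

def is_low_quality(meeting):
    """A meeting is low-quality if participants multitask, are double-booked,
    or are accompanied by another person with a similar role."""
    return (meeting.get("multitasked", False)
            or meeting.get("double_booked", False)
            or meeting.get("redundant_attendee", False))

meetings = [
    {"multitasked": True},
    {"double_booked": False},
    {"redundant_attendee": True, "multitasked": False},
]
n_low = sum(is_low_quality(m) for m in meetings)
print(n_low)  # prints 2
```

Counting hours in such flagged meetings per employee is the kind of aggregate the study then related to wellbeing and productivity outcomes.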

“Low-quality meetings often translate into less productivity and high levels of multitasking can increase stress,” said study co-author Thomas Roulet from Cambridge Judge Business School. 

The study, which looked at employees from four Vitality locations in the UK and across all business units, is based on automated data collection using Microsoft Workplace Analytics complemented by weekly surveys.

The authors focused on five core workplace behaviours that have the most significant impact on a range of wellbeing and work outcomes: collaboration hours (meetings, calls, dealing with emails); low-quality meeting hours; multitasking hours during meetings (including sending emails); ‘focus’ hours (blocks of at least two hours with no meetings); and workweek span (number of hours worked per week).

Work capacity was captured based on four factors: life and work satisfaction, anxiety and stress levels, work energy, and work-life balance.

The relationships emerging from the data are clear: employees were working longer (a higher workweek span), spent time in more low-quality meetings, and had higher levels of multitasking, all of which are associated with worse outcomes, including a decline in work-life balance and quality of work.

More after-hours work predominantly affects one’s sense of work engagement but has no real impact on work productivity and quality. Increased focus hours affect work outcomes but not work engagement.

The authors conclude that the shift over the past two years toward remote or hybrid working has improved wellbeing for some workers but not others, so they caution against a ‘blanket approach’ to workplace rules such as requiring employees to come into the office for a set number of days or under specific conditions.

The research found, for example, that increasing ‘focus’ hours was beneficial to senior employees who may need to concentrate on more complex tasks, but it decreased well-being for junior employees who want more social interactions rather than working in isolation from their team.

The article in MIT Sloan Management Review – entitled “How Shifts in Remote Behavior Affect Employee Well-being” – is co-authored by Shaun Subel, Director at the Vitality Research Institute; Martin Stepanek, Lead Researcher at the Vitality Research Institute; and Thomas Roulet, Associate Professor in Organisational Strategy at Cambridge Judge Business School.

 

Adapted from a story published on the Cambridge Judge Business School website. 

Adapting remote and hybrid work policies to employees’ specific work-life situations can result in increased well-being and productivity, but many employees are stuck in an increasing number of low-quality meetings when working remotely, according to a new study.

Woman using laptop for team meeting

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


First-ever Cambridge Foundation Year offers made to prospective students


The landmark new programme will provide a new route to undergraduate education at Cambridge for around 50 talented individuals every year who have experienced educational and social disadvantage, and demonstrate the potential to succeed in a degree in the arts, humanities, or social sciences. 

The one-year, full-time residential course will welcome its first intake of students to Cambridge for the start of the new academic year, in October 2022. Following a rigorous admissions process, offers have been made to 52 students.

Free and fully funded, the Cambridge Foundation Year is aimed at engaging an entirely new stream of applicants who have been prevented from reaching their full potential by their circumstances. This includes students with experience of the care system, estrangement from parents, low levels of household income, and schools with little history of sending students to highly selective universities. Their selection has taken into account their educational background and contextualised their achievements, recognising that circumstances and opportunity should not be a barrier to future academic success. 

The programme’s engaging and challenging curriculum will prepare students for further study at Cambridge, or another top university.

Typical offers for the Cambridge Foundation Year - which is open to those ordinarily resident in the UK who meet specific eligibility criteria - require 120 UCAS Tariff Points, which is equivalent to BBB at A-Level. The usual Cambridge offer is at least A*AA.
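The tariff arithmetic behind these two offers is straightforward: under the current UCAS tariff, an A-Level A* is worth 56 points, an A 48, a B 40 and a C 32. A minimal sketch:

```python
# UCAS tariff points per A-Level grade (current tariff).
TARIFF = {"A*": 56, "A": 48, "B": 40, "C": 32}

def ucas_points(grades):
    """Total UCAS tariff points for a list of A-Level grades."""
    return sum(TARIFF[g] for g in grades)

print(ucas_points(["B", "B", "B"]))   # 120 -- the Foundation Year offer
print(ucas_points(["A*", "A", "A"]))  # 152 -- the usual Cambridge offer (A*AA)
```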

In total, there were 267 applications to the pilot Foundation Year programme, around 5 applications for every place, which is comparable to the number of applications the University normally receives for undergraduate study (6 applications for every place). Cambridge Foundation Year applicants, including mature students, came from diverse backgrounds and from across the UK. They have received guidance during the process through a University online applicant support programme to help them make the strongest possible application.

A Foundation Year Offer Holder Day will be held in June, giving students an opportunity to find out more about life at Cambridge and visit colleges, and a Residential Pre-Term Induction Week will take place in September.

Dr Alex Pryce, Foundation Year Course Director, said: “This is a big day for those who are receiving their Cambridge Foundation Year offer, and a big day for the University. This is the first time in its history that Cambridge has run a pre-degree foundation year programme, aimed at talented applicants who might not otherwise consider applying to study here, and the number of applications we received shows that it is competitive and that there is a clear appetite for it.

“I’d like to congratulate everyone who has received an offer; we look forward to welcoming our first-ever Cambridge Foundation Year students to Cambridge very soon.” 

Professor Stephen Toope, Vice-Chancellor of the University of Cambridge, said: “The Cambridge Foundation Year offers a fresh approach to widening participation at Cambridge. It is an innovative programme that aims to reach an entirely new field of Cambridge candidates, and to transform lives. After all the planning that has gone into creating the Cambridge Foundation Year, and the hard work of many people across the University and Colleges, I’m delighted that we have reached this important moment.”

A cornerstone gift from philanthropists Christina and Peter Dawson is funding the launch of the programme and full one-year scholarships for all students who are accepted. Students will study at one of the 13 Cambridge colleges participating in the pilot scheme, and will benefit from the community, support and academic stimulation this offers, which is intrinsic to the Cambridge experience. 

As with all courses at Cambridge, there was a rigorous admissions process designed to help admit students who will thrive on the Foundation Year and be able to progress to a degree at Cambridge – including interviews and assessment. Students also have to prove their eligibility to receive the generous scholarship given to all students on the course. 

On successful completion of the programme, Cambridge Foundation Year students will receive a recognised CertHE qualification from the University of Cambridge, and with suitable attainment can progress to degrees in the Arts, Humanities and Social Sciences at Cambridge without the need to apply to the University again in the usual admissions round. Students will also be supported during the programme in finding alternative university places if they do not wish to continue to undergraduate study at Cambridge, or do not meet the required level of attainment.

Along with the Cambridge Foundation Year in Arts, Humanities and Social Sciences, the University last year launched the STEM SMART programme to support hundreds of UK state school students through their maths and science A-levels with enhanced learning, encouragement and mentoring. The two programmes build on widening participation progress made by the University in recent years, including the use of the August Reconsideration Pool to reconsider candidates who exceed expectations in examinations, and the launch of an enhanced bursary scheme.

In 2021, 72% of Cambridge’s new undergraduate students were from state schools and more than a quarter were from the least advantaged backgrounds.
For more information visit: www.foundationyear.cam.ac.uk

More than 50 students from backgrounds of educational disadvantage have been offered a place on the University of Cambridge’s first-ever pre-degree foundation year.

The Cambridge Foundation Year is an innovative programme that aims to reach an entirely new field of Cambridge candidates, and to transform lives.
Professor Stephen Toope, Vice-Chancellor
Current students at the new West Hub, where the Cambridge Foundation Year will be taught


Improved approach to the ‘Travelling Salesperson Problem’ could improve logistics and transport sectors

Courier checking parcel for delivery

A notorious theoretical question that has puzzled researchers for 90 years, the Travelling Salesperson Problem also has real relevance to industry today. Essentially a question about how best to combine a set of tasks so that they can be performed in the fastest and most efficient way, finding good solutions to the problem can greatly help improve sectors such as transport and logistics.

Researchers from the University of Cambridge have developed a hybrid, data-driven approach to the problem that not only produces high-quality solutions, but at a faster rate than other state-of-the-art approaches. Their results are presented this week at the International Conference on Learning Representations.

“The importance of global logistics system was brought home to us during the pandemic,” said Dr Amanda Prorok from Cambridge’s Department of Computer Science and Technology, who led the research. “We’re highly reliant on this kind of infrastructure to be more efficient – and our solution could help with that as it targets both in-warehouse logistics, such as the routing of robots around a warehouse to collect goods for delivery, and those outside it, such as the routing of goods to people.”

The Travelling Salesperson Problem involves a notional delivery driver who must call at a set number of cities – say, 20, 50 or 100 – connected by highways, all in a single journey. The challenge is to find the shortest possible route that calls at each destination once, and to find it quickly.

“There are two key components to the problem. We want to order the stops, and we also want to know the cost, in time or distance, of going from one stop to another in that order,” said Prorok.
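To make the problem concrete, here is a toy baseline (not the paper's method): the greedy nearest-neighbour heuristic, which always travels to the closest unvisited city. The coordinates are invented for illustration.

```python
import math

def nearest_neighbour_tour(cities):
    """Greedy baseline: start at city 0 and always travel to the closest
    unvisited city. Fast, but generally not optimal."""
    unvisited = list(range(1, len(cities)))
    tour = [0]
    while unvisited:
        here = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(cities, tour):
    """Total length of the closed tour, returning to the start."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

cities = [(0, 0), (1, 0), (1, 1), (0, 1)]  # the corners of a unit square
tour = nearest_neighbour_tour(cities)
print(tour, tour_length(cities, tour))  # [0, 1, 2, 3] 4.0
```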

Twenty years ago the route from the warehouse to the destinations might have been fixed in advance. But with today’s availability of real-time traffic information, and the ability to send messages to the driver to add or remove delivery locations on the fly, the route may now change during the journey. But minimising its length or duration still remains key.

There is often a cost attached to waiting for an optimal solution, or a hard deadline by which a decision must be taken. For example, the driver cannot wait for a new solution to be computed – they may miss their deliveries, or the traffic conditions may change again.

And that is why there is a need for general, anytime combinatorial optimisation algorithms that produce high-quality solutions under restricted computation time.

The Cambridge-developed hybrid approach does this by combining a machine learning model that provides information about what the previous best routes have been, and a ‘metaheuristic’ tool that uses this information to assemble the new route.

“We want to find the good solutions faster,” said Ben Hudson, the paper’s first author. “If I’m a driver for a courier firm I have to decide what my next destination is going to be as I’m driving. I can’t afford to wait for a better solution. So that’s why in our research we focused on the trade-off between the computational time needed and the quality of the solution we got.”

To do this, Hudson came up with a Guided Local Search algorithm that could differentiate routes from one city to another that would be costly – in time or distance – from routes that would be less costly to include in the journey. This enabled the researchers to identify high-quality, rather than optimal, solutions quickly.

They did this by using a measure of what they call the ‘global regret’ – the cost of enforcing one decision relative to the cost of an optimal solution – of each city-to-city route in the Guided Local Search algorithm. They used machine learning to come up with an approximation of this ‘regret’.

“We already know the correct solution to a set of these problems,” said Hudson. “So we used some machine learning techniques to try and learn from those solutions. Based on that, we try to learn for a new problem – for a new set of cities in different locations – which paths between the cities are promising.

“When we have this information, it then feeds into the next part of the algorithm – the part that actually draws the routes. It uses that extra information about what the good paths may be to build a good solution much more quickly than it could have done otherwise.”
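The mechanics described above can be sketched as a simplified Guided Local Search: a 2-opt local search minimises an augmented cost in which "bad" edges accumulate penalties, steering the search away from them. In this toy version the penalised edge is chosen by raw length; in the paper, a graph neural network supplies a learned approximation of the 'global regret' instead. All coordinates and parameters are invented for illustration.

```python
import math

def tour_len(cities, tour, penalties, lam):
    """Augmented tour cost: true length plus lam * penalty on each used edge."""
    total = 0.0
    for i in range(len(tour)):
        a, b = tour[i], tour[(i + 1) % len(tour)]
        edge = (min(a, b), max(a, b))
        total += math.dist(cities[a], cities[b]) + lam * penalties.get(edge, 0)
    return total

def two_opt(cities, tour, penalties, lam):
    """Local search: reverse segments while the augmented cost improves."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if (tour_len(cities, cand, penalties, lam)
                        < tour_len(cities, tour, penalties, lam) - 1e-12):
                    tour, improved = cand, True
    return tour

def guided_local_search(cities, iters=20, lam=0.3):
    """Repeatedly run 2-opt, penalising the 'worst' edge of each local optimum
    (highest length / (1 + penalty)), and keep the best true-length tour seen."""
    tour, penalties, best = list(range(len(cities))), {}, None
    for _ in range(iters):
        tour = two_opt(cities, tour, penalties, lam)
        true_len = tour_len(cities, tour, {}, 0.0)
        if best is None or true_len < best[1]:
            best = (tour[:], true_len)
        a, b = max(((tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour))),
                   key=lambda e: math.dist(cities[e[0]], cities[e[1]])
                                 / (1 + penalties.get((min(e), max(e)), 0)))
        edge = (min(a, b), max(a, b))
        penalties[edge] = penalties.get(edge, 0) + 1
    return best

# Four corners of a 2x1 rectangle plus its centre (invented coordinates).
cities = [(0, 0), (2, 0), (2, 1), (0, 1), (1, 0.5)]
tour, length = guided_local_search(cities)
print(round(length, 3))  # 6.236
```

The penalty term plays the role the learned regret plays in the paper: it biases the local search away from edges that look costly relative to how often they have already been discouraged.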

The results they came up with were impressive. Their experiments demonstrated that the hybrid, data-driven approach converges to optimal solutions at a faster rate than three recent learning-based approaches for the Travelling Salesperson Problem.

In particular, when trying to solve the problem when it had a 100-city route, the Cambridge method reduced the mean optimality gap from 1.534% to 0.705%, a two-fold improvement. When generalising from the 20-city problem route to the 100-city problem route, the method reduced the optimality gap from 18.845% to 2.622%, a seven-fold improvement.
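For reference, the 'optimality gap' quoted in these results is simply the percentage by which a tour exceeds the optimal tour length:

```python
def optimality_gap(tour_len, optimal_len):
    """Percentage by which a tour exceeds the optimal tour length."""
    return 100 * (tour_len - optimal_len) / optimal_len

# A 10.15-unit tour against a 10.0-unit optimum leaves a 1.5% gap.
print(round(optimality_gap(10.15, 10.0), 2))  # 1.5
# The reported drop from 18.845% to 2.622% is roughly a seven-fold improvement:
print(round(18.845 / 2.622, 1))  # 7.2
```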

“A lot of logistics companies are using routing methods in real life,” said Hudson. “Our goal with this research is to improve such methods so that they produce better solutions – solutions that result in lower distances being travelled and therefore lower carbon emissions and reduced impact on the environment.”

Amanda Prorok is a Fellow of Pembroke College, Cambridge. 

Reference:
Benjamin Hudson et al. ‘Graph Neural Network Guided Local Search for the Traveling Salesperson Problem.’ Paper presented at the International Conference on Learning Representations: https://iclr.cc/virtual/2022/calendar.

A new approach to solving the Travelling Salesperson Problem – one of the most difficult questions in computer science – significantly outperforms current approaches.

We’re highly reliant on this kind of infrastructure to be more efficient – and our solution could help with that
Amanda Prorok
Courier checking parcel for delivery


Existing infrastructure will be unable to support demand for high-speed internet

copper wires

The researchers, from the University of Cambridge and BT, have established the maximum speed at which data can be transmitted through existing copper cables. This limit would allow for faster internet than the speeds currently achievable using standard infrastructure; however, it will not be able to support high-speed internet in the longer term.

The team found that the ‘twisted pair’ copper cables that reach every house and business in the UK are physically limited in their ability to support higher frequencies, which in turn support higher data rates.

While full-fibre internet is currently available to around one in four households, it is expected to take at least two decades before it reaches every home in the UK. In the meantime, however, existing infrastructure can be improved to temporarily support high-speed internet.

The results, reported in the journal Nature Communications, both establish a physical limit on the UK’s ubiquitous copper cables, and emphasise the importance of immediate investment in future technologies.

The Cambridge-led team used a combination of computer modelling and experiments to determine whether it was possible to get higher speeds out of existing copper infrastructure, and found that the cables can carry a maximum frequency of about 5 GHz – well above the currently used spectrum, which sits below 1 GHz. Above 5 GHz, however, the copper cables start to behave like antennas.

Using this extra bandwidth can push data rates on the copper cables above several Gigabits per second on short ranges, while fibre cables can carry hundreds of Terabits per second or more.
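The rough relationship between usable bandwidth and achievable data rate is captured by the Shannon–Hartley capacity formula, C = B·log2(1 + SNR). The signal-to-noise ratio below is an assumed value for illustration, not a figure from the study:

```python
import math

def shannon_capacity_gbps(bandwidth_hz, snr_db):
    """Shannon-Hartley upper bound on error-free data rate, in Gbit/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

# Hypothetical comparison at an assumed 20 dB signal-to-noise ratio:
# today's sub-1 GHz spectrum vs the ~5 GHz physical limit.
print(round(shannon_capacity_gbps(1e9, 20), 2))  # 6.66
print(round(shannon_capacity_gbps(5e9, 20), 2))  # 33.29
```

In practice the attainable rate is far lower than this bound (real cables add attenuation, crosstalk and radiation losses), but the formula shows why widening the usable spectrum matters so much.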

“Any investment in existing copper infrastructure would only be an interim solution,” said co-author Dr Anas Al Rawi from Cambridge’s Cavendish Laboratory. “Our findings show that eventual migration to optical fibre is inevitable.”

The twisted pair – where two conductors are twisted together to improve immunity against noise and to reduce electromagnetic radiation and interference – was invented by Alexander Graham Bell in 1881. Twisted pair cables replaced grounded lines by the end of the 19th century and have been highly reliable ever since. Today, twisted pair cables are standardised to carry 424 MHz of bandwidth over shorter cable lengths, owing to deeper fibre penetration and advances in digital signal processing.

These cables are now reaching the end of their life as they cannot compete with the speed of fibre-optic cables, but it’s not possible to get rid of all the copper cables due to fibre’s high cost. The fibre network is continuously getting closer to users, but the connection between the fibre network and houses will continue to rely on the existing copper infrastructure. Therefore, it is vital to invest in technologies that can support the fibre networks on the last mile to make the best use of them.

“High-speed internet is a necessity of 21st century life,” said first author Dr Ergin Dinc, who carried out the research while he was based at Cambridge’s Cavendish Laboratory. “Internet service providers have been switching existing copper wires to high-speed fibre-optic cables, but it will take between 15 and 20 years for these to reach every house in the UK and will cost billions of pounds. While this change is happening, we’ve shown that existing copper infrastructure can support higher speeds as an intermediate solution.”

The Cambridge researchers, working with industry collaborators, have been investigating whether it’s possible to squeeze faster internet speeds out of existing infrastructure as a potential stopgap measure, particularly for rural and remote areas.

“No one had really looked into the physical limitations driving the maximum internet speed for twisted pair cables before,” said Dinc. “If we used these cables in a different way, would it be possible to get them to carry data at higher speeds?”

Using a mix of theoretical modelling and experimentation, the researchers found that twisted pair cables are limited in the frequency they can carry, a limit that’s defined by the geometry of the cable. Above this limit, around 5 GHz, the twisted pair cables start to radiate and behave like an antenna.

“The way that the cables are twisted together defines how high a frequency they can carry,” said Dr Eloy de Lera Acedo, also from the Cavendish, who led the research. “To enable higher data rates, we’d need the cables to carry a higher frequency, but this can’t happen indefinitely because of physical limitations. We can improve speeds a little bit, but not nearly enough to be future-proof.”

The researchers say their results underline just how important it is that government and industry work together to build the UK’s future digital infrastructure, since existing infrastructure can handle higher data rates in the near future, while the move to a future-proof full-fibre network continues.

The work is part of an ongoing collaboration between the Cavendish, the Department of Engineering, BT and Huawei in a project led by Professor Mike Payne, also of the Cavendish Laboratory. The research was also supported by the Royal Society, and the Science and Technology Facilities Council, part of UK Research and Innovation.

 

Reference:
Ergin Dinc et al. ‘High-Frequency Electromagnetic Waves on Unshielded Twisted Pairs: Upper Bound on Carrier Frequency.’ Nature Communications (2022). DOI: 10.1038/s41467-022-29631-8

Researchers have shown that the UK’s existing copper network cables can support faster internet speeds, but only to a limit. They say additional investment is urgently needed if the government is serious about its commitment to making high-speed internet available to all.

We can improve speeds a little bit, but not nearly enough to be future-proof
Eloy de Lera Acedo
Copper wires


Seven hours of sleep is optimal in middle and old age, say researchers

Alarm clock at night

Sleep plays an important role in enabling cognitive function and maintaining good psychological health. It also helps keep the brain healthy by removing waste products. As we get older, we often see alterations in our sleep patterns, including difficulty falling asleep and staying asleep, and decreased quantity and quality of sleep. It is thought that these sleep disturbances may contribute to cognitive decline and psychiatric disorders in the aging population.

In research published today in Nature Aging, scientists from the UK and China examined data from nearly 500,000 adults aged 38-73 years from the UK Biobank. Participants were asked about their sleeping patterns, mental health and wellbeing, and took part in a series of cognitive tests. Brain imaging and genetic data were available for almost 40,000 of the study participants.

By analysing these data, the team found that both insufficient and excessive sleep duration were associated with impaired cognitive performance, such as processing speed, visual attention, memory and problem-solving skills. Seven hours of sleep per night was the optimal amount of sleep for cognitive performance, but also for good mental health, with people experiencing more symptoms of anxiety and depression and worse overall wellbeing if they reported sleeping for longer or shorter durations.
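The nonlinear (inverted-U) association can be illustrated with a toy quadratic model that peaks at seven hours; the coefficients are invented and not fitted to the study's data:

```python
def cognitive_score(sleep_hours, optimum=7.0, penalty=1.5):
    """Toy inverted-U model: performance falls off either side of the optimum."""
    return 100 - penalty * (sleep_hours - optimum) ** 2

scores = {h: cognitive_score(h) for h in [5, 6, 7, 8, 9]}
best = max(scores, key=scores.get)
print(best, scores[best])  # 7 100.0
```

The key property is symmetry: in this sketch, sleeping two hours too little costs as much as sleeping two hours too long, which is the shape of the association the study reports.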

The researchers say one possible reason for the association between insufficient sleep and cognitive decline may be due to the disruption of slow-wave – ‘deep’ – sleep. Disruption to this type of sleep has been shown to have a close link with memory consolidation as well as the build-up of amyloid – a key protein which, when it misfolds, can cause ‘tangles’ in the brain characteristic of some forms of dementia. Additionally, lack of sleep may hamper the brain’s ability to rid itself of toxins.

The team also found a link between the amount of sleep and differences in the structure of brain regions involved in cognitive processing and memory, again with greater changes associated with greater than or less than seven hours of sleep.

Having a consistent seven hours’ sleep each night, without too much fluctuation in duration, was also important to cognitive performance and good mental health and wellbeing. Previous studies have also shown that interrupted sleep patterns are associated with increased inflammation, indicating a susceptibility to age-related diseases in older people.

Professor Jianfeng Feng from Fudan University in China said: “While we can’t say conclusively that too little or too much sleep causes cognitive problems, our analysis looking at individuals over a longer period of time appears to support this idea. But the reasons why older people have poorer sleep appear to be complex, influenced by a combination of our genetic makeup and the structure of our brains.”

The researchers say the findings suggest that insufficient or excessive sleep duration may be a risk factor for cognitive decline in ageing. This is supported by previous studies that have reported a link between sleep duration and the risk of developing Alzheimer’s disease and dementia, in which cognitive decline is a hallmark symptom.

Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge, one of the study’s authors, said: “Getting a good night’s sleep is important at all stages of life, but particularly as we age. Finding ways to improve sleep for older people could be crucial to helping them maintain good mental health and wellbeing and avoiding cognitive decline, particularly for patients with psychiatric disorders and dementias.”

The research was supported by the National Key R&D Program of China, the Shanghai Municipal Science and Technology Major Project, the Shanghai Center for Brain Science and Brain-Inspired Technology, the 111 Project, the National Natural Sciences Foundation of China and the Shanghai Rising Star Program.

Reference
Li, Y, Sahakian, BJ, et al. The brain structure and genetic mechanisms underlying the nonlinear association between sleep duration, cognition and mental health. Nature Aging; 28 Apr 2022; DOI: 10.1038/s43587-022-00210-2

Seven hours is the ideal amount of sleep for people in their middle age and upwards, with too little or too much sleep associated with poorer cognitive performance and mental health, say researchers from the University of Cambridge and Fudan University.

Getting a good night’s sleep is important at all stages of life, but particularly as we age
Barbara Sahakian
Insomnia or Sleep changes and disorders in elderly concept


Cognitive impairment from severe COVID-19 equivalent to 20 years of ageing, study finds

Senior woman wearing face mask lying on hospital bed

The findings, published in the journal eClinicalMedicine, emerge from the NIHR COVID-19 BioResource. The results of the study suggest the effects are still detectable more than six months after the acute illness, and that any recovery is at best gradual.

There is growing evidence that COVID-19 can cause lasting cognitive and mental health problems, with recovered patients reporting symptoms including fatigue, ‘brain fog’, problems recalling words, sleep disturbances, anxiety and even post-traumatic stress disorder (PTSD) months after infection. In the UK, a study found that around one in seven individuals surveyed reported having symptoms that included cognitive difficulties 12 weeks after a positive COVID-19 test.

While even mild cases can lead to persistent cognitive symptoms, between a third and three-quarters of hospitalised patients report still suffering cognitive symptoms three to six months later.

To explore this link in greater detail, researchers analysed data from 46 individuals who received in-hospital care, on the ward or intensive care unit, for COVID-19 at Addenbrooke’s Hospital, part of Cambridge University Hospitals NHS Foundation Trust. 16 patients were put on mechanical ventilation during their stay in hospital. All the patients were admitted between March and July 2020 and were recruited to the NIHR COVID-19 BioResource.

The individuals underwent detailed computerised cognitive tests an average of six months after their acute illness using the Cognitron platform, which measures different aspects of mental faculties such as memory, attention and reasoning. Scales measuring anxiety, depression and post-traumatic stress disorder were also assessed. Their data were compared against matched controls.

This is the first time that such rigorous assessment and comparison has been carried out in relation to the after effects of severe COVID-19.

COVID-19 survivors were less accurate and had slower response times than the matched control population – and these deficits were still detectable at follow-up six months later. The effects were strongest for those who required mechanical ventilation. By comparing the patients to 66,008 members of the general public, the researchers estimate that the magnitude of cognitive loss is similar on average to that sustained with 20 years of ageing, between 50 and 70 years of age, and that this is equivalent to losing 10 IQ points.
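The conversion from a cognitive deficit to 'IQ points' relies on the IQ scale's standard deviation of 15: a deficit expressed in standard-deviation units of the control distribution maps linearly to IQ points. The effect size below is chosen to reproduce the article's figure, not taken from the paper:

```python
def deficit_in_iq_points(effect_size_sd, iq_sd=15):
    """Convert a deficit measured in control-group standard deviations
    into IQ points, using the IQ scale's standard deviation of 15."""
    return effect_size_sd * iq_sd

# A deficit of two-thirds of a standard deviation corresponds to ~10 IQ points.
print(round(deficit_in_iq_points(2 / 3), 1))  # 10.0
```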

Survivors scored particularly poorly on tasks such as verbal analogical reasoning, a finding that supports the commonly-reported problem of difficulty finding words. They also showed slower processing speeds, which aligns with previous observations post COVID-19 of decreased brain glucose consumption within the frontoparietal network of the brain, responsible for attention, complex problem-solving and working memory, among other functions.

Professor David Menon from the Division of Anaesthesia at the University of Cambridge, the study’s senior author, said: “Cognitive impairment is common to a wide range of neurological disorders, including dementia, and even routine ageing, but the patterns we saw – the cognitive 'fingerprint' of COVID-19 – was distinct from all of these.”

While it is now well established that people who have recovered from severe COVID-19 illness can have a broad spectrum of symptoms of poor mental health – depression, anxiety, post-traumatic stress, low motivation, fatigue, low mood, and disturbed sleep – the team found that acute illness severity was better at predicting the cognitive deficits.

The patients’ scores and reaction times began to improve over time, but the researchers say that any recovery in cognitive faculties was at best gradual and likely to be influenced by a number of factors including illness severity and its neurological or psychological impacts.

Professor Menon added: “We followed some patients up as late as ten months after their acute infection, so were able to see a very slow improvement. While this was not statistically significant, it is at least heading in the right direction, but it is very possible that some of these individuals will never fully recover.”

There are several factors that could cause the cognitive deficits, say the researchers. Direct viral infection is possible, but unlikely to be a major cause; instead, it is more likely that a combination of factors contribute, including inadequate oxygen or blood supply to the brain, blockage of large or small blood vessels due to clotting, and microscopic bleeds. However, emerging evidence suggests that the most important mechanism may be damage caused by the body’s own inflammatory response and immune system.

While this study looked at hospitalised cases, the team say that even those patients not sick enough to be admitted may also have tell-tale signs of mild impairment.

Professor Adam Hampshire from the Department of Brain Sciences at Imperial College London, the study’s first author, said: “Around 40,000 people have been through intensive care with COVID-19 in England alone and many more will have been very sick, but not admitted to hospital. This means there is a large number of people out there still experiencing problems with cognition many months later. We urgently need to look at what can be done to help these people.”

Professor Menon and Professor Ed Bullmore from Cambridge’s Department of Psychiatry are co-leading working groups as part of the COVID-19 Clinical Neuroscience Study (COVID-CNS) that aim to identify biomarkers that relate to neurological impairments as a result of COVID-19, and the neuroimaging changes that are associated with these.

The research was funded by the NIHR BioResource, NIHR Cambridge Biomedical Research Centre and the Addenbrooke’s Charitable Trust, with support from the NIHR Cambridge Clinical Research Facility.

Reference
Hampshire, A et al. Multivariate profile and acute-phase correlates of cognitive deficits in a COVID-19 hospitalised cohort. eClinicalMedicine; 28 Apr 2022; DOI: 10.1016/j.eclinm.2022.101417

Cognitive impairment as a result of severe COVID-19 is similar to that sustained between 50 and 70 years of age and is the equivalent to losing 10 IQ points, say a team of scientists from the University of Cambridge and Imperial College London.

Cognitive impairment is common to a wide range of neurological disorders, but the patterns we saw – the cognitive 'fingerprint' of COVID-19 – were distinct from all of these
David Menon

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Research exposes long-term failure of Russian propaganda


A study of thousands of stories from media outlets churning out propaganda in Ukrainian Donbas following Russia’s first invasion suggests that Kremlin disinformation has long neglected any coherent or convincing messaging to foster support for Russia in the war-torn region.

After 2014, when news media in the so-called 'People’s Republics' of Donetsk and Luhansk was forcibly taken over by Russian-backed insurgents, efforts to instil a pro-Russian 'identity' were lazy and half-baked, and dwindled to nothing within months.

This is according to University of Cambridge researcher Dr Jon Roozenbeek, who says that – based on his analysis of over four years of media content – such limited efforts likely had little effect on the consciousness of Russian-speaking Ukrainians in Donbas.

For example, Vladimir Putin has long trumpeted the idea of “Novorossiya”, or ‘New Russia’, in an attempt to resurrect terminology once used to describe Donbas during the reign of Catherine the Great, when it temporarily sat within the Russian Empire, and claim the region belongs in Russia.

While waves of propaganda demonised Ukraine’s government, the study shows that Novorossiya was hardly mentioned, and Russian disinformation lacked any real 'in-group' story, the ‘us’ to oppose a ‘them’ – a fundamental flaw in any attempt to generate lasting division, says Roozenbeek.

Instead of identity-building, almost the entire Russian propaganda effort relied on portraying the leadership in Kyiv as fascistic – the basis of outlandish “denazification” claims – to create what psychologists call an 'outgroup' on which to focus hostility.

However, as Russia shifts its war onto Donbas, Roozenbeek cautions that it may turn to spreading Novorossiya-style propaganda narratives in the region and far beyond to justify land seizure and war atrocities, and claim that these actions are supported by local populations.

He calls for a pre-emptive global debunking – or ‘pre-bunking’ – of the notion that ideological projects such as ‘Novorossiya’ have deep roots in the region, and that the people of Donbas have ever bought into these myths.

Otherwise, he says, we risk such falsehoods taking hold in the West via pundits and politicians who toe the Kremlin line. Roozenbeek’s findings are publicly available for the first time today.

“Eight years of Russian propaganda have failed to provide a convincing alternative to Ukrainian nationhood in eastern Ukraine,” said Roozenbeek.

“The Kremlin's decision to favour outgroup animosity over in-group identity building, and its vast overestimation of the extent to which its lies about non-existent Ukrainian ‘fascists’ promoted pro-Russian sentiment, are key reasons why the invasion has been a strategic and logistical disaster.”

“If the nonsense of Novorossiya or other half-baked ideological narratives start to spread in the West, it could end up being used to pressure Ukraine into relinquishing large swathes of its territory, as a drawn-out war in the Donbas causes the global community’s nerves to fray,” he said.  

For his PhD research, Roozenbeek used ‘natural language processing’ to algorithmically comb through over 85,000 print and online articles from 30 local and regional media outlets across Luhansk and Donetsk between 2014 and 2017, charting the patterns of content through use of key words and phrases in the wake of the first Russian invasion of Ukraine.
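The keyword-charting step described above can be sketched roughly as follows. This is an illustrative assumption, not the study's actual pipeline: the mini-corpus, the term lists and the helper `count_terms` are all invented here, standing in for the 85,000-article analysis.

```python
from collections import Counter
import re

# Hypothetical mini-corpus standing in for the ~85,000 Donbas articles.
articles = [
    "Kyiv fascists attack civilians, says militia spokesman",
    "Novorossiya celebrates victory day with parade",
    "Local sport: Donetsk club wins regional championship",
    "Neo-Nazis in Kyiv threaten the people of Donbas",
]

# Key phrases whose frequency is charted over the corpus, mirroring
# the study's 'outgroup' (demonising Kyiv) vs 'in-group' distinction.
outgroup_terms = ["fascist", "neo-nazi"]
ingroup_terms = ["novorossiya", "russian world"]

def count_terms(texts, terms):
    """Count total occurrences of each term, case-insensitively."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for term in terms:
            counts[term] += len(re.findall(re.escape(term), lowered))
    return counts

outgroup = count_terms(articles, outgroup_terms)
ingroup = count_terms(articles, ingroup_terms)
print(sum(outgroup.values()), sum(ingroup.values()))
```

Charting such counts over time is what reveals the pattern the study reports: heavy 'outgroup' vocabulary, and near-silence on 'in-group' concepts like Novorossiya.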

While half the coverage in print media remained 'business as usual' – sport, entertainment, etc – some 36% was dedicated to the 'shaping of identity' via propaganda. Much of this was done through parallels to World War II: the Donbas war as an attack by Ukrainian “neo-Nazis”.

Only one newspaper paid any attention to Putin’s adopted concept of “Novorossiya”. Obvious opportunities to leverage history for identity-building propaganda were missed, such as the fact that part of Donbas declared itself a Soviet republic in 1918, or indeed any mention of the Soviet Union.

“Descriptions of an in-group identity that situated Donbas as part of the ‘Russian World’ were almost entirely absent from the region’s print media,” said Roozenbeek.

This pattern was largely replicated in online news media, which were arguably more ferocious in attempts to demonise the 'outgroup' Kyiv government – including publishing in English to try to spread propaganda internationally – while ignoring a pro-Russian 'this is us' identity.

Roozenbeek found a handful of stories covering “patriotic” cultural events organised by the Kremlin-owned leadership in Luhansk, but even here the in-group identity was “lazily assumed”, he says, rather than established.          

All this despite the fact that a 'blueprint' strategy for propaganda in Donbas explicitly called for the image of a benevolent Russia to be cultivated by emphasising the 'Russian World' philosophy.

This strategy, leaked to German newspapers in 2016, is widely believed to be the work of Vladislav Surkov, the Kremlin’s former propagandist-in-chief, often dubbed Putin’s puppet master. It describes the need to construct and promote an ideology of 'cultural sovereignty' in Russian-occupied Donbas, one that can act as a stepping stone to statehood.

“Despite the importance given to constructing identity and ideology after the Russian-backed takeover in Luhansk and Donetsk, including as directed by the Kremlin, very little in-group identity was promoted,” said Roozenbeek.

“What identity-building propaganda I could find in Donbas after 2014 was vague, poorly conceived, and quickly forgotten. Political attempts to invoke Novorossiya were cast aside by the summer of 2015, but such weak propaganda suggests they didn’t stand much chance anyway.”

“Putin has severely underestimated the strength of Ukrainian national identity, even in Donbas, and overestimated the power of his propaganda machine on the occupied areas of Ukraine.”        

Roozenbeek’s research was conducted for his PhD between 2016 and 2020, and will feature in his forthcoming book Influence, Information and War in Ukraine, due out next year as part of the Society for the Psychological Study of Social Issues book series Contemporary Social Issues, published by Cambridge University Press.

A study of the propaganda that flooded Donbas for years reveals a failure to build pro-Russian 'in-group' identities in the region, despite Putin’s claims of support.

What identity-building propaganda I could find in Donbas after 2014 was vague, poorly conceived, and quickly forgotten
Jon Roozenbeek


Want more students to learn languages? Win over the parents, research suggests


Children’s attitudes towards learning languages and their willingness to see themselves as ‘multilingual’ are influenced far more by the views of their parents than by their teachers or friends, new research indicates.

The finding implies that parents may have an important part to play in reversing the national decline in language-learning. The authors of the study, which was led by researchers at the University of Cambridge, say that efforts to increase uptake in these subjects would benefit from involving families, as well as schools.

Entry rates for modern languages have declined steadily, at both GCSE and A-Level, since the early 2000s. GCSE entry data, for example, show that the combined total number of pupils taking French, German, Spanish and other Modern Languages last year was almost half that of 2001.

The new study surveyed more than 1,300 Year 8 students, aged 12-13, to understand what makes them self-identify as ‘multilingual’: as capable learners and users of other languages. The responses revealed that their parents’ beliefs about languages had almost twice as much influence as the opinions of their teachers, and were also significantly more influential than the views of their peers.

Specifically, parental attitudes help students who are still forming a view about languages work out whether these subjects matter personally to them. In general, the study shows that they are more likely to consider themselves ‘multilingual’ if they identify with languages at this personal level and see them as relevant to their own lives. Simply learning languages at school and being told that they are useful appears to make less difference.

Professor Linda Fisher, from the Faculty of Education, University of Cambridge, said: “Students’ personal commitment to languages is determined by their experiences, their beliefs, and their emotional response to speaking or using them. Slightly surprisingly, the people who feed into that most appear to be their parents.”

“This can be a positive or negative influence depending on the parents’ own views. Its importance underlines the fact that if we want more young people to learn languages, we need to pay attention to wider social and cultural attitudes to languages beyond the classroom. Waning interest in these subjects is a public communication challenge; it’s not just about what happens in schools.”

Some language-learning specialists argue that most people are fundamentally 'multilingual'. Even if they do not speak another language fluently, they may know assorted words and phrases, or another kind of ‘language’: such as a dialect, sign language, or computer code.

Recognising that they have this multilingual capability appears to strengthen students’ self-belief when they encounter modern languages at school. There is also evidence that students who self-identify as multilingual perform better across the school curriculum, including in non-language subjects.

The study explored what leads students to see themselves in these terms, and whether this varies between different groups – for example, those who have ‘English as an Additional Language’ (EAL), and typically speak another language at home.

In the survey, students were asked to state how strongly they agreed or disagreed with various statements, such as: 'Learning other languages is pointless because everyone speaks English', and: 'My parents think that it’s cool to be able to speak other languages.' They were also asked about their own experience with languages, and how multilingual they considered themselves to be. The researchers then developed a model showing the relative importance of different potential influences on their self-identification as language-learners.

Although some influences – such as that of peers – differed for EAL and non-EAL students, that of parents was consistently strong. Across the board, the relative impact of parents’ attitudes on students’ willingness to see themselves as multilingual was found to be about 1.4 times greater than that of their friends, and almost double that of their teachers.
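To make the reported ratios concrete, suppose the researchers' model yielded standardized influence coefficients like those below. The numbers are hypothetical, chosen only to reproduce the ratios quoted in this article; the study's actual estimates are not given here.

```python
# Hypothetical standardized coefficients for influences on students'
# willingness to self-identify as multilingual (illustrative only).
influence = {"parents": 0.42, "friends": 0.30, "teachers": 0.22}

# 'About 1.4 times greater than that of their friends, and almost
# double that of their teachers.'
ratio_vs_friends = influence["parents"] / influence["friends"]
ratio_vs_teachers = influence["parents"] / influence["teachers"]
print(round(ratio_vs_friends, 2), round(ratio_vs_teachers, 2))
```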

The researchers suggest that encouraging more parents to recognise their own multilingual capabilities would positively affect their children’s own language-learning. “In an ideal world we should be encouraging adults, as well as children, to see themselves as having a repertoire of communicative resources,” Fisher said. “It’s remarkable how quickly attitudes change once you start asking: ‘What words do you already know, what dialect do you speak; can you sign?’”

More broadly, the study found that young people are more likely to see themselves in these terms if they are exposed to meaningful experiences that involve other languages – for example by hearing and using them in their communities, or while travelling abroad. This, along with their personal and emotional response to the idea of languages, informs the degree to which they self-describe as multilingual.

The researchers argue that this raises questions about recent Government reforms to language GCSEs, which are meant to help students 'grow in confidence and motivation'. The new measures focus narrowly on so-called linguistic 'building blocks': for example, requiring students to learn 1,700 common words in the target language. Head teachers’ bodies have already criticised them as “prescriptive and grinding” and liable to alienate pupils further.

The new study similarly indicates that encouraging more young people to learn languages requires a broader-minded approach.

“There’s no evidence that if you just focus on the mechanics – phonics, grammar and so on – you’re going to motivate students or, for that matter, teachers,” Fisher said. “Students need to discover what languages mean to them, which means they also need to learn about culture, identity and self-expression. Simply drilling verb forms into them will only persuade a swathe of the school population that these subjects are not for them. That is especially likely if their parents don’t value languages either.”

The research is published in the International Journal of Multilingualism.

Parents influence children’s attitudes to languages far more than their teachers or friends, research finds. This implies that efforts to reverse the national decline in language-learning need to target families as well as schools, researchers say.

Waning interest in these subjects is a public communication challenge; it’s not just about what happens in schools
Linda Fisher


Taste of the future: robot chef learns to ‘taste as you go’


Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.

Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.

When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato’s flavour will change.

The robot chef, which has already been trained to make omelettes based on human tasters’ feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.

The researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish over other electronic tasting technologies, which only test a single homogenised sample. The results are reported in the journal Frontiers in Robotics and AI.

The perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors mostly on the tongue; and the signals from taste receptors are passed to the brain. Once our brains are aware of the flavour, we decide whether we enjoy the food or not.

Taste is also highly individual: some people love spicy food, while others have a sweet tooth. A good cook, whether amateur or professional, relies on their sense of taste, and can balance the various flavours within a dish to make a well-rounded final product.

“Most home cooks will be familiar with the concept of tasting as you go – checking a dish throughout the cooking process to check whether the balance of flavours is right,” said Grzegorz Sochacki from Cambridge’s Department of Engineering, the paper’s first author. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking.”

“When we taste, the process of chewing also provides continuous feedback to our brains,” said co-author Dr Arsen Abdulali, also from the Department of Engineering. “Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product.”

The researchers are members of Cambridge’s Bio-Inspired Robotics Laboratory run by Professor Fumiya Iida of the Department of Engineering, which focuses on training robots to solve so-called ‘last metre’ problems: tasks which humans find easy, but robots find difficult. Cooking is one of these tasks: earlier tests with their robot ‘chef’ have produced a passable omelette using feedback from human tasters.

“We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking,” said Sochacki.

To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.

Using the probe, the robot ‘tasted’ the dishes in a grid-like fashion, returning a reading in just a few seconds.

To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. The different readings at different points of ‘chewing’ produced taste maps of each dish.
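A minimal sketch of how such a 'taste map' might be summarised is given below, assuming the probe returns grid-indexed salinity readings. The grid size, readings and summary statistics are invented for illustration; the paper's actual processing and classification are considerably richer.

```python
from statistics import mean

def taste_map_summary(readings):
    """Summarise a 'taste map': readings maps (row, col) grid
    positions to salinity values from the conductance probe."""
    values = list(readings.values())
    return {"mean_salinity": mean(values), "max_salinity": max(values)}

# Simulated readings for one dish before and after 'chewing':
# the unblended dish is uneven; blending homogenises it.
unchewed = {(r, c): 0.30 + 0.05 * r for r in range(3) for c in range(3)}
blended = {(r, c): 0.35 for r in range(3) for c in range(3)}

before = taste_map_summary(unchewed)
after = taste_map_summary(blended)
```

Comparing such summaries across 'chewing' stages is what gives the robot more information than the single homogenised reading used by other electronic tasting methods.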

Their results showed a significant improvement in the ability of robots to assess saltiness over other electronic tasting methods, which are often time-consuming and only provide a single reading.

While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy and could be tweaked according to individual tastes.

“When a robot is learning how to cook, like any other cook, it needs indications of how well it did,” said Abdulali. “We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can ‘see’ the difference in the food as it’s chewed, which improves its ability to taste.”

“Beko has a vision to bring robots to the home environment which are safe and easy to use,” said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. “We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users.”

In future, the researchers are looking to improve the robot chef so it can taste different types of food and improve sensing capabilities so it can taste sweet or oily food, for example.

The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC) Centre of Doctoral Training on Agri-Food Robotics (Agriforwards CDT), part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.

 

Reference:
Grzegorz Sochacki, Arsen Abdulali, and Fumiya Iida. ‘Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking.’ Frontiers in Robotics and AI (2022). DOI: 10.3389/frobt.2022.886074

A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.

If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking
Grzegorz Sochacki


New report assesses global anti-deforestation measures


A major scientific assessment, published by the Global Forest Expert Panels (GFEP) Programme, led by the International Union of Forest Research Organizations (IUFRO), has evaluated the world’s progress on reducing emissions from deforestation and forest degradation.

The report analyses the past 10 years of REDD+ implementation – a global action plan to reduce emissions from deforestation and forest degradation primarily in tropical and sub-tropical regions – with respect to forest governance, carbon measurements and effects on biodiversity and livelihoods. The findings are presented during World Forestry Congress week, taking place this week in Seoul.

One major conclusion is that while REDD+ has provided a convenient umbrella for many forest and land use related activities aimed at reducing deforestation and forest degradation – and associated greenhouse gas emissions – the interlinkages and complexities of relationships between forests, land use and climate are profound.

The report, which aims to inform ongoing policy discussions on the 2030 Agenda for Sustainable Development, comes at a pivotal time: human-induced climate change and increases in extreme weather events are impacting nature and people faster and more severely than had been expected 20 years ago.

However, there is still a chance to reverse this trend and avoid further global warming, according to the Intergovernmental Panel on Climate Change. This requires drastic reductions in greenhouse gas emissions, particularly CO2, most of which stem from burning fossil fuels.

Forests also play an important role in the global carbon cycle: they absorb carbon as they grow and emit carbon when they are destroyed. Every year nearly one-third of the global carbon emissions produced by humans can be absorbed by forests, yet deforestation and forest degradation are responsible for up to 10% of the annual man-made CO2 emissions.

In addition, interest in forests as a ‘nature-based solution’ has probably never been higher and the number of initiatives aimed at conserving, sustainably managing and restoring forests has increased considerably.

“This report is being launched at a very important moment, and feeds directly into international discussions on climate change and biodiversity,” said lead author Professor Bhaskar Vira, Head of Cambridge’s Department of Geography. “There is an urgent focus on the role of land use and forests as part of our transitions towards a net zero future, and on the contributions that forests can make to biodiversity and livelihoods.

“REDD+ will only be effective if we learn the lessons from existing efforts and interventions in the forest sector, and the challenges they have faced. This report offers key insights into the ways in which new and innovative sources of funding and finance should be organised and governed to ensure equitable and sustainable future pathways that benefit all, especially the Indigenous Peoples and local communities who live in and around forests.”

In addition to promoting forest protection and carbon sink enhancement, a key focus of REDD+ is to move the scope of interventions beyond climate impacts towards an integrated view of climate, biodiversity and livelihoods. REDD+ can deliver numerous environmental benefits, including reduced soil erosion, enhanced water quality and quantity, and increased resilience to drought and floods. It can potentially deliver important biodiversity benefits, although the availability of up-to-date biodiversity data remains a major challenge.

“Such benefits have significant economic importance and may increase both the value of REDD+ programs and people’s willingness to engage with them. However, in the implementation of REDD+, greater attention to biodiversity and livelihood outcomes is needed,” said lead author and IUFRO President John Parrotta of the USDA Forest Service.

Evidence from social evaluations of REDD+ interventions indicates that, where direct and indirect benefits are clearly visible to local stakeholders, and have been delivered, community engagement is strong and projects have achieved positive carbon and social outcomes, at least in the short term. Furthermore, explicit attention to rights and tenure issues provides more transparent mechanisms for the reporting and monitoring of environmental and social co-benefits, as well as better, more equitable outcomes, particularly for more vulnerable communities.

Case studies from Indonesia show that insecure tenure can exacerbate distrust between resource users and the government, and can keep local people from further participating in REDD+ activities. Evidence from Latin America and the Caribbean suggests that deforestation is lower in areas where Indigenous and Tribal Peoples’ collective land rights are recognised.

“Since 2012, implementation of REDD+ has advanced considerably in many countries but ultimately it is REDD+ governance that determines its performance. Yet, governance is distributed across a complex landscape of institutions with different sources of authority and power dynamics that influence its outcomes,” said GFEP Programme Coordinator Christoph Wildburger.

REDD+ is being applied in a wide diversity of contexts with an equally wide diversity of governance strategies, which are changing over time.

Brazil, for example, was initially a leading global source of deforestation, then a world leader in reducing deforestation, and is now experiencing rising deforestation once again. While Brazil’s federal government has played a key role in these swings in deforestation rates, a number of Brazilian states are pursuing their own REDD+ initiatives with positive outcomes.

Ghana, a relatively small country where deforestation has been strongly linked to the production of cocoa for export, is pursuing the ‘world’s first commodity-driven’ REDD+ strategy with private sector investments in ‘climate smart cocoa’. Both Brazil and Ghana illustrate the important role that actors other than national governments may play in shaping REDD+, such as sub-national state actors or private companies trading in forest risk commodities like cocoa.

Adapted from an IUFRO press release.

Comprehensive scientific report shows progress and effects on climate, nature and people. 

REDD+ will only be effective if we learn the lessons from existing efforts and interventions in the forest sector, and the challenges they have faced.
Bhaskar Vira


Professor Stephen J Toope to lead global research organisation


Based in Toronto, and working across national and disciplinary boundaries, CIFAR brings together some of the world’s best researchers to address the most pressing and complex issues facing individuals and societies.

Under Professor Toope’s leadership at Cambridge, sustainability and widening student access have been key areas of focus for the University, with several new and exciting research and teaching initiatives launched. These include the ambitious climate initiative Cambridge Zero, and the creation of the landmark Cambridge Foundation Year. Professor Toope has also led the university sector in pushing towards a carbon-neutral endowment fund, and has overseen remarkable progress on a £500 million Student Support Initiative. During his tenure, the "Dear World... Yours, Cambridge" fundraising campaign for the collegiate University surpassed its £2 billion target. He has guided the University through a global pandemic and into a programme of recovery that will help build the Cambridge of the future.

Professor Toope said: “CIFAR is an extraordinary organisation, and I am honoured to be offered this opportunity to lead it. While I look forward to this new challenge, I am immensely proud of what we as a university community have achieved in a remarkable five years. The University of Cambridge is without question a force for good in the world. It has been a great privilege to work with so many gifted people carrying out such extraordinary work.”

Professor Toope will take on the new role from 1 November 2022. The process to recruit a new Vice-Chancellor is under way.

Professor Stephen J Toope will become the next President and CEO of the Canadian Institute for Advanced Research (CIFAR) after completing his five-year term of office as Vice-Chancellor at the University of Cambridge.

While I look forward to this new challenge, I am immensely proud of what we as a university community have achieved in a remarkable five years.
Professor Stephen J Toope, Vice-Chancellor


Programmes to host scholars and students affected by the war on Ukraine


In October, the University hopes to welcome upwards of 20 students affected by the war on Ukraine. They will be funded by a range of scholarships including The Rowan Williams Cambridge Studentship, which is a programme established by the Cambridge Trust to support undergraduate and postgraduate students applying to study at Cambridge from a conflict zone.

The Rowan Williams Scholarships will be fully funded, covering tuition fees and maintenance, and will also assist with students’ upfront expenses such as travel, visa costs and the immigration health surcharge. The Cambridge Trust is working with other funders to maximise the number of offers we can make. All recipients of these funds must have a conditional offer from the University to be considered for funding.

The School of Clinical Medicine has made a twinning agreement with Kharkiv National Medical University to accept medical students on six-week clinical placements in Cambridge. Students will be placed at either Cambridge University Hospitals NHS Foundation Trust or Cambridgeshire and Peterborough NHS Foundation Trust. Each student will be hosted by a member of the University or one of the hospitals who has volunteered to provide accommodation during their visit. Ten students will take part initially, with further cohorts expected to follow. The first students are expected to arrive within the next one to two months, subject to the government granting visiting visas.

The Collegiate University has so far submitted two applications to the British Academy and Council for At-Risk Academics (Cara) Researchers at Risk Fellowship Scheme. If successful, the fellowships will bring two scholars and their dependents to Cambridge for up to two years. These applications were generously supported by Darwin College and Trinity College.

The University is also in conversations with Ukrainian institutions to establish non-residential scholarships for up to fifteen Ukrainian scholars who have been displaced by the war and are living in Ukraine or neighbouring countries. The scholarships will provide a stipend, formal links to Cambridge academics and remote access to resources that will enable them to continue academic study.

Some 21 students currently studying in Cambridge have been identified as having been directly affected by the war. They are being supported through the University’s Ukrainian Conflict Student Hardship Fund.

In response to conversations with Ukrainian university representatives, the University Library and Cambridge University Press and Assessment are identifying specific ways in which they can assist in partnership with their national professional bodies. Cambridge University Press and Assessment has made the majority of its academic journal content free to institutions registered in Ukraine.

Additional programmes are in development, and the University remains ready to sponsor and host displaced doctoral students and academic staff as soon as the government visa scheme enables the University to act as a visa sponsor. Thank you to those University institutions that have come forward with offers of support.

“I have been heartened by the generosity displayed by colleagues across Cambridge who have raised funds or proposed activities to support those affected by the tragic war on Ukraine," said Professor Kamal Munir, Pro-Vice-Chancellor for University Community and Engagement. "I know many within the University have worked tirelessly to put these programmes in place and we will continue to identify opportunities for the Collegiate University to provide further support in the medium to long-term.”

The humanitarian tragedy unfolding in Ukraine continues to galvanise our community. Led by the Ukraine Taskforce, the Collegiate University has developed a number of programmes to support scholars and students affected by the war.

I know many within the University have worked tirelessly to put these programmes in place
Kamal Munir
A participant at the University's vigil for Ukraine earlier this year.


Protected areas saw dramatic spikes in fires during COVID lockdowns, study finds


The number of fires inside protected conservation areas across the island of Madagascar shot up dramatically when COVID-19 lockdowns led to the suspension of any on-site management for five months during 2020.

The findings suggest that governments should consider keeping some staff in protected areas at all times as an “essential service”, even during periods of health crisis and travel restriction, argue the scientists behind the study.

They say that more attention must be paid to the management of protected areas, not just expanding their coverage, at the long-delayed convention to set international biodiversity goals later this year. 

Madagascar is a renowned biodiversity “hotspot”, home to species such as its famous lemur populations that don’t exist anywhere else. The island is also a frontline in the fight between wildlife protection and habitat loss.   

The study, published today in Nature Sustainability, is the first to gauge the effects of the pandemic on protected conservation areas. 

An international team of scientists led by Cambridge and Helsinki universities used historical and contemporary fire and weather data to predict rates of burning in Madagascar’s protected areas for every month during 2012-2020.

They compared this data modelling to counts of actual blazes collected by satellites to detect periods when fires raged far beyond what might be expected from the climate and previous patterns of burning.

When the first lockdowns of 2020 halted the on-site management of protected areas, the number of fires – many of them in threatened forest habitat – soared by 209% in March, 223% in April, 78% in May, 248% in June and 76% in July.

However, burning quickly returned to normal levels as predicted by the modelling once management operations resumed – despite continued border closures and economic hardships as a result of the ongoing pandemic.

Researchers describe this scale of burning inside protected areas as “unprecedented” in recent Malagasy history. The only comparable periods were two spells of civil unrest in 2013 and 2018 in the run-up to elections, but even then the worst month saw just a 134% increase in burning.

“The disruption caused by COVID-19 clearly demonstrates the dramatic impact that interruptions to the management of protected areas can have on habitats,” said senior author Professor Andrew Balmford from the University of Cambridge.

“Over the last twenty years, excess fires in Malagasy protected areas have been limited to occasional blocks of one or two months.

“When all staff were pulled out of protected areas in March 2020 the fires spiked dramatically and continued at a ferocious level for an unprecedented five months, falling away exactly as staff started to return,” he said.      

While the team says they cannot know for sure what caused all the fires during the early months of COVID-19, lead author Dr Johanna Eklund from the University of Helsinki said that local communities already struggling economically would have come under further pressure from lockdowns.

“Madagascar has very high rates of poverty, and has a history of conflict between the livelihoods of vulnerable people and saving unique biodiversity,” said Eklund, currently a visiting researcher at Cambridge.

“The pandemic increased economic insecurity for many, so it would not be surprising if this led some to encroach on protected lands while on-site management activities were on hold.”

Eklund suggests that a lack of on-site patrolling to prevent any fires from spreading combined with communities resorting to “swidden” – or slash-and-burn – agriculture may be behind much of the spike in lockdown fires. These techniques clear vegetation for crops and cattle-grazing but are illegal inside protected areas.

“Importantly, the study did not measure fires outside conservation sites, so we cannot measure how much protected areas actually mitigated burning compared to areas without protection,” Eklund said.

The team used imaging data from NASA satellite systems capable of detecting “thermal anomalies” and noted for near real-time fire management alerts.

Eklund, who has conducted research in Madagascar for close to a decade, realised she could still remotely assist those protecting the forests. “Satellites pick up fires really well and show where protected areas are under pressure.”

Co-author Domoina Rakotobe, former coordinator for the Malagasy organisation Forum Lafa, the network of terrestrial protected area managers, added: “The high levels of burning during the lockdowns clearly shows the value of on-the-ground management, with protected area teams working with communities to support local livelihoods and safeguard natural resources.”

Scientists suggest that some staffing of protected areas should be considered “essential services” in future crises. 

When all staff were pulled out of protected areas in March 2020 the fires spiked dramatically
Andrew Balmford
Slash-and-burn practice leading to fires in the region west of Manantenina, Madagascar


‘Stressed’ cells offer clues to eliminating build-up of toxic proteins in dementia

Nurse taking care of elderly sick woman in wheelchair

A characteristic of diseases such as Alzheimer’s and Parkinson’s – collectively known as neurodegenerative diseases – is the build-up of misfolded proteins. These proteins, such as amyloid and tau in Alzheimer’s disease, form ‘aggregates’ that can cause irreversible damage to nerve cells in the brain.

Protein folding is a normal process in the body, and in healthy individuals, cells carry out a form of quality control to ensure that proteins are correctly folded and that misfolded proteins are destroyed. But in neurodegenerative diseases, this system becomes impaired, with potentially devastating consequences.

As the global population ages, an increasing number of people are being diagnosed with dementia, making the search for effective drugs ever more urgent. However, progress has been slow, with no medicines yet available that can prevent or remove the build-up of aggregates.

In a study published today in Nature Communications, a team led by scientists at the UK Dementia Research Institute, University of Cambridge, has identified a new mechanism that appears to reverse the build-up of aggregates, not by eliminating them completely, but rather by ‘refolding’ them.

“Just like when we get stressed by a heavy workload, so, too, cells can get ‘stressed’ if they’re called upon to produce a large amount of proteins,” explained Dr Edward Avezov from the UK Dementia Research Institute at the University of Cambridge.

“There are many reasons why this might be, for example when they are producing antibodies in response to an infection. We focused on stressing a component of cells known as the endoplasmic reticulum, which is responsible for producing around a third of our proteins – and assumed that this stress might cause misfolding.”

The endoplasmic reticulum (ER) is a membrane structure found in mammalian cells. It carries out a number of important functions, including the synthesis, folding, modification and transport of proteins needed on the surface or outside the cell. Dr Avezov and colleagues hypothesised that stressing the ER would diminish its ability to function correctly, leading to increased protein misfolding and aggregation.

They were surprised to discover the opposite was true.

“We were astonished to find that stressing the cell actually eliminated the aggregates – not by degrading them or clearing them out, but by unravelling the aggregates, potentially allowing them to refold correctly,” said Dr Avezov.

“If we can find a way of awakening this mechanism without stressing the cells – which could cause more damage than good – then we might be able to find a way of treating some dementias.”

The main component of this mechanism appears to be one of a class of proteins known as heat shock proteins (HSPs), more of which are made when cells are exposed to temperatures above their normal growth temperature, and in response to stress.

Dr Avezov speculates that this might help explain one of the more unusual observations within the field of dementia research. “There have been some studies recently of people in Scandinavian countries who regularly use saunas, suggesting that they may be at lower risk of developing dementia. One possible explanation for this is that this mild stress triggers a higher activity of HSPs, helping correct tangled proteins.”

One of the factors that has previously hindered this field of research is the inability to visualise these processes in live cells. Working with teams from Pennsylvania State University and the University of Algarve, the team has developed a technique that allows them to detect protein misfolding in live cells. It relies on measuring light patterns of a glowing chemical over a scale of nanoseconds – a nanosecond is one billionth of a second.

“It’s fascinating how measuring our probe’s fluorescence lifetime on the nanoseconds scale under a laser-powered microscope makes the otherwise invisible aggregates inside the cell obvious,” said Professor Eduardo Melo, one of the leading authors, from the University of Algarve, Portugal.

The research was supported by the UK Dementia Research Institute, which receives its funding from the Medical Research Council, Alzheimer's Society and Alzheimer's Research UK, as well as the Portuguese Foundation for Science and Technology.

Reference
Melo, EP, et al. Stress-induced protein disaggregation in the endoplasmic reticulum catalysed by BiP. Nature Communications; 6 May 2022; DOI: 10.1038/s41467-022-30238-2

It’s often said that a little stress can be good for you. Now scientists have shown that the same may be true for cells, uncovering a previously unknown mechanism that might help prevent the build-up of tangles of proteins commonly seen in dementia.

We were astonished to find that stressing the cell actually eliminated the aggregates – not by degrading them or clearing them out, but by unravelling the aggregates, potentially allowing them to refold correctly
Edward Avezov
Taking care of elderly sick woman in wheelchair


Nine Cambridge scientists among the new Fellows announced today by the Royal Society

Fellowship awardees

The Royal Society is a self-governing Fellowship made up of the most eminent scientists, engineers and technologists from the UK and the Commonwealth. Its Foreign Members are drawn from the rest of the world.

The Society’s fundamental purpose is to recognise, promote, and support excellence in science and to encourage the development and use of science for the benefit of humanity.

This year, a total of 51 Fellows, 10 Foreign Members, and one Honorary Fellow have been selected for their outstanding contributions to science.

Sir Adrian Smith, President of the Royal Society said: “It is an honour to welcome so many outstanding researchers from around the world into the Fellowship of the Royal Society.

“Through their careers so far, these researchers have helped further our understanding of human disease, biodiversity loss and the origins of the universe. I am also pleased to see so many new Fellows working in areas likely to have a transformative impact on our society over this century, from new materials and energy technologies to synthetic biology and artificial intelligence. I look forward to seeing what great things they will achieve in the years ahead.”

The Cambridge Fellows are:

Professor Graham Burton FMedSci FRS

Mary Marshall and Arthur Walton Professor Emeritus of the Physiology of Reproduction, University of Cambridge

Burton is a reproductive biologist whose research has focused on the early stages of human pregnancy. In particular, he showed how the placenta is established in a protective low-oxygen environment, stimulating its own development through interactions with the uterus. He demonstrated that aberrations in the early stages of placental development can adversely affect the life-long health of mother and offspring. Burton was founding Director of the Centre for Trophoblast Research, and founding Chair of the Strategic Research Initiative Cambridge Reproduction.

He said: “I am delighted to receive this recognition for myself and the field of reproductive biology, and thank colleagues and collaborators for their contributions over the years.”

Professor Roberto Cipolla FREng FRS

Professor of Information Engineering, Department of Engineering, University of Cambridge

Cipolla is distinguished for his research in computer vision and his contributions to the reconstruction, registration and recognition of three-dimensional objects from images. These include novel algorithms for the recovery of accurate 3D shape, visual localisation and semantic segmentation and their translation into commercial products.

He said: "This is the ultimate honour for any scientist and recognises the amazing contribution of my students, collaborators and mentors in my 30 years at Cambridge. I am also very fortunate to be working in the field of computer vision and machine learning at a time of revolutionary progress and ground-breaking applications.”

Professor Douglas Easton FMedSci FRS

Professor of Genetic Epidemiology, Centre for Cancer Genetic Epidemiology, Department of Public Health and Primary Care, University of Cambridge

Easton’s main research interests are in cancer genetics. He analyses large population studies to identify genetic variants that predispose to cancer, and to understand how they combine with other factors to determine cancer risk. His work has characterised many important cancer genes such as BRCA1 and BRCA2, and identified hundreds of common cancer predisposition variants in the non-coding genome. He co-developed the BOADICEA risk prediction model now used worldwide to guide genetic counselling and cancer prevention.

He said: "I am truly delighted and honoured to be elected to the Fellowship of the Royal Society. This prestigious honour is a tribute to the work of many wonderful colleagues in Cambridge and worldwide, over many years, who have made the research possible."

Professor Robin Franklin FMedSci FRS

Formerly Professor of Stem Cell Medicine, Wellcome - MRC Cambridge Stem Cell Institute, University of Cambridge; now Principal Investigator, Altos Labs - Cambridge Institute

The central question of Franklin’s career is 'how do tissues regenerate?' To address this question, he has studied the brain, an organ notorious for its poor regenerative capacity. Working with many excellent colleagues, he has described how stem cells in the adult brain regenerate oligodendrocytes – the cells responsible for making the insulating myelin sheath around nerve fibres – once they are lost in diseases such as multiple sclerosis (MS); how this process declines with age; and how it can be reversed. The work has led to two regenerative medicine trials in MS.

He said: “I am absolutely delighted to have been elected a Fellow of the Royal Society - it is a huge honour.”

Professor Richard Gilbertson FMedSci FRS

Li Ka Shing Chair of Oncology and Head of Department of Oncology, University of Cambridge, Director of Cancer Research UK Cambridge Centre and Senior Group Leader, Cancer Research UK Cambridge Institute

Gilbertson, a paediatric physician-scientist, has identified the origins of common and aggressive childhood brain tumours and many of the genetic alterations that drive these tumours. His research has helped establish a direct link between disordered development and the multiple different brain tumour types observed in children: contributing directly to their classification by the World Health Organisation (WHO); changing the way conventional treatments are used, sparing children from unnecessary side effects; and underpinning clinical trials of new therapies.

Gilbertson said: “I am truly delighted and humbled to receive this recognition that I share with all the wonderful students, trainees and colleagues I have worked with over the years.”

Professor Paul Lehner FMedSci FRS

Professor of Immunology and Medicine, Cambridge Institute for Medical Research, University of Cambridge

Lehner studies virus-host antagonism and how our genome is defended from invasion by RNA-derived retroelements such as HIV. His discovery of the ‘HUSH’ epigenetic silencing complex explains how the genome distinguishes new genetic material from endogenous genes through recognition of intronless DNA. This work uncovered an unanticipated surveillance system that discriminates ‘self’ from ‘non-self’ genomic DNA and defends our genome against the reverse flow of genetic information (RNA to DNA), paving the way to novel applications in medicine and biotechnology.

Lehner said: “I’m absolutely delighted to be elected to the Fellowship of the Royal Society; I’ve been fortunate to work with incredibly talented people and this honour recognises the commitment of the many past and present members of my group who have contributed to our work.”

Professor Roberto Maiolino FRS

Director of the Kavli Institute for Cosmology and Professor of Experimental Astrophysics, University of Cambridge

Maiolino studies the formation of galaxies using observations collected at some of the largest ground-based and space telescopes. He has obtained key results on the interplay between the evolution of galaxies and the supermassive black holes at their centres. He has also investigated the enrichment of chemical elements across the cosmic epochs, as well as the origin and nature of dust particles in the early Universe.

He said: “I am truly honoured by such a prestigious appointment. Being a Fellow of the Royal Society will certainly foster my research activities and will allow me to further promote exciting, cutting-edge projects.”

Professor Angelos Michaelides FRS

1968 Professor of Chemistry, Yusuf Hamied Department of Chemistry, University of Cambridge

Michaelides’ work involves the development and application of theoretical methods to better understand contemporary problems in chemistry, physics, and materials science. His group places a particular focus on developing and applying computer simulation approaches that provide the fundamental molecular-level insight needed to help address contemporary global challenges related to water, energy, and the environment.  

He said: “Holy moly! I’m delighted to have been elected an FRS and very grateful to all the outstanding students, post-docs, collaborators, and mentors I’ve had over the years without whom this would never have happened.”

Professor Jason William Chin FMedSci FRS

Head, Centre for Chemical and Synthetic Biology, and Joint Head, Division of Protein and Nucleic Acid Chemistry, MRC Laboratory of Molecular Biology; Professor of Chemistry and Chemical Biology, Yusuf Hamied Department of Chemistry, University of Cambridge; Associate Faculty in Synthetic Genomics, Wellcome Sanger Institute 

Chin has engineered the genetic code of living cells to synthesise modified proteins and non-canonical polymers. To accomplish this, he created new translational machinery and codons to reprogram the genetic code, going well beyond prior work using amber suppression. He then completely synthesised a bacterial genome in which he reduced the number of sense codons in its genetic code. The codons freed up in this way were reassigned to encode non-canonical amino acids. Chin's fundamental advances have been widely used to drive discovery, including to define the molecular consequences of post-translational modifications, define protein interactions in cells, and provide mechanistic insight into enzymes.

The nine Cambridge researchers were all selected for their exceptional contributions to science.

It is an honour to welcome so many outstanding researchers from around the world into the Fellowship of the Royal Society.
Sir Adrian Smith, President of the Royal Society
