
Claims AI can boost workplace diversity are ‘spurious and dangerous’, researchers argue


Recent years have seen the emergence of AI tools marketed as an answer to lack of diversity in the workforce, from use of chatbots and CV scrapers to line up prospective candidates, through to analysis software for video interviews. 

Those behind the technology claim it cancels out human biases against gender and ethnicity during recruitment, instead using algorithms that read vocabulary, speech patterns and even facial micro-expressions to assess huge pools of job applicants for the right personality type and “culture fit”.

However, in a new report published in Philosophy & Technology, researchers from Cambridge’s Centre for Gender Studies argue that these claims make some uses of AI in hiring little better than an “automated pseudoscience” reminiscent of physiognomy or phrenology: the discredited beliefs that personality can be deduced from facial features or skull shape.

They say it is a dangerous example of “technosolutionism”: turning to technology to provide quick fixes for deep-rooted discrimination issues that require investment and changes to company culture.

In fact, the researchers have worked with a team of Cambridge computer science undergraduates to debunk these new hiring techniques by building an AI tool modelled on the technology, available at: https://personal-ambiguator-frontend.vercel.app/ 

The ‘Personality Machine’ demonstrates how arbitrary changes in facial expression, clothing, lighting and background can give radically different personality readings – and so could make the difference between rejection and progression for a generation of job seekers vying for graduate positions.            

The Cambridge team say that use of AI to narrow candidate pools may ultimately increase uniformity rather than diversity in the workforce, as the technology is calibrated to search for the employer’s fantasy “ideal candidate”.

This could see those with the right training and background “win over the algorithms” by replicating behaviours the AI is programmed to identify, and taking those attitudes into the workplace, say the researchers.  

Additionally, because algorithms are honed using past data, they argue that the candidates considered the best fit are likely to be those who most closely resemble the current workforce.

“We are concerned that some vendors are wrapping ‘snake oil’ products in a shiny package and selling them to unsuspecting customers,” said co-author Dr Eleanor Drage.

“By claiming that racism, sexism and other forms of discrimination can be stripped away from the hiring process using artificial intelligence, these companies reduce race and gender down to insignificant data points, rather than systems of power that shape how we move through the world.”

The researchers point out that these AI recruitment tools are often proprietary – or “black box” – so how they work is a mystery.

“While companies may not be acting in bad faith, there is little accountability for how these products are built or tested,” said Drage. “As such, this technology, and the way it is marketed, could end up as dangerous sources of misinformation about how recruitment can be ‘de-biased’ and made fairer.”

Despite some pushback – the EU’s proposed AI Act classifies AI-powered hiring software as “high risk”, for example – the researchers say that tools made by companies such as Retorio and HireVue are deployed with little regulation, and point to surveys suggesting that use of AI in hiring is snowballing.

A 2020 study of 500 organisations across various industries in five countries found that 24% of businesses had implemented AI for recruitment purposes and that 56% of hiring managers planned to adopt it within the next year.

Another poll of 334 leaders in human resources, conducted in April 2020, as the pandemic took hold, found that 86% of organisations were incorporating new virtual technology into hiring practices.  

“This trend was already in place as the pandemic began, and the accelerated shift to online working caused by COVID-19 is likely to see greater deployment of AI tools by HR departments in future,” said co-author Dr Kerry Mackereth, who presents the Good Robot podcast with Drage, in which the duo explore the ethics of technology.

Covid-19 is not the only factor, according to HR operatives the researchers have interviewed. “Volume recruitment is increasingly untenable for human resources teams that are desperate for software to cut costs as well as numbers of applicants needing personal attention,” said Mackereth.

Drage and Mackereth say many companies now use AI to analyse videos of candidates, interpreting personality by assessing regions of a face – similar to lie-detection AI – and scoring for the “big five” personality traits: extroversion, agreeableness, openness, conscientiousness, and neuroticism.

The undergraduates behind the ‘Personality Machine’, which uses a similar technique to expose its flaws, say that while their tool may not help users beat the algorithm, it will give job seekers a flavour of the kinds of AI scrutiny they might be under – perhaps even without their knowledge.

“All too often, the hiring process is oblique and confusing,” said Euan Ong, one of the student developers. “We want to give people a visceral demonstration of the sorts of judgements that are now being made about them automatically.

“These tools are trained to predict personality based on common patterns in images of people they’ve previously seen, and often end up finding spurious correlations between personality and apparently unrelated properties of the image, like brightness. We made a toy version of the sorts of models we believe are used in practice, in order to experiment with it ourselves,” Ong said. 
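
The mechanism Ong describes can be illustrated in a few lines of code. The sketch below is ours rather than the students’ tool: it fits a simple regression on synthetic data in which the “personality” labels carry a built-in lighting bias, and shows how the model ends up weighting image brightness far more heavily than any genuine trait signal.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Synthetic data: 500 "applicant" images summarised by two numbers each.
    rng = np.random.default_rng(42)
    n_images = 500
    brightness = rng.uniform(0.2, 0.9, n_images)   # arbitrary lighting conditions
    true_trait = rng.normal(0.0, 1.0, n_images)    # a weak genuine signal

    # Labels constructed with a lighting bias, standing in for a skewed training set.
    extroversion_label = 0.2 * true_trait + 2.0 * brightness + rng.normal(0, 0.1, n_images)

    model = LinearRegression().fit(np.c_[brightness, true_trait], extroversion_label)
    print("learned weights (brightness, trait):", model.coef_.round(2))
    # The brightness weight dominates, so re-lighting a photo shifts the "personality" score.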

Research highlights growing market in AI-powered recruitment tools that claim to bypass human bias to remove discrimination from hiring. 

While companies may not be acting in bad faith, there is little accountability for how these products are built or tested
Eleanor Drage
Co-author Dr Eleanor Drage testing the 'personality machine' built by Cambridge undergraduates.


Cambridge students first to receive new Ukraine STEM fund scholarships


The Ukraine Math and Science Achievement Fund launched this academic year with a $3 million donation from Ken Griffin, founder and CEO of U.S.-based global financial firm Citadel. The new multi-year scholarship program has been established to help exceptional Ukrainian students continue their science, technology, engineering, and maths (STEM) studies.

“Ukrainian students – including some of the brightest of their generation – have faced unimaginable hardship and have had their education and futures disrupted,” Ken Griffin said. “The Ukraine Math and Science Achievement Fund aims to ensure that these brilliant students have the opportunity to realise their academic ambitions and leave their mark on the world.”

The fund will provide tuition assistance and other support to help dozens of top Ukrainian students affected by the war pursue STEM studies at leading educational institutions around the world and will be administered by the Digital Harbor Foundation in Baltimore, Maryland.

Three of the initial Griffin Scholars hail from Lyceum 27 in Kharkiv, a city in Ukraine that has been particularly hard hit by the war. They include Ihor Pylaiev, an International Mathematical Olympiad (IMO) two-time gold medalist who, with a perfect score, was this year’s overall IMO winner; IMO two-time silver medalist Svyatoslav Deniskov; and IMO bronze medalist Vadym Hasseiev. Four additional Griffin Scholars – Liza Horokh, Roksolana Ivanchuk, Mariia Mikhnovska, and Anton Havrilyuk – studied in Ukraine's capital city of Kyiv. Like their Kharkiv peers, they have each medaled at various Olympiads, share a passion for maths and problem-solving, and have endured war-related hardships.

The fund has enabled four of these outstanding students to accept offers to study at the University of Cambridge this autumn; the three other members of the first cohort are attending Hills Road Sixth Form College in the city of Cambridge. Future funding rounds, beginning later this year, will allow additional talented Ukrainian students to take up educational opportunities around the world, with scholarships to study at universities in Europe and the United States.

The Ukraine Math and Science Achievement Fund builds on an effort led by Dr. Ferenc Huszár, an associate professor of computer science at Cambridge; Dr. Iryna Korshunova, a machine learning researcher; and a group of volunteers who have been providing the displaced Olympiad community from Ukraine with access to financial support, laptops, and educational resources since the start of the war. After learning about this grassroots effort, Ken Griffin was inspired to amplify this work and further support top STEM students seeking to pursue their studies and maximise their potential.

“The talent and determination these students have shown in recent months is inspiring,” said Dr. Huszár. “Despite the stress and uncertainty they have faced, they went on to solve complex problems, to learn, and to compete with incredible success. This fund not only enables them to study at top schools, but also recognises the hard work and talent of these students and their teachers.”

In addition to this new scholarship fund, Griffin’s firms, Citadel and Citadel Securities, also sponsored Ukraine’s Informatics Olympiad team earlier this year, when the war jeopardised the team’s ability to compete internationally. The funding enabled a group of students to train in Poland, compete in Olympiad competitions across Europe, and travel to Indonesia in August to compete in the International Olympiad in Informatics, the most prestigious computer science competition for secondary and high school students around the world.

The Ukrainian team placed first among all the European teams and sixth globally at the competition.

"The Olympiad has changed their lives,” said Anton Tsypko, the coach of the Ukrainian Olympiad team. “In the midst of an incredibly challenging period, it has opened new doors for these talented students and put them on a path to pursue their studies at top universities. I look forward to the contributions they’ll make within their academic communities and society more broadly.”

Members of the Olympiad team are among the students eligible to apply for scholarships from the Ukraine Math and Science Achievement Fund as they look to continue their studies at the world’s top universities upon graduation. Applications for funding this year and next will be accepted on a rolling basis beginning later this autumn at ukraineachievementfund.org. 

Four Cambridge undergraduates will become the first university scholars to receive support from a new global education fund created for talented Ukrainian maths and science students whose education has been affected by Russia’s invasion.

Ukrainian students – including some of the brightest of their generation – have faced unimaginable hardship and have had their education and futures disrupted
Ken Griffin


Assessments of thinking skills may misrepresent poor, inner-city children in the US


In a newly-published study of almost 500 children from high-poverty, urban communities in the United States, researchers found that a widely-used assessment, which measures the development of thinking skills called ‘executive functions’, did not fully and accurately evaluate students’ progress. The study links this to probable cultural bias in the assessment design and suggests that this may be replicated in other, similar tools.

Any such design flaw may have influenced a growing body of research which suggests that children from poorer backgrounds tend to start school with less well-developed executive functions. ‘Executive functions’ is a collective term for a set of essential thinking skills needed for everyday tasks and learning. They include working memory, self-control, the ability to ignore distractions and the ability to switch easily between tasks. Children with good executive functions tend to have better test scores, better mental health and greater employment potential.

One common method for measuring the healthy development of these skills involves asking teachers to complete questionnaires about children’s observed behaviours. The results can potentially help pinpoint children – or entire groups – who need extra support. They also provide a rich source of data for research on how executive functions develop.

In the new study, researchers found that one of these teacher rating scales, which has been widely used in the United States, was of limited value when assessing poorer, urban students. Specifically, they found that the executive function screener of a version of the Behaviour Assessment System for Children (BASC), called the BASC-2, “is not a good representation of everyday executive function behaviours by children from schools in high-poverty communities”.

The team, from the University of Cambridge (UK) and Virginia Commonwealth University (US), suggests that the likely cause is that both this scale, and others like it, have been developed using an unrepresentative sample of children.

Researchers have previously pointed out that these assessments tend to be modelled on children who are mostly from comfortable socio-economic settings. By mapping their observed behaviours on to executive functions, they may falsely assume that these behaviours are ‘normal’ markers for any child of the same age. In reality, children’s different backgrounds and lived experiences may mean that executive functions express themselves differently across different groups.

Annie Zonneveld, from the Faculty of Education, University of Cambridge and the study’s first author, said: “There is a big question around how we measure executive functions: are we actually using the right tools? If they are based on white, middle-class students, we cannot be sure that they would actually work for the whole population. We may be seeing evidence of that here.”

Michelle Ellefson, Professor of Cognitive Science at the Faculty of Education, said: “Teachers can provide us with really valuable data about children’s executive functions because they can monitor development in ways we could not possibly replicate in a lab, but they need effective measures to do this. This means the assessments must draw on information about children from different backgrounds.”

According to the Children’s Defense Fund, about 14% of children in the United States live in poverty. While nearly 50% of all children are from ethnic minority families, 71% of those in poverty are from these backgrounds. Most psychometric research on executive functions, however, focuses on white, middle-income, or affluent families. It has never been clear how far its findings can be generalised.

The new study examined the executive function components of two versions of the BASC: the BASC-2 and BASC-3. These ask teachers to observe children’s everyday behaviours and rate, on a scale of ‘never’ to ‘always’, how far they agree with statements such as “acts without thinking”, “is easily distracted”, “cannot wait to take turn”, “is a self-starter” and “argues when denied own way”. They then extrapolate information about the children’s executive functions based on the responses.

The researchers analysed two sample groups of children, aged around nine or 10, all from state schools in high-poverty, urban areas in the United States. In total, 472 children took part. The first sample was assessed using the BASC-2; the other using the BASC-3.

Both groups also completed six computer-based tasks which psychologists and neuroscientists use in lab-based settings to measure specific executive functions. The researchers looked at how far the scores from these computerised tasks – which are accurate but difficult to run with large groups – corresponded to the measures from the teacher-administered surveys.
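
This correspondence check amounts to measuring how well the two instruments agree child by child. A minimal sketch of the idea is below; the column names and values are hypothetical, not the study’s dataset.

    import pandas as pd

    # Hypothetical scores for the same six children from the two instruments.
    df = pd.DataFrame({
        "basc_working_memory": [52, 61, 47, 70, 58, 44],              # teacher rating scale (higher = more problems)
        "task_working_memory": [0.62, 0.48, 0.71, 0.35, 0.55, 0.74],  # computerised task accuracy
    })

    # A strong correlation (negative here, given the scoring directions) would
    # suggest the questionnaire tracks the same construct as the lab-style task.
    r = df["basc_working_memory"].corr(df["task_working_memory"], method="spearman")
    print(f"Spearman correlation: {r:.2f}")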

The findings indicated that while the BASC-2 provides a reasonable overview of students’ general executive functioning, it does not capture accurate details about specific functions like working memory and self-control. The BASC-3 was far more effective, probably because it uses a different and more focused set of questions.

“The BASC-2 has been used extensively in archived datasets and contributes to academic research about how executive functions develop,” Ellefson said. “It is really important to recognise that without modification, it is not an appropriate basis for making judgements about certain groups of children.”

The assessment is just one of many surveys that measure children’s cognitive development in different countries. “It is important that we know how these tools are establishing their baseline understanding of ‘typical’ development,” Zonneveld said. “If they are based on mostly white populations from affluent suburbs, they won’t necessarily be as representative as we might hope.”

The study is published in Developmental Science.

Some of the assessment tools that measure children’s thinking skills in the US may have provided inaccurate information about poor, urban students because they are modelled on wealthier – mostly white – populations.

There is a big question around how we measure executive functions: are we actually using the right tools?
Annie Zonneveld


Universal bus service to go electric, with extended route


As part of its commitment to sustainability and sustainable business travel, the University of Cambridge has agreed to award an eight-year contract to Whippet, to provide a fleet of new electric buses from July 2023.

The new contract will provide a ‘split service’, with half of Universal buses serving Girton College at the northern end of the route, and half routed along Grange Road and Newnham Road to better serve Wolfson College, with some returning to Hills Road to connect with Homerton College and the Faculty of Education.

Route U, the University bus for everyone, is subsidised by the University of Cambridge, and carries around 60,000 people per week, linking Eddington with West Cambridge, the city centre, the train station and the Cambridge Biomedical Campus. University staff and students use the service, as well as Eddington residents, sixth form students, shoppers, tourists and key workers.

The University’s Planning and Resources Committee approved the introduction of the electric service, replacing the existing diesel service, and the route extension. The £1 fare for University Card holders will be retained despite the significant additional investment from the University in the new bus service. 

Professor David Cardwell, Pro-Vice-Chancellor for Strategy and Planning, said: “We are delighted to announce that the Universal bus service will operate with a fleet of new electric buses from next year, and that students, staff and members of the public will soon be able to also use the service at Girton College, Homerton College and Wolfson College. 

“Colleagues, in particular the University’s Sustainability Team, have worked hard on what are significant enhancements to the existing service, and have welcomed the positive working relationship they have had with student representatives, and their perspective on the service.

“The Universal bus is a pioneering initiative, and this new contract will support the University’s commitment to environmental sustainability, in particular sustainable business travel, as well as its work to meet the access requirements of all students.”

Universal Bus – The University bus for everyone

New electric buses will operate on the Universal bus route from next year, with the service extended to serve Girton College, Homerton College and Wolfson College.

The Universal bus is a pioneering initiative, and this new contract will support the University’s commitment to environmental sustainability, in particular sustainable business travel.
Professor David Cardwell, Pro-Vice-Chancellor for Strategy and Planning

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Yes

Scientists detect dementia signs as early as nine years ahead of diagnosis


In research published today in Alzheimer's & Dementia: The Journal of the Alzheimer's Association, the team analysed data from the UK Biobank and found impairment in several areas, such as problem solving and number recall, across a range of conditions.

The findings raise the possibility that in the future, at-risk patients could be screened to help select those who would benefit from interventions to reduce their risk of developing one of the conditions, or to help identify patients suitable for recruitment to clinical trials for new treatments.

There are currently very few effective treatments for dementia or other neurodegenerative diseases such as Parkinson’s disease. In part, this is because these conditions are often only diagnosed once symptoms appear, whereas the underlying neurodegeneration may have begun years – even decades – earlier. This means that by the time patients take part in clinical trials, it may already be too late in the disease process to alter its course.

Until now, it has been unclear whether it might be possible to detect changes in brain function before the onset of symptoms. To help answer this question, researchers at the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust turned to UK Biobank, a biomedical database and research resource containing anonymised genetic, lifestyle and health information from half a million UK participants aged 40-69.

As well as collecting information on participants’ health and disease diagnoses, UK Biobank collected data from a battery of tests including problem solving, memory, reaction times and grip strength, as well as data on weight loss and gain and on the number of falls. This allowed them to look back to see whether any signs were present at baseline – that is, when measurements were first collected from participants (between five and nine years prior to diagnosis).

People who went on to develop Alzheimer’s disease scored more poorly compared to healthy individuals when it came to problem solving tasks, reaction times, remembering lists of numbers, prospective memory (our ability to remember to do something later on) and pair matching. This was also the case for people who developed a rarer form of dementia known as frontotemporal dementia.
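
In practical terms, this kind of analysis compares baseline test scores, collected years before any diagnosis, with those of people who remained healthy. The sketch below illustrates the shape of such a comparison using hypothetical column names and synthetic values, not UK Biobank data.

    import pandas as pd
    from scipy import stats

    # Synthetic baseline data: a pair-matching score recorded at first assessment,
    # alongside whether a diagnosis followed years later.
    df = pd.DataFrame({
        "later_diagnosis": ["alzheimers", "none", "none", "alzheimers", "none", "alzheimers"],
        "baseline_pairs_matched": [3, 6, 5, 2, 7, 4],
    })

    cases = df.loc[df["later_diagnosis"] == "alzheimers", "baseline_pairs_matched"]
    controls = df.loc[df["later_diagnosis"] == "none", "baseline_pairs_matched"]
    t, p = stats.ttest_ind(cases, controls)
    print(f"mean score, future cases: {cases.mean():.1f} vs controls: {controls.mean():.1f} (p = {p:.3f})")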

People who went on to develop Alzheimer’s were more likely than healthy adults to have had a fall in the previous 12 months. Those patients who went on to develop a rare neurological condition known as progressive supranuclear palsy (PSP), which affects balance, were more than twice as likely as healthy individuals to have had a fall.

For every condition studied – including Parkinson’s disease and dementia with Lewy bodies – patients reported poorer overall health at baseline.

First author Nol Swaddiwudhipong, a junior doctor at the University of Cambridge, said: “When we looked back at patients’ histories, it became clear that they were showing some cognitive impairment several years before their symptoms became obvious enough to prompt a diagnosis. The impairments were often subtle, but across a number of aspects of cognition.

“This is a step towards us being able to screen people who are at greatest risk – for example, people over 50 or those who have high blood pressure or do not do enough exercise – and intervene at an earlier stage to help them reduce their risk.”

Senior author Dr Tim Rittman from the Department of Clinical Neurosciences at the University of Cambridge added: “People should not be unduly worried if, for example, they are not good at recalling numbers. Even some healthy individuals will naturally score better or worse than their peers. But we would encourage anyone who has any concerns or notices that their memory or recall is getting worse to speak to their GP.”

Dr Rittman said the findings could also help identify people who can participate in clinical trials for potential new treatments. “The problem with clinical trials is that by necessity they often recruit patients with a diagnosis, but we know that by this point they are already some way down the road and their condition cannot be stopped. If we can find these individuals early enough, we’ll have a better chance of seeing if the drugs are effective.”

The research was funded by the Medical Research Council with support from the NIHR Cambridge Biomedical Research Centre.

Reference
Swaddiwudhipong, N, et al. Pre-Diagnostic Cognitive and Functional Impairment in Multiple Sporadic Neurodegenerative Diseases. Alzheimer's & Dementia; 13 Oct 2022; DOI: 10.1002/alz.12802

Cambridge scientists have shown that it may be possible to spot signs of brain impairment in patients as early as nine years before they receive a diagnosis for one of a number of dementia-related diseases.

When we looked back at patients’ histories, it became clear that they were showing some cognitive impairment several years before their symptoms became obvious enough to prompt a diagnosis
Nol Swaddiwudhipong

Watching lithium in real time could improve performance of EV battery materials


The team, led by the University of Cambridge, tracked the movement of lithium ions inside a promising new battery material in real time.

It had been assumed that the mechanism by which lithium ions are stored in battery materials is uniform across the individual active particles. However, the Cambridge-led team found that during the charge-discharge cycle, lithium storage is anything but uniform.

When the battery is near the end of its discharge cycle, the surfaces of the active particles become saturated by lithium while their cores are lithium deficient. This results in the loss of reusable lithium and a reduced capacity.

The research, funded by the Faraday Institution, could help improve existing battery materials and could accelerate the development of next-generation batteries. The results are published in Joule.

Electric vehicles (EVs) are vital in the transition to a zero-carbon economy. Most electric vehicles on the road today are powered by lithium-ion batteries, due in part to their high energy density.

However, as EV use becomes more widespread, the push for longer ranges and faster charging times means that current battery materials need to be improved, and new materials need to be identified.

Some of the most promising of these materials are state-of-the-art positive electrode materials known as layered lithium nickel-rich oxides, which are widely used in premium EVs. However, their working mechanisms, particularly lithium-ion transport under practical operating conditions, and how this is linked to their electrochemical performance, are not fully understood, so we cannot yet obtain maximum performance from these materials.

By tracking how light interacts with active particles during battery operation under a microscope, the researchers observed distinct differences in lithium storage during the charge-discharge cycle in nickel-rich manganese cobalt oxide (NMC).

“This is the first time that this non-uniformity in lithium storage has been directly observed in individual particles,” said co-first author Alice Merryweather, from Cambridge’s Yusuf Hamied Department of Chemistry. “Real time techniques like ours are essential to capture this while the battery is cycling.”

Combining the experimental observations with computer modelling, the researchers found that the non-uniformity originates from drastic changes to the rate of lithium-ion diffusion in NMC during the charge-discharge cycle. Specifically, lithium ions diffuse slowly in fully lithiated NMC particles, but the diffusion is significantly enhanced once some lithium ions are extracted from these particles.

“Our model provides insights into the range over which lithium-ion diffusion in NMC varies during the early stages of charging,” said co-first author Dr Shrinidhi Pandurangi from Cambridge’s Department of Engineering. “Our model predicted lithium distributions accurately and captured the degree of heterogeneity observed in experiments. These predictions are key to understanding other battery degradation mechanisms such as particle fracture.”
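
The qualitative behaviour the team describes can be reproduced with a toy transport model. The sketch below is not the authors’ model: it solves a one-dimensional diffusion equation with an illustrative diffusivity that collapses as the local lithium fraction approaches 1, which is enough to produce a lithium-saturated surface over a lithium-poor core during discharge.

    import numpy as np

    # Toy 1D particle: x = 0 is the core, x = 1 the surface; c is the lithium fraction.
    N = 200
    x = np.linspace(0.0, 1.0, N)
    dx = x[1] - x[0]
    c = np.full(N, 0.2)                       # partly delithiated after charging

    def D(c):
        # Illustrative diffusivity: fast when lithium-poor, tiny near full lithiation.
        return 1e-3 * (1.0 - c) + 1e-6

    J_in = 1.5e-3                             # lithiation flux at the surface (discharge)
    dt = 0.4 * dx**2 / 1e-3                   # within the explicit-scheme stability limit
    for _ in range(30000):
        D_face = 0.5 * (D(c[:-1]) + D(c[1:]))            # diffusivity at cell faces
        F = -D_face * np.diff(c) / dx                    # interior fluxes
        F = np.concatenate(([0.0], F, [-J_in]))          # no-flux core, inflow at surface
        c = c - dt * np.diff(F) / dx
        c = np.clip(c, 0.0, 1.0)                         # guard against overshoot

    print(f"lithium fraction at surface: {c[-1]:.2f}, at core: {c[0]:.2f}")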

Importantly, the lithium heterogeneity seen at the end of discharge establishes one reason why nickel-rich cathode materials typically lose around ten percent of their capacity after the first charge-discharge cycle.

“This is significant, considering one industrial standard that is used to determine whether a battery should be retired or not is when it has lost 20 percent of its capacity,” said co-first author Dr Chao Xu, from ShanghaiTech University, who completed the research while based at Cambridge.

The researchers are now seeking new approaches to increase the practical energy density and lifetime of these promising battery materials.

The research was supported in part by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Alice Merryweather is jointly supervised by Professor Dame Clare Grey and Dr Akshay Rao, who are both co-authors on the current paper.  

Reference:
Chao Xu et al. ‘Operando visualization of kinetically induced lithium heterogeneities in single-particle layered Ni-rich cathodes.’ Joule (2022). DOI: 10.1016/j.joule.2022.09.008

Researchers have found that the irregular movement of lithium ions in next-generation battery materials could be reducing their capacity and hindering their performance.


Offshore carbon storage deployment and research needs to scale up for UK to deliver net zero pledge, says report


Published by the Royal Society and led by University of Cambridge researchers, Locked Away – Geological Carbon Storage explores the latest evidence and technical considerations for permanently storing CO2 by pumping it into deep saline aquifers or depleted oil and gas fields offshore.

Alongside sustained reductions in carbon emissions, international bodies and the UK’s Committee on Climate Change identified carbon capture and storage (CCS) as a critical technology in most possible routes to achieving net zero.  

However, levels of CCS deployment have been slow and are, globally, ‘well below those anticipated to be needed to limit global warming to 1.5°C, or 2°C’, the report warns.

“Geological carbon storage will be an essential part of our long-term energy transition, both in storing emissions from hard-to-decarbonise industries, and for longer-term removal of CO2 through direct air capture,” said Professor Andy Woods from Cambridge’s Institute for Energy and Environmental Flows (IEEF), chair of the report’s working group.

“The UK’s access to potential storage sites in its offshore waters, along with a strong industrial base and regulatory and assurance environment, mean this could be an important industry.

“But thousands of wells are likely to be needed globally, and each new subsurface reservoir can take years to develop to ensure its suitability.”

Scaling up

The policy briefing considers the latest geoscience evidence and lessons from current and planned CCS projects that could inform policymakers if they pursue geological carbon storage.

It also looks at the challenges of scaling up CCS, including outstanding research and policy questions relating to transport, storage, monitoring, sustainable business models and incentives.

The IPCC special report on global warming of 1.5°C and research by the International Energy Agency suggest that 7-8 gigatonnes of CO2 will need to be stored globally each year by 2050 to keep warming below 1.5°C: this represents over 20% of present global annual fossil fuel and industrial emissions (roughly 34 gigatonnes of CO2 per year).

By 2100, cumulative storage of between 350-1200 gigatonnes of CO2 is likely to be needed to avoid the worst effects of climate change.

For the UK to deliver on its net zero carbon emissions pledge, it needs to develop new wells – and the associated injection, transport and storage infrastructure – capable of storing about 75-175 megatonnes of CO2 every year by 2050, according to the UK North Sea Transition Authority.

With CO2 injection rates currently constrained by pressurisation limits, and a 5-7 year timeframe to deploy a new reservoir, the report’s expert working group estimates this will require the equivalent of around one new carbon storage system, capable of injecting 4-5 megatonnes of CO2 per year, being added each year to 2050.
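
The arithmetic behind that estimate can be checked roughly, taking the figures quoted above at face value (the report’s own working also accounts for ramp-up and the 5-7 year lead time on each reservoir):

    # Back-of-envelope check of the UK scale-up figures quoted above.
    uk_target_mt_per_year = (75, 175)      # MtCO2 per year needed by 2050
    per_system_mt_per_year = (4, 5)        # injection capacity of one new storage system
    years_to_2050 = 2050 - 2023

    for target in uk_target_mt_per_year:
        fewest = target / per_system_mt_per_year[1]
        most = target / per_system_mt_per_year[0]
        print(f"{target} Mt/yr needs roughly {fewest:.0f}-{most:.0f} systems, "
              f"i.e. about {fewest / years_to_2050:.1f}-{most / years_to_2050:.1f} added per year to 2050")

The result, between roughly 15 and 44 systems built over a little under three decades, is broadly consistent with the working group’s figure of around one new system per year.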

Sustained investment

To date, the upfront capital costs, lack of sufficient and predictable incentives to support operating costs, and concerns over the social acceptability in many jurisdictions have contributed to a global under-deployment of CCS.

The Global CCS Institute’s 2021 survey lists 27 CCS projects as being operational, capturing 36.6 megatonnes of CO2 per year, with a further 62 projects listed as either in construction or advanced development. If successfully deployed, the combined capture potential would be 86.4 megatonnes of CO2 per year.

A UK target of delivering CCS in four industrial clusters, set under the previous government, aims to capture and store around 20-30 megatonnes of CO2 each year, with Phase 1 sites – in the East Coast Cluster (Teesside plus Humber) and HyNet in the North West – targeting delivery in the middle of this decade.

Scaling up to the required capacity, the report says, demands enormous and sustained global investment to 2050 to build injection wells, transport networks and monitoring technologies, and to train the skilled workforce needed to install hundreds of new wells each year.

“We have technology to store and monitor carbon in this way,” said Woods.

“But as deployment of these technologies rolls out, there will likely be many new challenges, especially since each storage reservoir has its own unique geological structure and setting.

“So we need to continue to invest in research, and the policy and regulatory frameworks that are required to scale up safely and at pace.”

In particular, the report highlights the need to understand the storage capacity and properties of different geological formations; the critical pressures which might cause seal rocks to fail and leak; different monitoring strategies for detecting CO2 leaks; some of the geochemical processes involved; and the potential to increase capacity in old wells.

There is also a need for ongoing, effective public dialogue to highlight the importance of carbon storage in mitigating climate change, and to understand and address the concerns of communities and citizens.

Adapted from a story by the Royal Society.

The UK will need to step up research and deployment of new offshore carbon storage wells if it is to achieve the capacity required to deliver its net zero emissions plans, a new report says.

Geological carbon storage will be an essential part of our long-term energy transition, both in storing emissions from hard-to-decarbonise industries, and for longer-term removal of CO2 through direct air capture
Andy Woods

UK policing: psychological damage among officers heightened by bad working conditions


High levels of trauma-related mental health disorders across UK police forces are partly the result of bad working conditions such as having too little time, sexual harassment, and dealing with difficult situations without support, according to a study led by the University of Cambridge. 

However, officers who said they felt supported by colleagues and had a sense of doing meaningful work showed rates of one form of PTSD at around half the national average for policing staff.

Researchers behind the study say their findings suggest that simple improvements to the working lives of police – scheduled time for support from peers and supervisors, for example – could dramatically reduce the level of psychiatric problems in UK forces. 

Sociologists surveyed thousands of police personnel across the country in 2018 and found that 12% showed clinical symptoms of Complex Post-Traumatic Stress Disorder (C-PTSD), a chronic condition in which repeated trauma exposure causes social disconnection, feelings of worthlessness, and an inability to regulate emotions.

Complex PTSD often leads to 'burnout' and substance abuse. In fact, 90% of police workers in the original survey study The Job, The Life had experienced trauma, and one in five of these reported symptoms of either PTSD or C-PTSD.

Now, the same team of researchers have analysed survey data provided by 12,248 serving police officers to determine the working conditions and on-the-job situations with the strongest links to Complex PTSD. The latest findings are published in the journal Policing.

Trauma detailed by officers with probable levels of Complex PTSD based on the survey screening included dealing with fatal car accidents, rapes, homicides, suicides – including of children – and drug overdoses.  

Exposure to physical violence made little difference to rates of C-PTSD, nor did long working hours.

However, officers who described it as “very difficult” to take time away from the job for personal or family matters had C-PTSD rates over 50% higher than the UK-wide average for police.  

Those who described the relationship between their work and personal life as “not fitting well at all” – some 15% of police officers in the study – had twice the average policing rate of C-PTSD, at 24%.

One officer suffering with probable C-PTSD described how what you see “impacts on your life outside of work”, offering the example of cases involving dead children that “make you anxious about your own children's wellbeing. To a degree you lose your innocence.”    

Another C-PTSD sufferer said “it is a given and accepted” that the job means exposure to trauma, and described the occupational health team in their force as “brilliant” but few in number. “They are only able to put 'sticky plasters' on, and send the officers back out,” the officer said.

Police officers who described never having enough time to “get the job done” had almost double the rates of C-PTSD as the average across UK forces, 22% compared to 12%, as did officers who reported experiencing sexual harassment – whether from the public or colleagues.    

Officers who said they could never rely on the help and support of colleagues were most likely to suffer with Complex PTSD, with over 43% displaying symptoms, but such claims were relatively rare.

One detective with C-PTSD symptoms recounted dealing with sexual abuse cases as the sole investigating officer. “Little or no support from management. Victims hanging all their hopes and pressures on me.”

By contrast, C-PTSD rates were just 7% among those who said they could always rely on colleagues, and just 6% among those who say they regularly get a feeling of a job well done, with researchers claiming that a sense of meaningful work may provide a 'protective effect' mentally.  
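
The comparisons behind these figures amount to breaking the survey sample into groups by reported working conditions and computing the share of each group that screens positive for C-PTSD. A minimal sketch is below; the column names and values are hypothetical, not the study’s dataset.

    import pandas as pd

    # Hypothetical survey rows: reported colleague support and whether the officer
    # met the C-PTSD screening threshold.
    df = pd.DataFrame({
        "colleague_support": ["never", "sometimes", "always", "always", "never", "sometimes"],
        "meets_cptsd_threshold": [1, 0, 0, 0, 1, 1],
    })

    prevalence = (df.groupby("colleague_support")["meets_cptsd_threshold"]
                    .mean()
                    .mul(100)
                    .round(1))
    print(prevalence)   # % screening positive within each support group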

“Our research shows that the debilitating psychological misery often caused by trauma exposure isn’t an inevitable part of the difficult job of policing, it is exacerbated by poor working conditions,” said Prof Brendan Burchell, lead author from Cambridge’s Department of Sociology.    

The team also conducted analyses beyond individual officers to compare forces, revealing a strong link between ‘work intensity’ – those forces with more officers reporting a lack of time to effectively police – and increased rates of Complex PTSD.

Of 18 anonymised UK police forces, the one with the highest reported time constraints among officers had C-PTSD rates of 29%, well over double the average for the overall policing population.

“Severe austerity cuts since 2010 leading to a marked reduction in police numbers without a decrease in the demands of the job inevitably creates more time pressure for remaining officers,” said Burchell.

“Single-crewing, shift work and fewer resources mean that time for encouraging words between colleagues or space for officers to acknowledge their traumatic experiences are few and far between.”

One officer with probable C-PTSD described being “single crewed” at a rural location for a year, with nearest support almost an hour away. Another spoke of going from a shift team of five to working alone. “My coping strategy of being around colleagues who had been to the same fatal accident or suicide was taken away from me.” 

Cambridge co-author Dr Jessica Miller, who is also director of research for Police Care UK, the charity that funded the research, added: “The police forces reporting the best working conditions had much lower rates of PTSD. Modest investments to improve their working conditions could see significant reductions in psychological problems among police officers.”

Nationwide study of over 12,000 officers suggests rates of trauma-induced disorder Complex PTSD are exacerbated by factors such as too little time and support, and lack of say over working hours.

The debilitating psychological misery often caused by trauma exposure isn’t an inevitable part of the difficult job of policing
Brendan Burchell


Cambridge researchers join new £2 million UK consortium to tackle monkeypox outbreak


The consortium has received £2 million from the Biotechnology and Biosciences Research Council and the Medical Research Council, both part of UK Research and Innovation (UKRI). It is led by the Pirbright Institute and the MRC-University of Glasgow Centre for Virus Research.

Researchers will work closely with experts at government agencies – the Animal and Plant Health Agency, UK Health Security Agency, and Defence Science and Technology Laboratory – to study the current outbreak and inform the public health response in the UK and internationally.

Cambridge scientists Professor Geoffrey Smith from the Department of Pathology and Professor Mike Weekes from the Cambridge Institute for Medical Research and Department of Medicine are among the key scientists involved in the consortium.

Professor Weekes said: "Monkeypox has become a really important global pathogen, reaching more than 50 countries worldwide in a matter of months. Although we have an effective vaccine and treatment, global roll-out has so far proved challenging, emphasising the importance of a comprehensive understanding of this virus. The UK consortium includes researchers from multiple different disciplines, and I anticipate the data we generate will rapidly help understand how the virus can be targeted in new ways to prevent disease."

Professor Smith said: “Few would have predicted that monkeypox virus would be causing a global epidemic in 2022. The ability to respond quickly to this new challenge has been helped greatly not just by the swift and welcome response of UKRI, but also by decades of support for the study of orthopoxviruses from UKRI and the Wellcome Trust. The information gained from those studies is valuable in the fight against monkeypox virus.”

The monkeypox virus outbreak originated in West Africa. The current worldwide outbreak of cases spreading outside this area was first identified in May 2022. This is the first time that many monkeypox cases and clusters have been reported in non-endemic areas.

In the UK there have been more than 3,400 confirmed cases since May, although case numbers are currently falling. Internationally, WHO reports it has spread to 106 countries and territories with 25 confirmed deaths.

Professor Melanie Welham, Executive Chair of BBSRC, said: “One of the real strengths of the UK’s scientific response to disease outbreaks is the way that we can draw on leading researchers from all over the country, who can pool their expertise to deliver results, fast. Long-term support for animal and human virus research has ensured we have the capability to respond with agility.

“This new national consortium will study the unprecedented monkeypox outbreak to better understand how to tackle it. This will feed rapidly into global public health strategies, developing new diagnostic tests and identifying potential therapies.”

The consortium will focus on building our understanding in a number of key areas, including:

Developing new tests and identifying potential control measures:

  • Developing sensitive point-of-care tests to speed up diagnosis, such as lateral flow tests or LAMP (loop-mediated isothermal amplification) tests. The lateral flow test development will be conducted with Global Access Diagnostics (GADx) to develop a product which could later be manufactured at scale and used clinically worldwide, including in low- and middle-income countries.
  • Screening potential drugs to treat monkeypox in human cells in the lab to determine which ones could be developed for further testing.
  • Studying the virus, how it infects humans and its susceptibility to the immune response to identify targets for future therapies.

Studying the virus:

  • Characterising the genome of the virus and studying how it is evolving, and how this is linked to changes in the transmission and pathology of the virus.
  • Understanding the human immune response to the virus and the vaccine, including studying samples from infected individuals.
  • Identifying animal reservoirs and potential spill-over routes of transmission between animals and humans.

Learning from the vaccine roll-out:

  • Studying the effectiveness of the smallpox vaccine by tracking the immune responses after primary and secondary vaccination of up to 200 individuals.

Professor Bryan Charleston, co-lead from The Pirbright Institute, said: “The implications of the current monkeypox outbreak are huge. As well as tackling the current outbreak, we also need to be fully prepared for the next outbreak, because worldwide there’s a huge reservoir of infection. One of the key ways we can do this is to develop rapid tests, which are very important to help clinicians on the front line to manage the disease.”

Professor Massimo Palmarini, co-lead from the MRC-University of Glasgow Centre for Virus Research, said: “Monkeypox is a public health challenge, so taking decisive, collective action to better understand this virus is paramount. By bringing together research expertise in different areas, we will harness the UK’s world-leading knowledge to learn more about how the virus works and spreads and provide the foundations for the development of potential new treatments.”

Adapted from a press release from UKRI

Cambridge is among 12 institutions across the UK that will be working together to tackle the monkeypox outbreak, developing better diagnostic tests, identifying potential therapies and studying vaccine effectiveness and the virus’ spread.

Few would have predicted that monkeypox virus would be causing a global epidemic in 2022
Geoffrey Smith

Clinical trial for new stem cell-based treatment for Parkinson’s disease given go ahead


The Swedish Medical Products Agency has granted approval for the trial to proceed; ethical approval has already been obtained from the Swedish Ethics Review Authority. The team, led from Lund University in Sweden, is poised to begin recruitment.

STEM-PD uses human embryonic stem cells, a type of cell that can turn into almost any type of cell in the body. The team has ‘programmed’ the cells to develop into dopamine nerve cells, which will be transplanted into the brains of patients to replace cells that are lost in Parkinson’s disease. The product has already been shown to be safe and effective at reversing motor deficits in animal models of Parkinson’s disease.

The trial is a collaboration with colleagues at Skåne University Hospital, the University of Cambridge, Cambridge University Hospitals NHS Foundation Trust (CUH), and Imperial College London.

Professor Roger Barker from the Wellcome-MRC Stem Cell Institute at the University of Cambridge and CUH, clinical lead on the project, said: “The use of stem cells will in theory enable us to make unlimited amounts of dopamine neurons and thus opens the prospect of producing this therapy to a wide patient population. This could transform the way we treat Parkinson’s disease.”

This is the first such trial in Europe and the preclinical and clinical studies of STEM-PD have been funded by national and EU funding agencies. In addition, the STEM-PD team has obtained funding and valuable support for the current study from Novo Nordisk; a collaboration which will continue for future product development.

The cells to be used in the trial have been manufactured under ‘good manufacturing practice’ at the Royal Free Hospital in London and have undergone rigorous testing in the lab.

Professor Malin Parmar who leads the STEM-PD team from Lund University said: “We are looking forward to this clinical study of STEM-PD, hoping that it could potentially help address the significant burden of Parkinson’s disease. This has been a massive team effort for over a decade, and the regulatory approval is a major and important milestone.”

The STEM-PD trial will assess the safety and tolerability of the transplanted product one year after transplantation, as well as its effects on Parkinson’s symptoms. The trial will enrol eight patients for transplantation, starting with patients from Sweden, with plans to subsequently enrol patients from Cambridge University Hospitals. All transplantation surgery will be performed at Skåne University Hospital in Lund.

Parkinson’s disease is the second most common neurodegenerative disease worldwide, yet remains without a cure. Typical motor symptoms of Parkinson’s disease are slowness of movement, tremor and stiffness and later also gait difficulties. It is not well known how the disease arises or develops, but the core feature common to all patients is the loss of dopamine neurons in the midbrain.

Suitable patients will be invited to participate in the trial; it is not possible to volunteer to participate.

Adapted from a press statement from Lund University

Cambridge researchers will play a key role in clinical trials of a new treatment that involves transplanting healthy nerve cells into the brains of patients with Parkinson’s disease.

This could transform the way we treat Parkinson’s disease
Roger Barker

Artificial intelligence powers record-breaking all-in-one miniature spectrometers


We see light and colours around us every day. To analyse the information they carry, however, light must be examined with spectrometers in the lab. These devices detect sparkles and substances that our eyes would otherwise not notice.

Now, an international team, including researchers from the University of Cambridge, has designed a miniaturised spectrometer that breaks all current resolution records, and does so in a much smaller package, thanks to computational programmes and artificial intelligence.

The new miniaturised devices could be used in a broad range of sectors, from checking the quality of food to analysing starlight or detecting faint clues of life in outer space. The results are reported in the journal Science.

Traditionally, spectrometers rely on bulky components to filter and disperse light. Modern approaches simplify these components to shrink footprints, but still suffer from limited resolution and bandwidth. Additionally, traditional spectrometers are heavy and take up extraordinary amounts of space, which limits their applications in portable and mobile devices.

To tackle these problems, and shrink the size of the system, researchers have coupled layered materials with artificial intelligence algorithms. The result is an all-in-one spectrometer thousands of times smaller than current commercial systems. At the same time, it offers performance comparable to benchtop systems. In other words, these new spectrometers will provide portable alternatives to uncover otherwise invisible information, without even going into the lab.

“We eliminate the need for detector arrays, dispersive components, and filters. It’s an all-in-one, miniaturised device that could revolutionise this field,” said Dr Hoon Hahn Yoon, from Aalto University in Finland, first author of the paper. This spectrometer-on-chip technology is expected to offer high performance and new usability across science and industry.

The detector uses van der Waals heterostructures – a ‘sandwich’ of different ingredients, including graphene, molybdenum disulfide, and tungsten diselenide. Different combinations of material components enable light detection beyond the visible spectrum, as far as the near-infrared region. This means the spectrometer detects more than just colour, enabling applications such as chemical analysis and night vision.

“We detect a continuum spectrum of light, opening a world of possibilities in a myriad of markets,” said Yoon. “Exploring other material combinations could uncover further functionalities, including even broader hyperspectral detection and improved resolution.”

Artificial intelligence is a key aspect of these devices, which are commonly called ‘computational’ spectrometers. The algorithms compensate for the increase in noise that inevitably occurs when the optical components are removed entirely.

“We were able to use mathematical algorithms to successfully reconstruct the signals and spectra – it’s a profound and transformative technological leap,” said lead author Professor Zhipei Sun, also from Aalto University and a former member of Cambridge’s Department of Engineering. “The current design is just a proof of concept. More advanced algorithms, as well as different combinations of materials, could soon provide even better miniaturised spectrometers.”
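
To illustrate the principle behind such ‘computational’ spectrometers, the sketch below (in Python) reconstructs a spectrum from a handful of broadband detector readings using Tikhonov-regularised least squares, one common approach to this kind of inverse problem. It is not the authors’ code: the responsivity curves, the ‘true’ spectrum and the regularisation strength are all assumptions made for the example.

import numpy as np

# Illustrative sketch only. A computational spectrometer records a few broadband,
# overlapping detector responses and then solves an inverse problem to recover the spectrum.
rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 1000, 200)                  # nm, visible to near-infrared

n_channels = 16                                            # e.g. one reading per bias setting
centres = np.linspace(450, 950, n_channels)
responsivities = np.exp(-0.5 * ((wavelengths - centres[:, None]) / 120.0) ** 2)

# A made-up 'true' spectrum: two peaks on a smooth background.
true_spectrum = (0.3
                 + np.exp(-0.5 * ((wavelengths - 550) / 40) ** 2)
                 + 0.6 * np.exp(-0.5 * ((wavelengths - 800) / 60) ** 2))

# Each measurement is a responsivity-weighted integral of the spectrum, plus a little noise.
measurements = responsivities @ true_spectrum
measurements += rng.normal(scale=0.01 * measurements.max(), size=n_channels)

# Tikhonov-regularised least squares: a curvature penalty tames the noise amplification
# that comes from removing all optical filtering.
second_diff = np.diff(np.eye(wavelengths.size), n=2, axis=0)
alpha = 1e-3
lhs = responsivities.T @ responsivities + alpha * second_diff.T @ second_diff
rhs = responsivities.T @ measurements
reconstruction = np.linalg.solve(lhs, rhs)

print("worst-case error (arbitrary units):", float(np.max(np.abs(reconstruction - true_spectrum))))

With only 16 broad channels the reconstruction is necessarily approximate; real devices combine richer, electrically tunable response sets with more sophisticated algorithms to reach the resolutions reported here.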

Spectrometers are used for toxin detection in food and cosmetics, cancer imaging, and in spacecraft – including the James Webb Space Telescope. And they will soon become more common thanks to the development and advancement of technologies such as the Internet of Things and Industry 4.0.

The detection of light – and the full analysis of spectroscopic information – has applications in sensing, surveillance, smart agriculture, and more. Among the most promising applications for miniaturised spectrometers are chemical and biochemical analysis, thanks to the capabilities of the devices to detect light in the infrared wavelength range.

The new devices could be incorporated into instruments like drones, mobile phones, and lab-on-a-chip platforms, which can carry out several experiments in a single integrated circuit. The latter also opens up opportunities in healthcare. In this field, spectrometers and light-detectors are already key components of imaging and diagnostic systems – the new miniaturised devices could enable the simultaneous visualisation and detection of ‘chemical fingerprints’, leading to possibilities in the biomedical area.

“Our miniaturised spectrometers offer high spatial and spectral resolution at the micrometre and nanometre scales, which is particularly exciting for responsive bio-implants and innovative imaging techniques,” said co-author Professor Tawfique Hasan, from the Cambridge Graphene Centre.

This technology has huge potential for scalability and integration, thanks to its compatibility with well-established industrial processes. It could pave the way for next-generation smartphone cameras that evolve into hyperspectral cameras, capturing information that conventional colour cameras cannot. The researchers hope their contribution is a stepping stone towards more advanced computational spectrometers with record-breaking accuracy and resolution. This example, they say, is just the first of many.

Reference:
Hoon Hahn Yoon et al. ‘Miniaturized Spectrometers with a Tunable van der Waals Junction.’ Science (2022). DOI: 10.1126/science.add8544.

Using Artificial Intelligence (AI) to replace optical and mechanical components, researchers have designed a tiny spectrometer that breaks all current resolution records.

On-chip spectrometer on a fingertip

Likelihood of receiving an autism diagnosis may depend on where you live

Autistic child

The latest findings, from researchers from the University of Cambridge in collaboration with researchers from the London School of Economics and Political Science and Newcastle University, are published today in The Lancet Child & Adolescent Health.

After analysing all new autism cases across England, using NHS health service boundaries to search for possible hotspots, some areas stand out. For example, 45.5% of the NHS Rotherham catchment area fell within clusters with higher-than-average numbers of new autism diagnoses. The equivalent figure was 38.8% for NHS Heywood and 36.9% for NHS Liverpool, pointing to a possible health service effect on who receives an autism diagnosis.
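
As a simplified illustration of how area-level clusters can be flagged – this is not the method used in the study, and every number below is invented for the example – one can compare each catchment’s rate of new diagnoses with the national rate using a standardised incidence ratio (SIR):

# Hypothetical example only: three made-up catchment areas compared against a
# made-up national rate of new diagnoses.
cases = {"Area A": 410, "Area B": 150, "Area C": 95}               # new diagnoses (hypothetical)
pupils = {"Area A": 60_000, "Area B": 40_000, "Area C": 35_000}    # pupils observed (hypothetical)

national_rate = sum(cases.values()) / sum(pupils.values())
for area, n_cases in cases.items():
    expected = national_rate * pupils[area]
    sir = n_cases / expected
    label = "above" if sir > 1 else "at or below"
    print(f"{area}: SIR = {sir:.2f} ({label} the national average)")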

The research team used four years’ worth of data from the Summer School Census, which collects data on pupils aged 1–18 in state-funded schools in England. Across the 32 million pupils studied, the researchers identified more than 102,000 new autism diagnoses between 2014 and 2017.

After adjusting for age and sex, the researchers found that one in 234 children received a new autism diagnosis during that four-year period. New diagnoses tended to coincide with children transitioning to a new school, whether into nursery (ages 1–3), primary school (4–6) or secondary school (10–12).

Particular communities appeared to have different rates, varying by ethnicity and deprivation. 

Lead researcher Dr Andres Roman-Urrestarazu from the Department of Psychiatry and Cambridge Public Health at the University of Cambridge said: “Autism diagnoses are more common among Black students and other minority ethnic groups. Why this is the case is not clear and so we need to explore the role played by social factors such as ethnicity and area deprivation as well as the nature of local services.”

Depending on their ethnicity and their social and financial circumstances, some girls were more than three times as likely to receive an autism diagnosis as white girls without financial disadvantages who speak English as their first language.

In contrast, depending on their ethnicity and their social and financial circumstances, some boys were more than five times as likely to receive an autism diagnosis as white boys without financial disadvantages who speak English as their first language.

Boys and young men are already known to be more likely to receive autism diagnoses, but the social determinants that could affect a diagnosis remained an open question.

Dr Robin van Kessel, co-lead researcher from the Department of Health Policy at the London School of Economics and Political Science said: “These new findings show how social determinants interact and can combine to significantly increase the likelihood of an autism diagnosis. As a result, individuals from a minority ethnic background experiencing economic hardship may be significantly more likely to receive an autism diagnosis than their peers.”

Professor Carol Brayne from Cambridge Public Health said: “There are clear inequalities in an individual’s likelihood of receiving an autism diagnosis, whether they are socioeconomic factors, ethnicity or even which NHS region or local authority someone lives in.”

This work was supported by the Commonwealth Fund Harkness Fellowship, Institute for Data Valorization, Fonds de recherche du Québec—Santé, Calcul Quebec, Digital Research Alliance of Canada, Wellcome Trust, Innovative Medicines Initiative, Autism Centre of Excellence at Cambridge, Simons Foundation Autism Research Initiative, Templeton World Charitable Fund, Medical Research Council, NIHR Cambridge Biomedical Research Centre, and the NIHR Applied Research Collaboration East of England—Population Evidence and Data Science.

Reference
Roman-Urrestarazu, A et al. Autism incidence and spatial analysis in more than 7 million pupils in English schools: a retrospective, longitudinal, school registry study. Lancet Child & Adolescent Health; 25 Oct 2022; DOI: 10.1016/S2352-4642(22)00247-4

New autism diagnoses tend to be clustered within specific NHS service regions, suggesting that where an individual lives may influence whether they receive an autism diagnosis and access to special education needs support.

There are clear inequalities in an individual’s likelihood of receiving an autism diagnosis, whether they are socioeconomic factors, ethnicity or even which NHS region or local authority someone lives in
Carol Brayne
Autistic child

New approach to ‘cosmic magnet’ manufacturing could reduce reliance on rare earths in low-carbon technologies

Tetrataenite found in Nuevo Mercurio, Zacatecas, Mexico

A team from the University of Cambridge, working with colleagues from Austria, found a new way to make a possible replacement for rare-earth magnets: tetrataenite, a ‘cosmic magnet’ that takes millions of years to develop naturally in meteorites.

Previous attempts to make tetrataenite in the laboratory have relied on impractical, extreme methods. But the addition of a common element – phosphorus – could mean that it’s possible to make tetrataenite artificially and at scale, without any specialised treatment or expensive techniques.

The results are reported in the journal Advanced Science. A patent application on the technology has been filed by Cambridge Enterprise, the University’s commercialisation arm, and the Austrian Academy of Sciences.

High-performance magnets are a vital technology for building a zero-carbon economy, and the best permanent magnets currently available contain rare earth elements. Despite their name, rare earths are plentiful in Earth’s crust. However, China has a near monopoly on global production: in 2017, 81% of rare earths worldwide were sourced from China. Other countries, such as Australia, also mine these elements, but as geopolitical tensions with China increase, there are concerns that rare earth supply could be at risk.

“Rare earth deposits exist elsewhere, but the mining operations are highly disruptive: you have to extract a huge amount of material to get a small volume of rare earths,” said Professor Lindsay Greer from Cambridge’s Department of Materials Science & Metallurgy, who led the research. “Between the environmental impacts, and the heavy reliance on China, there’s been an urgent search for alternative materials that do not require rare earths.”

Tetrataenite, an iron-nickel alloy with a particular ordered atomic structure, is one of the most promising of those alternatives. Tetrataenite forms over millions of years as a meteorite slowly cools, giving the iron and nickel atoms enough time to order themselves into a particular stacking sequence within the crystalline structure, ultimately resulting in a material with magnetic properties approaching those of rare-earth magnets.

In the 1960s, scientists were able to artificially form tetrataenite by bombarding iron-nickel alloys with neutrons, enabling the atoms to form the desired ordered stacking, but this technique is not suitable for mass production.

“Since then, scientists have been fascinated with getting that ordered structure, but it’s always felt like something that was very far away,” said Greer. Despite many attempts over the years, it has not yet been possible to make tetrataenite on anything approaching an industrial scale.

Now, Greer and his colleagues from the Austrian Academy of Sciences and the Montanuniversität in Leoben, have found a possible alternative that doesn’t require millions of years of cooling or neutron irradiation.

The team was studying the mechanical properties of iron-nickel alloys containing small amounts of phosphorus, an element that is also present in meteorites. The pattern of phases inside these materials showed the expected tree-like growth structure called dendrites.

“For most people, it would have ended there: nothing interesting to see in the dendrites, but when I looked closer, I saw an interesting diffraction pattern indicating an ordered atomic structure,” said first author Dr Yurii Ivanov, who completed the work while at Cambridge and is now based at the Italian Institute of Technology in Genoa.

At first glance, the diffraction pattern of tetrataenite looks like that of the structure expected for iron-nickel alloys, namely a disordered crystal not of interest as a high-performance magnet. It took Ivanov’s closer look to identify the tetrataenite, but even so, Greer says it’s strange that no one noticed it before.

The researchers say that phosphorus, which is present in meteorites, allows the iron and nickel atoms to move faster, enabling them to form the necessary ordered stacking without waiting for millions of years. By mixing iron, nickel and phosphorus in the right quantities, they were able to speed up tetrataenite formation by between 11 and 15 orders of magnitude, such that it forms over a few seconds in simple casting.
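
A rough back-of-the-envelope check (in Python) shows that such a change is indeed of this size. The ranges used – ‘millions of years’ taken as one to ten million years, and ‘a few seconds’ as one to ten seconds – are assumptions for the example, not figures from the paper:

import math

# Assumed ranges, not figures from the paper.
seconds_per_year = 3.156e7
natural_s = [1e6 * seconds_per_year, 1e7 * seconds_per_year]   # 1-10 million years, in seconds
casting_s = [1.0, 10.0]                                        # a few seconds

speedups = [math.log10(n / c) for n in natural_s for c in casting_s]
print(f"implied speed-up: roughly 10^{min(speedups):.1f} to 10^{max(speedups):.1f}")

The result, roughly twelve to fifteen orders of magnitude, is consistent with the range quoted above.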

“What was so astonishing was that no special treatment was needed: we just melted the alloy, poured it into a mould, and we had tetrataenite,” said Greer. “The previous view in the field was that you couldn’t get tetrataenite unless you did something extreme, because otherwise, you’d have to wait millions of years for it to form. This result represents a total change in how we think about this material.”

While the researchers have found a promising method to produce tetrataenite, more work is needed to determine whether it will be suitable for high-performance magnets. The team are hoping to work on this with major magnet manufacturers.

The work may also force a revision of views on whether the formation of tetrataenite in meteorites really does take millions of years.

The research was supported in part by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme and Seventh Framework Programme, and the Austrian Science Fund.

 

Reference:
Yurii P Ivanov et al. ‘Direct formation of hard-magnetic tetrataenite in bulk alloy castings.’ Advanced Science (2022). DOI: 10.1002/advs.202204315

Researchers have discovered a potential new method for making the high-performance magnets used in wind turbines and electric cars without the need for rare earth elements, which are almost exclusively sourced in China.

Between the environmental impacts, and the heavy reliance on China, there’s been an urgent search for alternative materials that do not require rare earths
Lindsay Greer
Tetrataenite found in Nuevo Mercurio, Zacatecas, Mexico

Autistic people are more likely to experience depression and anxiety during pregnancy

Pregnant woman

In the study, led by researchers at the Autism Research Centre, 524 non-autistic people and 417 autistic people completed an online survey about their experience of pregnancy. Anyone who was pregnant at the time of responding or had previously given birth was eligible to take part.

The study revealed that autistic parents were around three times more likely than non-autistic parents to report having experienced prenatal depression (9% of non-autistic parents and 24% of autistic parents) and anxiety (14% of non-autistic parents and 48% of autistic parents).

Autistic respondents also experienced lower satisfaction with pregnancy healthcare. Autistic respondents were less likely to trust professionals, feel that professionals took their questions and concerns seriously, feel that professionals treated them respectfully, and be satisfied with how information was presented to them in appointments. Furthermore, autistic respondents were more likely to experience sensory issues during pregnancy and more likely to feel overwhelmed by the sensory environment of prenatal appointments.

Dr Sarah Hampton, lead researcher on the study, said: “This study suggests that autistic people are more vulnerable to mental health difficulties during pregnancy. It is imperative that effective mental health screening and support is available for autistic people during pregnancy.”

Dr Rosie Holt, a member of the research team, added: “The results also suggest that autistic people may benefit from accommodations to prenatal healthcare. These may include adjustments to the sensory environment of healthcare settings, as well as adjustments to how information is communicated during prenatal appointments.”

Dr Carrie Allison, Deputy Director of the Autism Research Centre and a member of the team, said: “We are grateful to members of the autistic community for providing feedback when we designed this research. It is vital that autistic people with lived experience help shape the research we do, and we keep their priorities as a clear focus.”

Professor Simon Baron-Cohen, Director of the Autism Research Centre and a member of the research team, said: “It is important that more research is conducted looking at the experiences of autistic new parents, who have been neglected in research. It is also important that this research is translated into health and social care policy and practice to ensure these parents receive the support and adaptations they need in a timely manner.”

Reference
Hampton, S., Allison, C., Baron-Cohen, S., & Holt, R. (2022). Autistic People’s Perinatal Experiences I: A Survey of Pregnancy Experiences. Journal of Autism and Developmental Disorders

Autistic people are more vulnerable to depression and anxiety during pregnancy, according to new research from the University of Cambridge. The results are published in the Journal of Autism and Developmental Disorders and have important implications for supporting autistic people during pregnancy.

This study suggests that autistic people are more vulnerable to mental health difficulties during pregnancy. It is imperative that effective mental health screening and support is available for autistic people during pregnancy
Sarah Hampton
Pregnant woman

UK police fail to meet 'legal and ethical standards' in use of facial recognition


A team from the University of Cambridge’s Minderoo Centre for Technology and Democracy created the new audit tool to evaluate “compliance with the law and national guidance” around issues such as privacy, equality, and freedom of expression and assembly.

Based on the findings, published in a new report, the experts are joining calls for a ban on police use of facial recognition in public spaces.

“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said the report’s lead author Evani Radiya-Dixit, a visiting fellow at Cambridge’s Minderoo Centre.

“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology.”

Researchers constructed the audit tool based on current legal guidelines – including the UK’s Data Protection and Equality acts – as well as outcomes from UK court cases and feedback from civil society organisations and the Information Commissioner's Office.

They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. One was the Bridges court case, in which a Cardiff-based civil liberties campaigner appealed against South Wales Police’s use of automated FRT to live-scan crowds and compare faces to those on a criminal “watch list”.  

The researchers also tested the Metropolitan Police’s trials of similar live FRT use, and a further example from South Wales Police in which officers used FRT apps on their smartphones to scan crowds in order to identify “wanted individuals in real time”.

In all three cases, they found that important information about police use of FRT is “kept from view”, with scant demographic data published on arrests or other outcomes, making it difficult to evaluate whether the tools “perpetuate racial profiling”, say the researchers.

In addition to lack of transparency, the researchers found little in the way of accountability – with no clear recourse for people or communities negatively affected by police use, or misuse, of the tech. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” said Radiya-Dixit.

Some of the FRT uses lacked regular oversight from an independent ethics committee or indeed the public, say the researchers, and did not do enough to ensure there was a reliable “human in the loop” when scanning untold numbers of faces among crowds of thousands while hunting for criminals.

In the South Wales Police’s smartphone app trial, even the “watch list” included images of people innocent under UK law – those previously arrested but not convicted – despite the fact that retention of such images is unlawful.

“We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition," said Radiya-Dixit.

Prof Gina Neff, Executive Director at the Minderoo Centre for Technology and Democracy, said: “Over the last few years, police forces around the world, including in England and Wales, have deployed facial recognition technologies. Our goal was to assess whether these deployments used known practices for the safe and ethical use of these technologies.” 

“Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police,” Neff said.

Officers are increasingly under-resourced and overburdened, write the researchers, and FRT is seen as a fast, efficient and cheap way to track down persons of interest.

At least ten police forces in England and Wales have trialled facial recognition, with trials involving FRT use for operational policing purposes – although different forces use different standards.

Questions of privacy run deep for policing technology that scans and potentially retains vast numbers of facial images without knowledge or consent. The researchers highlight a possible “chilling effect” if FRT leads to a reluctance to exercise fundamental rights among the public – right to protest, for example – for fear of potential consequences.

Use of FRT also raises discrimination concerns. The researchers point out that, historically, surveillance systems are used to monitor marginalised groups, and recent studies suggest the technology itself contains inherent bias that disproportionately misidentifies women, people of colour, and people with disabilities.

Given regulatory gaps and failures to meet minimum standards set out by the new audit toolkit, the researchers write that they support calls for a “ban on police use of facial recognition in publicly accessible spaces”.

 

Researchers devise an audit tool to test whether police use of facial recognition poses a threat to fundamental human rights, and analyse three deployments of the technology by British forces – with all three failing to meet “minimum ethical and legal standards”.  

Building a unique audit system enabled us to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police
Gina Neff
Image from the report 'A Sociotechnical Audit: Assessing Police use of Facial Recognition'

Companies’ ‘deforestation-free’ supply chain pledges have barely impacted forest clearance in the Amazon

An area of the Amazon rainforest cleared for soya production

Corporate pledges not to buy soybeans produced on land deforested after 2006 have reduced tree clearance in the Brazilian Amazon by just 1.6% between 2006 and 2015.

This equates to a protected area of 2,300 km2 in the Amazon rainforest: barely the size of Oxfordshire.

The findings, made by tracing traders’ soy supplies back to their source, are published today in the journal Environmental Research Letters. The work involved a team from the University of Cambridge, Boston University, ETH Zurich and New York University.

The researchers also discovered that in the Cerrado, Brazil’s tropical savannah, zero-deforestation commitments have not been adopted effectively - leaving over 50% of soy-suitable forests and their biodiversity without protection.

Brazil has the largest remaining tropical forests on the planet, but these are being rapidly cleared to rear cattle and grow crops including soybean. Demand for soy is surging around the world, and an estimated 4,800 km2 of rainforest is cleared each year to grow soybeans.

The majority of soy is consumed indirectly by humans: soybean is widely used as feed for factory-farmed chickens, pigs, fish and cattle. It also accounts for around 27% of global vegetable oil production, and as a complete protein source it often forms a key part of vegetarian and vegan diets.

By 2021, at least 94 companies had adopted zero-deforestation commitments – pledging to eliminate deforestation from their supply chains. But the study revealed that many of these commitments are not put into practice.

And the researchers say that adoption of zero-deforestation commitments is lagging among small and medium sized food companies.

“Zero-deforestation pledges are a great first step, but they need to be implemented to have an effect on forests – and right now it’s mainly the bigger companies that have the resources to do this,” said Professor Rachael Garrett, Moran Professor of Conservation and Development at the University of Cambridge Conservation Research Institute, a joint senior author of the report.

She added: “If soybean traders actually implemented their global commitments for zero-deforestation production, current levels of forest clearance in Brazil could be reduced by around 40 percent.”

Deforestation is the second largest contributor to global greenhouse gas emissions after fossil fuel use. It also causes the loss of diverse animal and plant life, threatens the livelihoods of indigenous groups, and increases inequality and conflict.

The researchers say that the supply chains of other food products – including cattle, oil palm and cocoa – are more complex than soy’s, making them even more difficult to monitor.

“If supply chain policies intend to contribute to the task of tackling deforestation in Brazil, it’s crucial to expand zero-deforestation supply chain policies beyond soy,” said Garrett, who is also Professor of Environmental Policy at ETH Zurich.

A ‘soy moratorium’ was the first voluntary zero-deforestation commitment in the tropics – by signing it, companies agreed not to buy soybeans produced on land deforested after 2006. But while the commitment was implemented in the Brazilian Amazon, most Brazilian soy is produced in the Cerrado – which is rich in biodiversity.

The researchers say their findings suggest private sector efforts are not enough to halt deforestation: supportive political leadership is also vital to conservation efforts.

“Supply chain governance should not be a substitute for state-led forest policies, which are critical to enable zero-deforestation monitoring and enforcement, and have better potential to cover different crops, land users and regions,” said Garrett.

In 2021, the COP26 Glasgow Leaders’ Declaration on Forests and Land Use committed to halt and reverse deforestation by 2030. It was signed by over 100 countries, representing 85% of global forests.

This research was funded by the US National Science Foundation, NASA Land-Cover and Land-Use Change Program, and US Department of Agriculture's National Institute of Food and Agriculture.

Reference

Gollnow, F., Cammelli F., Carlson, K.M., and Garrett, R. D. ‘Gaps in Adoption and Implementation Limit the Current and Potential Effectiveness of Zero-Deforestation Supply Chain Policies for Soy.’ October 2022, Environmental Research Letters. DOI: 10.1088/1748-9326/ac97f6

More companies must make and implement zero-deforestation supply chain commitments in order to significantly reduce deforestation and protect diverse ecosystems, say researchers.

Zero-deforestation pledges are a great first step, but they need to be implemented to have an effect on forests.
Rachael Garrett
An area of the Amazon rainforest cleared for soya production

Just like humans, more intelligent jays have greater self-control

Jay

This is the first evidence of a link between self-control and intelligence in birds.

Self-control – the ability to resist temptation in favour of a better but delayed reward – is a vital skill that underpins effective decision-making and future planning.

Jays are members of the corvid family, often nicknamed the ‘feathered apes’ because they rival non-human primates in their cognitive abilities. Corvids hide, or ‘cache’, their food to save it for later. In other words, they need to delay immediate gratification to plan for future meals. The researchers think this may have driven the evolution of self-control in these birds.

Self-control has been previously shown to be linked to intelligence in humans, chimpanzees and – in an earlier study by these researchers – in cuttlefish. The greater the intelligence, the greater the self-control.

The new results show that the link between intelligence and self-control exists across distantly related animal groups, suggesting it has evolved independently several times.

Of all the corvids, jays in particular are vulnerable to having their caches stolen by other birds. Self-control also enables them to wait for the right moment to hide their food without being seen or heard.

The results are published today in the journal Philosophical Transactions of the Royal Society B.

To test the self-control of ten Eurasian jays, Garrulus glandarius, researchers designed an experiment inspired by the 1972 Stanford Marshmallow test - in which children were offered a choice between one marshmallow immediately, or two if they waited for a period of time.

Instead of marshmallows, the jays were presented with mealworms, bread and cheese. Mealworms are a common favourite; bread and cheese come second but individuals vary in their preference for one over the other.

The birds had to choose between bread or cheese, which was available immediately, and a mealworm that they could see but could only reach after a delay, when a Perspex screen was raised. Could they delay immediate gratification and wait for their favourite food?

A range of delay times was tested, from five seconds to five and a half minutes, before the mealworm was made available if the bird had resisted the temptation to eat the bread or cheese.

All the birds in the experiment managed to wait for the worm, but some could wait much longer than others. Top of the class was ‘JayLo’, who ignored a piece of cheese and waited five and a half minutes for a mealworm. The worst performers, ‘Dolci’ and ‘Homer’, could only wait a maximum of 20 seconds.

“It’s just mind-boggling that some jays can wait so long for their favourite food. In multiple trials, I sat there watching JayLo ignore a piece of cheese for over five minutes – I was getting bored, but she was just patiently waiting for the worm,” said Dr Alex Schnell at the University’s Department of Psychology, first author of the report.

The jays looked away from the bread or cheese when it was presented to them, as if to distract themselves from temptation. Similar behaviour has been seen in chimpanzees and children.

JayLo patiently ignores the cheese (in right box) to wait for the worm (in left box).

The researchers also presented the jays with five cognitive tasks that are commonly used to measure general intelligence. The birds that performed better in these tasks also managed to wait longer for the mealworm reward. This suggests that self-control is linked with intelligence in jays.

“The birds’ performance varied across individuals – some did really well in all the tasks and others were mediocre. What was most interesting was that if a bird was good at one of the tasks, it was good at all of them – which suggests that a general intelligence factor underlies their performance,” said Schnell.

The jays also adjusted their self-control behaviour according to the circumstances: in another experiment where the worm was visible but always out of reach, the jays always ate the immediately available bread or cheese. And the length of time they were willing to wait for the worm fell if it was pitted against their second most preferred food as the immediate treat, compared to their third. This flexibility shows that jays only delay gratification when it is warranted.

Research by other scientists has found that children taking the Stanford marshmallow test vary greatly in their self-control, and this ability is linked to their general intelligence. Children that can resist temptation for longer also get higher scores in a range of academic tasks.

This research was approved by the University of Cambridge Animal Ethics Review Committee, and performed in accordance with the Home Office Regulations and the ASAB Guidelines for the Treatment of Animals in Behavioural Research and Teaching.

The research was funded by the Royal Society, Fyssen Foundation, and European Research Council.

Reference

Schnell, A.K., Boeckle, M., Clayton, N.S. ‘Waiting for a better possibility: delay of gratification in corvids and its relationship to other cognitive capacities.’ Philosophical Transactions of the Royal Society B, October 2022.

A study has found that Eurasian jays can pass a version of the ‘marshmallow test’ – and those with the greatest self-control also score the highest on intelligence tests.

It’s just mind-boggling that some jays can wait so long for their favourite food.
Alex Schnell
Jay

Catholic Church can curb carbon emissions by returning to meat-free Fridays


In 2011, the Catholic bishops of England and Wales called on congregations to return to forgoing meat on Fridays. Only around a quarter of Catholics changed their dietary habits – yet this still saved over 55,000 tonnes of carbon a year, according to a new study led by the University of Cambridge.

Researchers say that, in terms of CO2 emissions, this is equivalent to 82,000 fewer people taking a return trip from London to New York over the course of a year. 

The current Catholic leader, Pope Francis, has called for “radical” responses to climate change. The researchers argue that if the Pope reinstated meatless Fridays across the global church, it could mitigate millions of tonnes of greenhouse gases annually.

For example, they say that if Catholic bishops in the United States alone issued an “obligation” to forgo meat on Fridays, the environmental benefits would likely be twenty times larger than in the UK.

“The Catholic Church is very well placed to help mitigate climate change, with more than one billion followers around the world,” said lead author Professor Shaun Larcom from Cambridge’s Department of Land Economy.

“Pope Francis has already highlighted the moral imperative for action on the climate emergency, and the important role of civil society in achieving sustainability through lifestyle change.

“Meat agriculture is one of the major drivers of greenhouse gas emissions. If the Pope was to reinstate the obligation for meatless Fridays to all Catholics globally, it could be a major source of low-cost emissions reductions,” Larcom said. “Even if only a minority of Catholics choose to comply, as we find in our case study.”

Traditionally, the practice of refraining from meat one day a week saw many Catholics – and indeed large sections of the population in predominantly Christian countries – turn to fish on Fridays as a protein substitute.

The overall Catholic share of the British population has remained largely stable for decades at just under 10%, say economists behind the study, published as a working paper awaiting peer-review on the Social Science Research Network.  

Larcom and colleagues combined new survey data with that from diet and social studies to quantify the effects of a statement issued by the Catholic Church for England and Wales re-establishing meat-free Fridays as a collective act of penance from September 2011 onwards after a 26-year hiatus.

Commissioned survey results suggest that 28% of Catholics in England and Wales adjusted their Friday diet following this announcement. Of this segment, 41% stated that they stopped eating meat on Friday, and 55% said they tried to eat less meat on that day. For those who said they just reduced consumption, the researchers assumed a halving of meat intake on a Friday.* 

People in England and Wales eat an average of 100 grams of meat a day, according to the National Diet and Nutrition Survey (NDNS). Researchers calculated that even the small reduction in meat intake by a section of the Catholic population was equal to each working adult across the whole of England and Wales cutting two grams of meat a week out of their diet.   

The team then calculated the carbon footprint for this tiny fall in meat consumption by comparing emissions generated from average daily diets of meat eaters and non-meat eaters in England and Wales. The average high protein non-meat diet, including foods such as fish and cheese, contributes just a third of the greenhouse gas emissions per kilo compared to the average meat eater. 

Assuming the Catholics who did adapt their diet switched to high protein non-meat meals on Fridays, this equates to approximately 875,000 fewer meat meals a week, which saves 1,070 tonnes of carbon – or 55,000 tonnes over a year, according to researchers.
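
The headline figures above are straightforward to check. The short sketch below (in Python) simply combines the numbers quoted in the article; the per-meal saving it prints is an implied value derived here, not one the researchers report:

# All inputs are figures quoted in the article; the per-meal saving is derived, not reported.
meals_switched_per_week = 875_000
tonnes_saved_per_week = 1_070
weeks_per_year = 52

print(tonnes_saved_per_week * weeks_per_year)                    # ~55,640 tonnes a year
print(1000 * tonnes_saved_per_week / meals_switched_per_week)    # ~1.2 kg of carbon per switched meal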

In addition to their central calculation, the researchers used a natural experiment approach across the United Kingdom to compare meat consumption in Scotland and Northern Ireland, where Catholic bishops did not attempt to reintroduce meatless Fridays, with that in England and Wales from 2009 to 2019. 

Using NDNS diet diary data the team pinpointed mealtime changes on Fridays only, and found meat consumption fell by around eight grams per person in the “treatment jurisdiction” of England and Wales following the re-establishment of the Catholic obligation, compared to the rest of the UK.

There could be many reasons for this dietary shift – meat intake has fallen across the country over this time – but the team argue the reduction at least partly resulted from the return of meatless Fridays. As such, they say that the carbon footprint calculations using a two-gram per week drop are likely to be conservative.

Researchers also tested for “religious impacts” using longitudinal survey data that questioned UK Catholics on their religious lives. No discernible effect on either church attendance or strength of personal religious belief was detected over the period in which meat-free Fridays were reintroduced.

“Our results highlight how a change in diet among a group of people, even if they are a minority in society, can have very large consumption and sustainability implications,” said co-author Dr Po-Wen She, a fellow of Cambridge’s Department of Land Economy.   

Co-author Dr Luca Panzone from Newcastle University added: “While our study looked at a change in practice among Catholics, many religions have dietary proscriptions that are likely to have large natural resource impacts. Other religious leaders could also drive changes in behaviour to further encourage sustainability and mitigate climate change.” 

For Christians, the practice of meat-free Fridays dates back to at least Pope Nicholas I’s declaration in the 9th century. Catholics were required to abstain from eating meat (“flesh, blood, or marrow”) on Fridays in memory of Christ’s death and crucifixion.

However, fish and vegetables, along with crabs, turtles and even frogs, were permitted. The researchers point out that the practice was observed so fervently among some American Catholics that it led to the invention of the Filet-o-Fish meal by the burger chain McDonald’s.

Even a small dietary change by a minority of UK Catholics had significant environmental benefits, say researchers, who argue that a papal decree reinstating meatless Fridays across the global church would save millions of tonnes of carbon a year.

If the Pope was to reinstate the obligation for meatless Fridays to all Catholics globally, it could be a major source of low-cost emissions reductions
Shaun Larcom
Pope Francis in Vatican City

University and city council to explore feasibility of city centre heat network to reduce emissions


If the study finds the scheme is feasible and government funding is available to develop the project further, a heat network – which could include heat provided by ground-source energy – could eventually be built, supplying 100% renewable heating and hot water to city centre buildings belonging to the council, the University of Cambridge and others.

This could present a solution to reduce the emissions produced by historic buildings such as the Corn Exchange, the Guildhall, and various University of Cambridge and College sites that may be among some of the hardest in the city to decarbonise.

Earlier this year the council and the University of Cambridge, with support from sixteen University Colleges, submitted a bid to the government’s Heat Network Delivery Unit (HNDU) for funding to carry out a study of the technical and economic feasibility of the scheme.

The government has now confirmed it will provide £97,680 towards the study, with the council and University also making a financial contribution of £16,500 each. The council and partners will procure expert consultants to conduct the study, and are aiming to complete it by summer 2023.

Cllr Rosy Moore, Executive Councillor for Environment, Climate Change and Biodiversity, said: “This funding represents an exciting first step, which could ultimately lead to the development of a city centre heat network that could provide a transformational zero-carbon heating and hot water supply for buildings in the city centre and beyond.” 

“In recent years we have delivered a number of projects to help reduce carbon emissions across the city and I hope that this study produces positive results and helps us to provide another significant way to help tackle the climate crisis."

Professor Ian Leslie, Chair of the University of Cambridge’s Environmental Sustainability Strategy Committee, said: “Decarbonising space and water heating in an historic city setting is challenging — but necessary. The University, with the support of the Colleges, is excited to be working with the City Council to progress a proposal for a heat network in Cambridge. Establishing a core heat network is an ambitious undertaking, but one which could provide the nucleus for an ever-growing network eventually spanning the entire city.”
 

Cambridge City Council, in partnership with the University of Cambridge, has secured government funding to undertake a study to explore the feasibility of developing the Cambridge City Centre Heat Network.

Decarbonising space and water heating in an historic city setting is challenging — but necessary.
Professor Ian Leslie, Chair of the University’s Environmental Sustainability Strategy Committee

Can cosmic inflation be ruled out?


The astrophysicists, from the University of Cambridge, the University of Trento, and Harvard University, say that there is a clear, unambiguous signal in the cosmos which could eliminate inflation as a possibility. Their paper, published in The Astrophysical Journal Letters, argues that this signal – known as the cosmic graviton background (CGB) – can feasibly be detected, although it will be a massive technical and scientific challenge.

“Inflation was theorised to explain various fine-tuning challenges of the so-called hot Big Bang model,” said the paper’s first author Dr Sunny Vagnozzi, from Cambridge’s Kavli Institute for Cosmology, and who is now based at the University of Trento. “It also explains the origin of structure in our Universe as a result of quantum fluctuations.

“However, the large flexibility displayed by possible models for cosmic inflation which span an unlimited landscape of cosmological outcomes raises concerns that cosmic inflation is not falsifiable, even if individual inflationary models can be ruled out. Is it possible in principle to test cosmic inflation in a model-independent way?”

Some scientists raised concerns about cosmic inflation in 2013, when the Planck satellite released its first measurements of the cosmic microwave background (CMB), the universe's oldest light.

“When the results from the Planck satellite were announced, they were held up as a confirmation of cosmic inflation,” said Professor Avi Loeb from Harvard University, Vagnozzi’s co-author on the current paper. “However, some of us argued that the results might be showing just the opposite.”

Along with Anna Ijjas and Paul Steinhardt, Loeb was one of those who argued that results from Planck showed that inflation posed more puzzles than it solved, and that it was time to consider new ideas about the beginnings of the universe, which, for instance, may have begun not with a bang but with a bounce from a previously contracting cosmos.

The maps of the CMB released by Planck represent the earliest time in the universe we can ‘see’, 100 million years before the first stars formed. We cannot see farther.

“The actual edge of the observable universe is at the distance that any signal could have travelled at the speed-of-light limit over the 13.8 billion years that elapsed since the birth of the Universe,” said Loeb. “As a result of the expansion of the universe, this edge is currently located 46.5 billion light years away. The spherical volume within this boundary is like an archaeological dig centred on us: the deeper we probe into it, the earlier is the layer of cosmic history that we uncover, all the way back to the Big Bang which represents our ultimate horizon. What lies beyond the horizon is unknown.”
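
The 46.5-billion-light-year figure can be reproduced from the standard expansion history. The sketch below (in Python) integrates the Friedmann equation to get the comoving distance light could have covered since the Big Bang; the cosmological parameters are typical Planck-era values assumed for this example, not numbers taken from the paper:

import numpy as np
from scipy.integrate import quad

# Assumed flat Lambda-CDM parameters (roughly Planck 2018 values), not taken from the paper.
c_km_s = 299_792.458                 # speed of light, km/s
H0 = 67.7                            # Hubble constant, km/s/Mpc
Omega_r = 9.1e-5                     # radiation
Omega_m = 0.31                       # matter
Omega_L = 1.0 - Omega_m - Omega_r    # dark energy (flat universe)

def inv_E(z):
    """1/E(z) for a flat Lambda-CDM universe."""
    return 1.0 / np.sqrt(Omega_r * (1 + z)**4 + Omega_m * (1 + z)**3 + Omega_L)

integral, _ = quad(inv_E, 0.0, np.inf)
distance_Mpc = (c_km_s / H0) * integral
distance_Gly = distance_Mpc * 3.2616e-3      # 1 Mpc is about 3.26 million light years
print(f"comoving horizon: about {distance_Gly:.0f} billion light years")   # ~46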

It could be possible to dig even further into the universe’s beginnings by studying near-weightless particles known as neutrinos, which are the most abundant particles with mass in the universe. The Universe has allowed neutrinos to travel freely, without scattering, since approximately one second after the Big Bang, when the temperature was ten billion degrees. “The present-day universe must be filled with relic neutrinos from that time,” said Vagnozzi.

Vagnozzi and Loeb say we can go even further back, however, by tracing gravitons, particles that mediate the force of gravity.

“The Universe was transparent to gravitons all the way back to the earliest instant traced by known physics, the Planck time: 10 to the power of -43 seconds, when the temperature was the highest conceivable: 10 to the power of 32 degrees,” said Loeb. “A proper understanding of what came before that requires a predictive theory of quantum gravity, which we do not possess.”
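
The Planck time and temperature quoted here follow directly from fundamental constants. As a quick order-of-magnitude check (a Python sketch using CODATA values; the variable names are ours):

import math

# CODATA values of the fundamental constants.
hbar = 1.054571817e-34    # reduced Planck constant, J s
G = 6.67430e-11           # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8          # speed of light, m/s
k_B = 1.380649e-23        # Boltzmann constant, J/K

t_planck = math.sqrt(hbar * G / c**5)                # Planck time
T_planck = math.sqrt(hbar * c**5 / (G * k_B**2))     # Planck temperature
print(f"Planck time:        {t_planck:.1e} s")       # ~5.4e-44 s, of order 10^-43
print(f"Planck temperature: {T_planck:.1e} K")       # ~1.4e32 K, of order 10^32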

Vagnozzi and Loeb say that once the Universe allowed gravitons to travel freely without scattering, a relic background of thermal gravitational radiation with a temperature of slightly less than one degree above absolute zero should have been generated: the cosmic graviton background (CGB).

However, the theory of cosmic inflation does not allow for a detectable CGB, as it implies that the exponential expansion of the newborn universe diluted relics such as the CGB to the point that they cannot be observed. This can be turned into a test: if the CGB were detected, it would rule out cosmic inflation, which does not allow for its existence.

Vagnozzi and Loeb argue that such a test is possible, and the CGB could in principle be detected in future. The CGB adds to the cosmic radiation budget, which otherwise includes microwave and neutrino backgrounds. It therefore affects the cosmic expansion rate of the early Universe at a level that is detectable by next-generation cosmological probes, which could provide the first indirect detection of the CGB.

However, to claim a definitive detection of the CGB, the ‘smoking gun’ would be the detection of a background of high-frequency gravitational waves peaking at frequencies of around 100 GHz. This would be very hard to detect, and would require tremendous advances in gyrotron and superconducting magnet technology. Nevertheless, say the researchers, this signal may be within our reach in the future.

Reference:
Sunny Vagnozzi and Abraham Loeb. ‘The Challenge of Ruling Out Inflation via the Primordial Graviton Background.’ The Astrophysical Journal Letters (2022). DOI: 10.3847/2041-8213/ac9b0e

Adapted in part from a piece on Medium by Avi Loeb.

Astrophysicists say that cosmic inflation – a point in the Universe’s infancy when space-time expanded exponentially, and what physicists really refer to when they talk about the ‘Big Bang’ – can in principle be ruled out in an assumption-free way.

Is it possible in principle to test cosmic inflation in a model-independent way?
Sunny Vagnozzi
Cosmic inflation is a popular scenario for the earliest phase in the evolution of the Universe
