Channel: University of Cambridge - Latest news

Astronomers develop novel way to ‘see’ first stars through fog of early Universe

Artist's impression of stars springing up out of the darkness

The researchers, led by the University of Cambridge, have developed a methodology that will allow them to observe and study the first stars through the clouds of hydrogen that filled the Universe from about 378,000 years after the Big Bang.

Observing the birth of the first stars and galaxies has been a goal of astronomers for decades, as it will help explain how the Universe evolved from the emptiness after the Big Bang to the complex realm of celestial objects we observe today, 13.8 billion years later.

The Square Kilometre Array (SKA) - a next-generation telescope due to be completed by the end of the decade - will likely be able to make images of the earliest light in the Universe, but for current telescopes the challenge is to detect the cosmological signal of the stars through the thick hydrogen clouds.

The signal that astronomers aim to detect is expected to be approximately one hundred thousand times weaker than other radio signals also coming from the sky – for example, radio signals originating in our own galaxy.

Using a radio telescope itself introduces distortions to the received signal, which can completely obscure the cosmological signal of interest. This is considered an extreme observational challenge in modern radio cosmology, and such instrument-related distortions are widely regarded as the major bottleneck in this type of observation.

Now the Cambridge-led team has developed a methodology to see through the primordial clouds and other sky noise signals, avoiding the detrimental effect of the distortions introduced by the radio telescope. Their methodology, part of the REACH (Radio Experiment for the Analysis of Cosmic Hydrogen) experiment, will allow astronomers to observe the earliest stars through their interaction with the hydrogen clouds, in the same way we would infer a landscape by looking at shadows in the fog.

Their method will improve the quality and reliability of observations from radio telescopes looking at this unexplored key time in the development of the Universe. The first observations from REACH are expected later this year.

The results are reported today in the journal Nature Astronomy.

“At the time when the first stars formed, the Universe was mostly empty and composed mostly of hydrogen and helium,” said Dr Eloy de Lera Acedo from Cambridge’s Cavendish Laboratory, the paper’s lead author.

He added: “Because of gravity, the elements eventually came together and the conditions were right for nuclear fusion, which is what formed the first stars. But they were surrounded by clouds of so-called neutral hydrogen, which absorb light really well, so it’s hard to detect or observe the light behind the clouds directly.”

In 2018, another research group (running the ‘Experiment to Detect the Global Epoch of Reionization Signature’ – or EDGES) published a result that hinted at a possible detection of this earliest light, but astronomers have been unable to repeat the result – leading them to believe that the original result may have been due to interference from the telescope being used.

“The original result would require new physics to explain it, due to the temperature of the hydrogen gas, which should be much cooler than our current understanding of the Universe would allow. Alternatively, an unexplained higher temperature of the background radiation – typically assumed to be the well-known Cosmic Microwave Background – could be the cause,” said de Lera Acedo.

He added: “If we can confirm that the signal found in that earlier experiment really was from the first stars, the implications would be huge.”

In order to study this period in the Universe’s development, often referred to as the Cosmic Dawn, astronomers study the 21-centimetre line – an electromagnetic radiation signature from hydrogen in the early Universe. They look for a radio signal that measures the contrast between the radiation from the hydrogen and the radiation behind the hydrogen fog.

The methodology developed by de Lera Acedo and his colleagues uses Bayesian statistics to detect a cosmological signal in the presence of interference from the telescope and general noise from the sky, so that the signals can be separated.

Achieving this required state-of-the-art techniques and technologies from a range of different fields.

The researchers used simulations to mimic a real observation using multiple antennas, which improves the reliability of the data – earlier observations have relied on a single antenna.

“Our method jointly analyses data from multiple antennas and across a wider frequency band than equivalent current instruments. This approach will give us the necessary information for our Bayesian data analysis,” said de Lera Acedo.

He added: “In essence, we forgot about traditional design strategies and instead focused on designing a telescope suited to the way we plan to analyse the data – something like an inverse design. This could help us measure things from the Cosmic Dawn and into the epoch of reionisation, when hydrogen in the Universe was reionised.”

The telescope’s construction is currently being finalised at the Karoo radio reserve in South Africa, a location chosen for its excellent conditions for radio observations of the sky. It is far away from human-made radio frequency interference, such as television and FM radio signals.

The REACH team of over 30 researchers is multidisciplinary and distributed worldwide, with experts in fields such as theoretical and observational cosmology, antenna design, radio frequency instrumentation, numerical modelling, digital processing, big data and Bayesian statistics. REACH is co-led by the University of Stellenbosch in South Africa.

Professor de Villiers, co-lead of the project at the University of Stellenbosch, said: “Although the antenna technology used for this instrument is rather simple, the harsh and remote deployment environment, and the strict tolerances required in the manufacturing, make this a very challenging project to work on.”

He added: “We are extremely excited to see how well the system will perform, and have full confidence we'll make that elusive detection."

The Big Bang and very early times of the Universe are well understood epochs, thanks to studies of the Cosmic Microwave Background (CMB) radiation. Even better understood is the late and widespread evolution of stars and other celestial objects. But the time of formation of the first light in the Cosmos is a fundamental missing piece in the puzzle of the history of the Universe.

The research was supported by the Kavli Institute for Cosmology in Cambridge (UK), the National Research Foundation (South Africa), the Cambridge-Africa ALBORADA trust (UK) and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

Reference

E. de Lera Acedo et al: ‘The REACH radiometer for detecting the 21-cm hydrogen signal from redshift z ≈ 7.5–28.’ Nature Astronomy (July 2022). DOI: 10.1038/s41550-022-01709-9

A team of astronomers has developed a method that will allow them to ‘see’ through the fog of the early Universe and detect light from the first stars and galaxies.

The first stars were surrounded by clouds of hydrogen, which absorb light really well, so it's hard to detect or observe the light behind the clouds directly.
Eloy de Lera Acedo

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Five Cambridge academics elected to the British Academy in 2022


The academics have been elected to the fellowship this year in recognition of their work in the fields of literature, visual culture, memory, history and heritage, and are among 85 distinguished scholars to be elected to the British Academy in 2022. 


Professor Virginia Cox (Faculty of Modern and Medieval Languages and Linguistics; Trinity College)
Virginia Cox’s research focuses on Renaissance and Counter-Reformation Italian literature, on the history of the reception of classical rhetorical theory in Italy between the thirteenth and sixteenth centuries, and on the history of Italian early modern writing by women.

Professor Richard Henson (MRC Cognition and Brain Sciences Unit)
Richard (Rik) Henson’s primary research focus is how the brain “remembers” things. His work focuses on understanding how our brains support different types of memory, which is vital for understanding the memory problems associated with brain damage and disease. He is the current president of the British Neuroscience Association.

Professor Heonik Kwon (Department of Social Anthropology; Trinity College)
Heonik Kwon is the author of prize-winning books on the historical memories of the Vietnam War, Asia’s Cold War, and the Korean War. He is currently working on the history of cultural internationalism in the twentieth century and beyond as part of a five-year research project called Beyond The Korean War.

Professor Marie Louise Sorensen (Department of Archaeology; Jesus College)
Marie Louise Sorensen specialises in European prehistory, gender and theory, as well as contemporary heritage politics: in particular around conflict, including destruction and reconstruction. Sorensen has recently worked on early colonial expansion into the Cape Verde islands, and investigated domestic life in Bronze Age Hungary.

Professor Emma Wilson (Faculty of Modern and Medieval Languages and Linguistics; Corpus Christi College)
Emma Wilson researches contemporary visual culture, modern French literature and gender. She has written on contemporary women filmmakers in France, along with the uses of cinema to respond to loss and pain in her book Love, Mortality and the Moving Image. Wilson’s book on filmmaker Céline Sciamma was published last year. 


Founded in 1902, the British Academy is the UK’s national academy for the humanities and social sciences. The Fellowship comprises over 1,600 of the leading minds in these subjects from the UK and overseas, with other Cambridge fellows including the classicist Professor Dame Mary Beard and the historian Professor David Reynolds. The Academy is also a funding body for research, nationally and internationally, and a forum for debate and engagement.

“I am delighted to welcome these distinguished and pioneering scholars to our Fellowship,” said the new President of the British Academy, Professor Julia Black. “I am equally delighted that we have so many new female Fellows. While I hope this means that the tide is finally turning for women in academia, there is still much to do to make the research world diverse and open to all.”

“With our new Fellows’ expertise and insights, the Academy is better placed than ever to open new seams of knowledge and understanding and to enhance the wellbeing and prosperity of societies around the world,” said Black.

 

Five academics from the University of Cambridge have been made Fellows of the prestigious British Academy for the humanities and social sciences.

British Academy


Madingley aviaries saved from closure

Jays at Madingley

We are very grateful to everyone who has contributed to helping the University of Cambridge secure the future of this important research facility, especially Alex Gerko, Founder and CEO of XTX Markets. We welcome any further donations to help us keep the facility open beyond this period.

The aviaries have been the location of exceptional research led by Professor Nicky Clayton FRS that has transformed our understanding of the behaviour and intelligence of these bird species.

We are delighted to announce that due to a number of generous donations from both members of the public and the scientific community, together with support from the University of Cambridge, we are able to keep the corvid aviaries at Madingley open for a further five years.



Natural clean-up: bacteria can remove plastic pollution from lakes

Study lake in Norway

The bacteria break down the carbon compounds in plastic to use as food for their growth.

The scientists say that enriching waters with particular species of bacteria could be a natural way to remove plastic pollution from the environment.

The effect is pronounced: the rate of bacterial growth more than doubled when plastic pollution raised the overall carbon level in lake water by just 4%.

The results suggest that the plastic pollution in lakes is ‘priming’ the bacteria for rapid growth –  the bacteria are not only breaking down the plastic but are then more able to break down other natural carbon compounds in the lake.

Lake bacteria were found to favour plastic-derived carbon compounds over natural ones. The researchers think this is because the carbon compounds from plastics are easier for the bacteria to break down and use as food.

The scientists caution that this does not condone ongoing plastic pollution. Some of the compounds within plastics can have toxic effects on the environment, particularly at high concentrations.

The findings are published today in the journal Nature Communications.

“It’s almost like the plastic pollution is getting the bacteria’s appetite going. The bacteria use the plastic as food first, because it’s easy to break down, and then they’re more able to break down some of the more difficult food – the natural organic matter in the lake,” said Dr Andrew Tanentzap in the University of Cambridge’s Department of Plant Sciences, senior author of the paper.

He added: “This suggests that plastic pollution is stimulating the whole food web in lakes, because more bacteria means more food for the bigger organisms like ducks and fish.”

The effect varied depending on the diversity of bacterial species present in the lake water – lakes with more different species were better at breaking down plastic pollution.

A study published by the authors last year found that European lakes are potential hotspots of microplastic pollution.

When plastics break down they release simple carbon compounds. The researchers found that these are chemically distinct from the carbon compounds released as organic matter like leaves and twigs break down.

The carbon compounds from plastics were shown to be derived from additives unique to plastic products, including adhesives and softeners.

The new study also found that bacteria removed more plastic pollution in lakes that had fewer unique natural carbon compounds. This is because the bacteria in the lake water had fewer other food sources.

The results will help to prioritise lakes where pollution control is most urgent. If a lake has a lot of plastic pollution, but low bacterial diversity and a lot of different natural organic compounds, then its ecosystem will be more vulnerable to damage.

“Unfortunately, plastics will pollute our environment for decades. On the positive side, our study helps to identify microbes that could be harnessed to help break down plastic waste and better manage environmental pollution," said Professor David Aldridge in the University of Cambridge’s Department of Zoology, who was involved in the study.

The study involved sampling 29 lakes across Scandinavia between August and September 2019. To assess a range of conditions, these lakes differed in latitude, depth, area, average surface temperature and diversity of dissolved carbon-based molecules.

The scientists cut up plastic bags from four major UK shopping chains, and shook these in water until their carbon compounds were released.

At each lake, glass bottles were filled with lake water. A small amount of the ‘plastic water’ was added to half of these, to represent the amount of carbon leached from plastics into the environment, and the same amount of distilled water was added to the others. After 72 hours in the dark, bacterial activity was measured in each of the bottles.

The study measured bacterial growth (the increase in mass) and the efficiency of bacterial growth (the amount of carbon dioxide released in the process of growing).

In the water with plastic-derived carbon compounds, the bacteria had doubled in mass very efficiently. Around 50% of this carbon was incorporated into the bacteria in 72 hours.
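The efficiency measure described here is commonly expressed as a growth efficiency: the fraction of consumed carbon built into new biomass rather than respired as carbon dioxide. A minimal sketch, with invented numbers rather than data from the study:

```python
def growth_efficiency(biomass_c, respired_c):
    """Fraction of consumed carbon incorporated into bacterial biomass.

    biomass_c: carbon built into new cells; respired_c: carbon lost as CO2
    (both in the same units, e.g. micrograms of carbon).
    """
    return biomass_c / (biomass_c + respired_c)

# Equal amounts into biomass and respiration give 50% efficiency, in line
# with the roughly 50% incorporation reported over 72 hours.
print(growth_efficiency(50.0, 50.0))   # 0.5
```

On this definition, measuring both the gain in bacterial mass and the CO2 released (as the study did) is exactly what is needed to compute the efficiency.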

"Our study shows that when carrier bags enter lakes and rivers they can have dramatic and unexpected impacts on the entire ecosystem. Hopefully our results will encourage people to be even more careful about how they dispose of plastic waste," said Eleanor Sheridan in the University of Cambridge’s Department of Plant Sciences, first author of the study who undertook the work as part of a final-year undergraduate project.

The research was funded by the European Research Council.

Reference

Sheridan, EA et al: ‘Plastic pollution fosters more microbial growth in lakes than natural organic matter.’ Nature Communications, 2022. DOI: 10.1038/s41467-022-31691-9

A study of 29 European lakes has found that some naturally occurring lake bacteria grow faster and more efficiently on the remains of plastic bags than on natural matter like leaves and twigs.

It’s almost like the plastic pollution is getting the bacteria’s appetite going. The bacteria use the plastic as food first, because it’s easy to break down.
Andrew Tanentzap


AI tackles the challenge of materials structure prediction

Geometric abstract background with connected line and dots

The researchers, from the Universities of Cambridge and Linköping, have designed a way to predict the structure of a material given its constituent elements. The results are reported in the journal Science Advances.

The arrangement of atoms in a material determines its properties. The ability to predict this arrangement computationally for different combinations of elements, without having to make the material in the lab, would enable researchers to quickly design and improve materials. This paves the way for advances such as better batteries and photovoltaics.

However, there are many ways that atoms can ‘pack’ into a material: some packings are stable, others are not. Determining the stability of a packing is computationally intensive, and calculating every possible arrangement of atoms to find the best one is not practical. This is a significant bottleneck in materials science.

“This materials structure prediction challenge is similar to the protein folding problem in biology,” said Dr Alpha Lee from Cambridge’s Cavendish Laboratory, who co-led the research. “There are many possible structures that a material can ‘fold’ into. Except the materials science problem is perhaps even more challenging than biology because it considers a much broader set of elements.”

Lee and his colleagues developed a method based on machine learning that successfully tackles this challenge. They developed a new way to describe materials, using the mathematics of symmetry to reduce the infinite ways that atoms can pack into materials into a finite set of possibilities. They then used machine learning to predict the ideal packing of atoms, given the elements and their relative composition in the material.

Their method accurately predicts the structure of materials that hold promise for piezoelectric and energy harvesting applications, with over five times the efficiency of current methods. Their method can also find thousands of new and stable materials that have never been made before, in a way that is computationally efficient.  

“The number of materials that are possible is four to five orders of magnitude larger than the total number of materials that we have made since antiquity,” said co-first author Dr Rhys Goodall, also from the Cavendish Laboratory. “Our approach provides an efficient computational approach that can ‘mine’ new stable materials that have never been made before. These hypothetical materials can then be computationally screened for their functional properties.”

The researchers are now using their machine learning platform to find new functional materials such as dielectric materials. They are also integrating other aspects of experimental constraints into their materials discovery approach.

The research was supported in part by the Royal Society and the Winton Programme for the Physics of Sustainability.

Reference:
Rhys A. Goodall et al. ‘Rapid discovery of stable materials by coordinate-free coarse graining.’ Science Advances (2022). DOI: 10.1126/sciadv.abn4117

Researchers have designed a machine learning method that can predict the structure of new materials with five times the efficiency of the current standard, removing a key roadblock in developing advanced materials for applications such as energy storage and photovoltaics.

Our approach provides an efficient computational approach that can ‘mine’ new stable materials that have never been made before.
Rhys Goodall


Smart lighting system based on quantum dots more accurately reproduces daylight

Long exposure light painting

The researchers, from the University of Cambridge, designed the next-generation smart lighting system using a combination of nanotechnology, colour science, advanced computational methods, electronics and a unique fabrication process.

The team found that by using more than the three primary lighting colours used in typical LEDs, they were able to reproduce daylight more accurately. Early tests of the new design showed excellent colour rendering, a wider operating range than current smart lighting technology, and wider spectrum of white light customisation. The results are reported in the journal Nature Communications.

As the availability and characteristics of ambient light are connected with wellbeing, the widespread availability of smart lighting systems can have a positive effect on human health since these systems can respond to individual mood. Smart lighting can also respond to circadian rhythms, which regulate the daily sleep-wake cycle, so that light is reddish-white in the morning and evening, and bluish-white during the day.

When a room has sufficient natural or artificial light, good glare control, and views of the outdoors, it is said to have good levels of visual comfort. In indoor environments under artificial light, visual comfort depends on how accurately colours are rendered. Since the colour of objects is determined by illumination, smart white lighting needs to be able to accurately express the colour of surrounding objects. Current technology achieves this by using three different colours of light simultaneously.

Quantum dots have been studied and developed as light sources since the 1990s, due to their high colour tunability and colour purity. Thanks to their unique optoelectronic properties, they offer both wide colour controllability and high colour rendering capability.

The Cambridge researchers developed an architecture for quantum-dot light-emitting diodes (QD-LED) based next-generation smart white lighting. They combined system-level colour optimisation, device-level optoelectronic simulation, and material-level parameter extraction.

The researchers produced a computational design framework from a colour optimisation algorithm used for neural networks in machine learning, together with a new method for charge transport and light emission modelling.

The QD-LED system uses multiple primary colours – beyond the commonly used red, green and blue – to more accurately mimic white light. By choosing quantum dots of a specific size – between three and 30 nanometres in diameter – the researchers were able to overcome some of the practical limitations of LEDs and achieve the emission wavelengths they needed to test their predictions.

The team then validated their design by creating a new device architecture of QD-LED based white lighting. The test showed excellent colour rendering, a wider operating range than current technology, and a wide spectrum of white light shade customisation.

The Cambridge-developed QD-LED system showed a correlated colour temperature (CCT) range from 2243K (reddish) to 9207K (bright midday sun), compared with current LED-based smart lights which have a CCT between 2200K and 6500K. The colour rendering index (CRI) – a measure of colours illuminated by the light in comparison to daylight (CRI=100) – of the QD-LED system was 97, compared to current smart bulb ranges, which are between 80 and 91.

The design could pave the way to more efficient, more accurate smart lighting. In an LED smart bulb, the three LEDs must be controlled individually to achieve a given colour. In the QD-LED system, all the quantum dots are driven by a single common control voltage to achieve the full colour temperature range.

“This is a world-first: a fully optimised, high-performance quantum-dot-based smart white lighting system,” said Professor Jong Min Kim from Cambridge’s Department of Engineering, who co-led the research. “This is the first milestone toward the full exploitation of quantum-dot-based smart white lighting for daily applications.”

“The ability to better reproduce daylight through its varying colour spectrum dynamically in a single light is what we aimed for,” said Professor Gehan Amaratunga, who co-led the research. “We achieved it in a new way through using quantum dots. This research opens the way for a wide variety of new human responsive lighting environments.”

The structure of the QD-LED white lighting developed by the Cambridge team is scalable to large-area lighting surfaces, as it is made with a printing process and its control and drive are similar to those in a display. With standard point-source LEDs, which require individual control, this is a more complex task.

The research was supported in part by the European Union and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

 

Reference:
Chatura Samarakoon et al. ‘Optoelectronic System and Device Integration for Quantum-Dot Light-Emitting Diode White Lighting with Computational Design Framework.’ Nature Communications (2022). DOI: 10.1038/s41467-022-31853-9

Researchers have designed smart, colour-controllable white light devices from quantum dots – tiny semiconductors just a few billionths of a metre in size – which are more efficient and have better colour saturation than standard LEDs, and can dynamically reproduce daylight conditions in a single light.

This research opens the way for a wide variety of new human-responsive lighting environments
Gehan Amaratunga


Children with rare genetic disorders more likely to be diagnosed with developmental, behavioural and mental health problems

Toddler's hands touching tree bark

With the advent of rapid whole genome sequencing, children presenting with an intellectual disability or developmental delay are recommended to have their DNA sequenced to identify the underlying genetic cause.

To capitalise on this recent NHS development, researchers at the University of Cambridge, University College London and Cardiff University established IMAGINE ID, a national UK cohort study that aims to discover how genetic changes affect children and young people’s behaviour, in order to inform better care of families and children now and in the future.

Writing in The Lancet Psychiatry today, the researchers have published the results of an analysis of data from almost 2,800 young people with rare genomic variants – changes to their DNA – that are associated with intellectual disability.

Professor Lucy Raymond from the University of Cambridge, the study’s senior author, said: “Thanks to all the families that have taken part in our research, we’ve been able to conduct the largest study to date of the impact of rare genetic variants associated with intellectual disability. What we’ve found from parents is that these children are extremely likely to develop other neurodevelopmental or mental health conditions, which can present additional challenges both to the children and their families.”

All the participants were aged between four and 19 years. Just under three-quarters (74%) had an intellectual disability caused by a duplication or deletion of sections of DNA – a so-called copy number variant (CNV). The remaining young people had a disability caused by a single ‘spelling error’ in their DNA – a change in the A, C, G or T nucleotides – referred to as a single nucleotide variant (SNV).

Compared to the English national population, children in the study were almost 30 times as likely to have been diagnosed as autistic. In the general population, 1.2% of people are diagnosed with the condition compared to 36% of the study participants. Similarly, 22% of the study population were diagnosed with ADHD, compared to 1.6% of the general population, meaning that they were more than 13 times more likely to have the condition.
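The risk ratios quoted here follow directly from the prevalence figures: the rate of diagnosis in the study cohort divided by the rate in the general population. A quick arithmetic check, using the percentages given in the text:

```python
def risk_ratio(study_pct, population_pct):
    """How many times more common a diagnosis is in the study cohort
    than in the general population (both rates as percentages)."""
    return study_pct / population_pct

autism = risk_ratio(36.0, 1.2)   # 36% in the study vs 1.2% nationally
adhd = risk_ratio(22.0, 1.6)     # 22% in the study vs 1.6% nationally
print(round(autism, 2), round(adhd, 2))
```

This reproduces the article's figures: roughly 30 times the national rate for autism and a little under 14 times for ADHD.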

Around one in eight children (12%) had been diagnosed with oppositional defiant disorder, in which children are uncooperative, defiant, and hostile toward others – a rate 4.4 times higher than in the general population.

One in ten (11%) had an anxiety disorder, a 1.5 times increased risk. Rates of childhood depression were significantly lower, at just 0.4% compared with 2.1% of the general population, but this may increase over the next few years as some mental health disorders do not start until later adolescence or early adult life. Almost all of the children (94%) were reported to have at least one significant physical health problem, including disturbed sleep (65%), motor or movement disorders (64%) or seizures (30%).

Dr Jeanne Wolstencroft from Great Ormond Street Institute of Child Health, University College London, said: “Routine genomic testing now allows parents to understand the genetic cause of intellectual disabilities in an increasing number of children but, because so many of these conditions are rare, we still lack information on the impact this has on their children’s future mental health.

“We already know that intellectual disabilities tend to be associated with an increased likelihood of neurodevelopmental conditions, as well as emotional and behavioural difficulties, but we found that where there is an identifiable genetic cause, the likelihood is amplified considerably. This suggests that these children should be provided with early assessment and help where appropriate.”

The team has also shown for the first time that children with intellectual disability caused by a genetic variant inherited from a family member are more likely to come from a more deprived socioeconomic background. This suggests that some parents or family members with the same variant may also have unrecognised difficulties that placed them at a social and educational disadvantage. These children were more likely to be diagnosed with a neuropsychiatric condition and were also more likely to exhibit behavioural difficulties.

Professor David Skuse from Great Ormond Street Institute of Child Health, University College London, said: “We hope this work helps improve the targeting of assessments and interventions to support families at the earliest opportunity. We’d like to see better training for health care providers about the wider use and utility of genetic testing. We have identified its potential value in terms of prioritising children with mental health needs for child mental health services, who are currently hugely limited in the UK.”

The research was funded by the Medical Research Council (part of UK Research & Innovation) and the Medical Research Foundation. Additional support was provided by the NIHR Cambridge Biomedical Resource Centre and the NIHR GOSH BRC.

Reference
Wolstencroft, J et al. Neuropsychiatric risk in children with intellectual disability of genetic origin: IMAGINE - The UK National Cohort Study. Lancet Psychiatry; 4 Aug 2022; DOI: 10.1016/S2215-0366(22)00207-3


A major study of children with intellectual disabilities has highlighted the additional challenges that they often face, including a much-increased likelihood of being diagnosed as autistic, as well as Attention Deficit Hyperactivity Disorder (ADHD) and other mental health difficulties.

Toddler's hands touching tree bark

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Prostate cancer cases risk late detection due to misleading urinary focus

Black man looking out window

Prostate cancer is the most common type of cancer in men. According to Cancer Research UK, over 52,000 men are diagnosed with prostate cancer each year and there are more than 12,000 deaths.

Over three-quarters (78%) of men diagnosed with the disease survive for over ten years, but this proportion has barely changed over the past decade in the UK, largely because the disease is detected at a relatively late stage. In England, for example, nearly half of all prostate cancers are picked up at stage three or four (stage four being the latest stage).

Despite no evidence of a link between urinary symptoms and prostate cancer, national guidelines, health advice and public health campaigns continue to promote this link. In a review published today in BMC Medicine, Cambridge researchers argue that not only is this unhelpful, but it may even deter men from coming forward for early testing and detection of a potentially treatable cancer.

“When most people think of the symptoms of prostate cancer, they think of problems with peeing or needing to pee more frequently, particularly during the night,” said Vincent Gnanapragasam, Professor of Urology at the University of Cambridge and an Honorary Consultant Urologist at Addenbrooke’s Hospital, Cambridge. “This misperception has lasted for decades, despite very little evidence, and it’s potentially preventing us picking up cases at an early stage.”

Prostate enlargement can cause the urinary problems often included in public health messaging, but evidence suggests that this is rarely due to malignant prostate tumours. Rather, research suggests that the prostate is smaller in cases of prostate cancer.  A recent study – the UK PROTECT trial – even went as far as to say that a lack of urinary symptoms may in fact be an indicator of a higher likelihood of cancer.

Screening programmes are one way that cancers are often detected at an early stage, but in the case of prostate cancer, some argue that such programmes risk overwhelming health services and leading to men being treated for relatively benign disease.

Testing for prostate cancer involves a blood test that looks for a protein known as prostate-specific antigen (PSA), which is made only by the prostate gland; however, it is not always accurate. PSA density – the PSA level divided by prostate volume – is significantly more accurate than PSA alone in predicting a positive biopsy and is used in everyday clinical practice.

The researchers point to evidence that there is a misconception that prostate cancer is always symptomatic: a previous study found that 86% of the public associated prostate cancer with symptoms, but only 1% were aware that it could be asymptomatic.

“We urgently need to recognise that the information currently given to the public risks giving men a false sense of security if they don’t have any urinary symptoms,” said Professor Gnanapragasam.

“We need to emphasise that prostate cancer can be a silent or asymptomatic disease, particularly in its curable stages. Waiting for urinary symptoms may mean missing opportunities to catch the disease when it’s treatable.

“Men shouldn’t be afraid to speak to their GP about getting tested, and about the value of a PSA test, especially if they have a history of prostate cancer in their family or have other risk factors such as being of Black or mixed Black ethnicity.”

The researchers say they are not advocating for an immediate screening programme, and acknowledge that changes in messaging could mean more men approaching their GPs for a PSA test, potentially resulting in unnecessary investigations and treatment. However, they argue that there are ways to reduce the risk of this happening. These include the use of algorithms to assess an individual’s risk and whether they need to be referred to a specialist, and for those who are referred, MRI scans could help rule out ‘indolent’ (mild) disease or negative findings, reducing the risks of an unnecessary biopsy.

“We’re calling on organisations such as the NHS, as well as patient charities and the media, to review the current public messaging,” said Professor Gnanapragasam.

“If men were aware that just because they have no symptoms doesn’t necessarily mean they are cancer free, then more might take up offers for tests. This could mean more tumours identified at an earlier stage and reduce the numbers of men experiencing late presentation with incurable disease.”

Reference
Gnanapragasam, VJ, et al. Urinary symptoms and prostate cancer—the misconception that may be preventing earlier presentation and better survival outcomes. BMC Medicine; 4 Aug 2022; DOI: 10.1186/s12916-022-02453-7

Men with early, curable stages of prostate cancer are missing opportunities to have their cancer detected because national guidelines and media health campaigns focus on urinary symptoms despite a lack of scientific evidence, say experts at the University of Cambridge.



Racial discrimination linked to increased risk of premature babies

Black woman holding newborn baby in hospital bed

The findings add to growing evidence that racial discrimination is a risk factor for poor health outcomes, say the researchers.

For several decades, race has been recognised as a social determinant of health and a risk factor for numerous diseases. The evidence increasingly suggests that social, environmental, economic and political factors are fundamental drivers of health inequities, and that it is often racial discrimination or racism, rather than race, that is the root cause of racial disparities in health outcomes.

For example, maternal death rates among Black and Indigenous women in the USA are two to three times higher than those of white women. Similarly, in the UK, maternal death rates are two to four times higher among Black and Asian women compared to death rates among white women.

To explore the existing patterns of racial discrimination and adverse pregnancy outcomes, the researchers carried out a systematic review and meta-analysis, pooling and analysing data from the available evidence. This approach allowed them to bring together existing and sometimes contradictory or under-powered studies to provide more robust conclusions. Their results are published in the open access journal BMJ Global Health.

The team searched eight electronic databases, looking for relevant studies on self-reported racial discrimination and premature birth (that is, before 37 weeks), low and very low birthweight, small-for-gestational age, and high blood pressure associated with pregnancy.

In all, the results of 24 studies were included in the final analysis. The majority of studies (20) were carried out in the USA. Study participants were of different racial and ethnic backgrounds, including Black or African American, Hispanic, non-Hispanic white, Māori, Pacific, Asian, Aboriginal Australian, Romani, indigenous German and Turkish.

The pooled analysis showed that the experience of racial discrimination was significantly associated with increased risk of premature birth. Women who experienced racial discrimination were 40% more likely to give birth prematurely. When low quality studies were excluded, the odds of a premature birth were reduced, but still 31% higher in those experiencing racial discrimination.

While not statistically significant, the results also suggest that the experience of racial discrimination may increase the chance of giving birth to a small-for-gestational age baby by 23%.
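The pooled figures above come from meta-analysis of per-study effect sizes. As a hedged illustration of the underlying arithmetic (a fixed-effect inverse-variance pooling of log odds ratios, not the authors' exact model), the calculation can be sketched as:

```python
import math

def pool_odds_ratios(odds_ratios, std_errors):
    """Fixed-effect inverse-variance pooling of per-study odds ratios.

    odds_ratios: per-study ORs; std_errors: standard errors of each log(OR).
    Each study is weighted by 1/SE^2, the pooling is done on the log scale,
    and the result is exponentiated back to an odds ratio.
    """
    log_ors = [math.log(o) for o in odds_ratios]
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled_log = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    return math.exp(pooled_log)

# Hypothetical inputs: two studies reporting the same OR pool to that OR,
# regardless of their relative precision.
pooled = pool_odds_ratios([1.4, 1.4], [0.1, 0.2])
```

In practice meta-analyses of this kind often use random-effects models to allow for between-study heterogeneity; the fixed-effect version above is the simplest case.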

Co-first-author Jeenan Kaiser, who did her MPhil in Public Health at the University of Cambridge and is currently a medical student at the University of Alberta, said: “Racial discrimination impacts the health of racialised communities not only in direct and intentional ways, but also in how it shapes an individual’s experiences, opportunities, and quality of life. These are fundamentally driven by structural and social determinants of health.

“While our study focused on its impact on pregnancy outcomes, it is becoming increasingly evident that it negatively impacts a myriad of health outcomes. Efforts to counter racial discrimination and promote health must focus on systemic policy changes to create sustainable change.”

Co-first author Kim van Daalen, a Gates Cambridge Scholar and PhD candidate at the Department of Public Health and Primary Care, University of Cambridge, said: “Dismantling structures and policies that enable institutional and interpersonal racial discrimination, underlying racial and ethnic disparities in health and intersecting social inequalities, is essential to improve overall health in societies. Partnerships of health care professionals with community-based reproductive justice and women’s health organisations who work in this area can improve health for racialised women in a community-centred way.”

The researchers point out that racial discrimination impacts what health services and resources are available, such as referral to specialist care, access to health insurance and access to public health services.

Co-author Dr Samuel Kebede, who did his MPhil in Epidemiology at the University of Cambridge as a Gates Cambridge Scholar and is currently at Montefiore Health System/Albert Einstein College of Medicine in New York City said: “Historically there have been countless examples of where medicine and public health have been furthered by the subjugation and experimentation of Black and indigenous people. But the influence of structural racism is still present within the healthcare system today. From segregated healthcare for uninsured and under-insured people of colour in the United States, to the global disparity in COVID-19 vaccinations, structures continue to perpetuate inequities. Health professionals can play a vital role in dismantling these systems.”

Many of the studies were of limited quality and included few marginalised racial or ethnic groups other than African Americans; as such, their applicability to other ethnic groups and cultural settings may be limited. However, the researchers argue that when pooled, the data clearly demonstrate the negative impact of racial discrimination on pregnancy outcomes.

Reference
van Daalen, KR, & Kaiser, J et al. Racial discrimination and adverse pregnancy outcomes: a systematic review and meta-analysis. BMJ Global Health; 3 Aug 2022; DOI: 10.1136/bmjgh-2022-009227

Women who experience racial discrimination on the basis of their ethnicity, race or nationality are at increased risk of giving birth prematurely, according to a team led by researchers at the University of Cambridge.



Large number of stem cell lines carry significant DNA damage, say researchers

Sun

Stem cells are a special type of cell that can be programmed to become almost any type of cell within the body. They are currently used for studies on the development of organs and even the early stages of the embryo.

Increasingly, researchers are turning to stem cells as ways of developing new treatments, known as cell-based therapies. Other potential applications include programming stem cells to grow into nerve cells to replace those lost to neurodegeneration in diseases such as Parkinson’s.

Originally, stem cells were derived from embryos, but it is now possible to derive stem cells from adult skin cells. These so-called induced pluripotent stem cells (iPSCs) have now been generated from a range of tissues, including blood, which is increasing in popularity due to its ease of derivation.

However, researchers at the University of Cambridge and Wellcome Sanger Institute have discovered a problem with stem cell lines derived from both skin cells and blood. When they examined the genomes of the stem cell lines in detail, they found that nearly three quarters carried substantial damage to their DNA that could compromise their use both in research and, crucially, in cell-based therapies. Their findings represent the largest genetic study to date of iPSCs and are published today in Nature Genetics.

DNA is made up of three billion pairs of nucleotides, molecules represented by the letters A, C, G and T. Over time, damage to our DNA, for example from ultraviolet radiation, can lead to mutations – a letter C might change to a letter T, for example. ‘Fingerprints’ left on our DNA can reveal what is responsible for this damage. As these mutations accumulate, they can have a profound effect on the function of cells and in some cases lead to tumours.

Dr Foad Rouhani, who carried out the work while at the University of Cambridge and the Wellcome Sanger Institute, said: “We noticed that some of the iPS cells that we were generating looked really different from each other, even when they were derived from the same patient and derived in the same experiment. The most striking thing was that pairs of iPS cells would have a vastly different genetic landscape – one line would have minimal damage and the other would have a level of mutations more commonly seen in tumours. One possible reason for this could be that a cell on the surface of the skin is likely to have greater exposure to sunlight than a cell below the surface and therefore eventually may lead to iPS cells with greater levels of genomic damage.”

The researchers used a common technique known as whole genome sequencing to inspect the entire DNA of stem cell lines in different cohorts, including the HipSci cohort at the Wellcome Sanger Institute, and discovered that as many as 72% of the lines showed signs of major UV damage.

Professor Serena Nik-Zainal from the Department of Medical Genetics at the University of Cambridge said: “Almost three-quarters of the cell lines had UV damage. Some samples had an enormous amount of mutations – sometimes more than we find in tumours.  We were all hugely surprised to learn this, given that most of these lines were derived from skin biopsies of healthy people.”

They decided to turn their attention to cell lines not derived from skin and focused on blood derived iPSCs as these are becoming increasingly popular due to the ease of obtaining blood samples. They found that while these blood-derived iPSCs, too, carried mutations, they had lower levels of mutations than skin-derived iPS cells and no UV damage. However, around a quarter carried mutations in a gene called BCOR, an important gene in blood cancers.

To investigate whether these BCOR mutations had any functional impact, they differentiated the iPSCs and turned them into neurons, tracking their progress along the way.

Dr Rouhani said: “What we saw was that there were problems in generating neurons from iPSCs that have BCOR mutations – they had a tendency to favour other cell types instead. This is a significant finding, particularly if one is intending to use those lines for neurological research.”

When they examined the blood samples, they discovered that the BCOR mutations were not present within the patient: instead, the process of culturing cells appears to increase the frequency of these mutations, which may have implications for other researchers working with cells in culture.

Scientists typically screen their cell lines for problems at the chromosomal level – for example by checking to see that the requisite 23 pairs of chromosomes are present. However, this would not be sufficiently detailed to pick up the potentially major problems that this new study has identified. Importantly, without looking in detail at the genomes of these stem cells, researchers and clinicians would be unaware of the underlying damage present within the cell lines they are working with.

“The DNA damage that we saw was at a nucleotide level,” says Professor Nik-Zainal. “If you think of the human genome as like a book, most researchers would check the number of chapters and be satisfied that there were none missing. But what we saw was that even with the correct number of chapters in place, lots of the words were garbled.”

Fortunately, says Professor Nik-Zainal, there is a way round the problem: using whole genome sequencing to look in detail for the errors at the outset.

“The cost of whole genome sequencing has dropped dramatically in recent years to around £500 per sample, though it's the analysis and interpretation that's the hardest bit. If a research question involves cell lines and cellular models, and particularly if we're going to introduce these lines back into patients, we may have to consider sequencing the genomes of these lines to understand what we are dealing with and get a sense of whether they are suitable for use.”

Dr Rouhani adds: “In recent years we have been finding out more and more about how even our healthy cells carry many mutations and therefore it is not a realistic aim to produce stem cell lines with zero mutations. The goal should be to know as much as possible about the nature and extent of the DNA damage to make informed choices about the ultimate use of these stem cell lines.

“If a line is to be used for cell based therapies in patients for example, then we need to understand more about the implications of these mutations so that both clinicians and patients are better informed of the risks involved in the treatment.”

The research was funded by Cancer Research UK, the Medical Research Council and Wellcome, and supported by NIHR Cambridge Biomedical Research Centre and the UK Regenerative Medicine Platform.

Reference
Rouhani, FJ, Zou, X, Danecek, P, et al. Substantial somatic genomic variation and selection for BCOR mutations in human induced pluripotent stem cells; Nat Gen; 11 Aug 2022; DOI: 10.1038/s41588-022-01147-3

DNA damage caused by factors such as ultraviolet radiation affect nearly three-quarters of all stem cell lines derived from human skin cells, say Cambridge researchers, who argue that whole genome sequencing is essential for confirming if cell lines are usable.



Just over half of six-year-olds in Britain meet physical activity guidelines

Group of children playing tug of war

Physical activity is beneficial for our physical and mental health, but activity levels tend to decrease across childhood and adolescence. Current UK physical activity guidelines recommend that children and young people aged 5 to 18 years do an average of 60 minutes of moderate-to-vigorous physical activity (such as playing in the park or physical education) per day across the week. It is also recommended that all children minimise extended periods of sedentary behaviour (such as sitting watching TV).

To investigate how much activity children do in their early primary school years, researchers from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge and the MRC Lifecourse Epidemiology Centre at the University of Southampton provided 712 six-year-olds with Actiheart accelerometers, which measured their heart rate and movement. The children, who had been recruited as part of the ongoing Southampton Women’s Survey, wore these continually for an average of six days.

The results of the study are published today in the Journal of Physical Activity & Health.

At age six, children were sedentary for a daily average of more than five hours (316 minutes) and engaged in over 7.5 hours (457 minutes) of low-level physical activity and just over an hour (65 minutes) of moderate-to-vigorous physical activity.

Just over half of the children (53%) met the current UK recommended guidelines, with boys being more likely to reach the target than girls (63% of boys vs 42% of girls).
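Because the guideline is an average of 60 minutes per day across the week, whether a child meets it can be checked from a week of daily minutes; a minimal sketch (hypothetical data, not the study's accelerometer-processing pipeline):

```python
def meets_uk_guideline(daily_mvpa_minutes):
    """True if mean daily moderate-to-vigorous physical activity (MVPA)
    across the recorded days is at least 60 minutes."""
    return sum(daily_mvpa_minutes) / len(daily_mvpa_minutes) >= 60

# Quieter weekdays can be offset by a more active weekend:
week = [45, 50, 55, 60, 65, 80, 75]  # mean just over 61 minutes
active_enough = meets_uk_guideline(week)
```

Averaging across the week, rather than requiring 60 minutes every single day, is what allows day-to-day variation like this to still count as meeting the guideline.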

Dr Esther van Sluijs from the MRC Epidemiology Unit at Cambridge said: “Using accelerometers, we were able to get a much better idea of how active children were and we found that just over a half of six-year-olds were getting the recommended amount of physical activity. But this means that almost half of British children in this age group are not regularly active, which we know is important for their wellbeing and their performance at school.”

When the researchers analysed activity levels by time of day, they found that girls engaged in less moderate-to-vigorous physical activity during the school day at age six. Possible explanations are that girls wear skirts, which may make physical activity more challenging, or that they choose less active options during break times.  

The researchers were able to look at longitudinal data from some children – that is, data recorded over a period of time rather than just a snapshot – and found that compared to at age four, at age six children became more sedentary (on average, around 30 minutes per day more compared to when they were four), but also engaged in an additional seven minutes per day of moderate-to-vigorous physical activity.

Dr Kathryn Hesketh from the MRC Epidemiology Unit at Cambridge added: “This is something of a double-edged sword: children appear to do more moderate-to-vigorous physical activity when they start formal schooling, which is really positive, but they also spend more time sedentary. This may in part be because of the structure of the school day, so we may want to look at ways to reduce sedentary time when children are younger, to prevent that behaviour becoming habitual.”

Professor Keith Godfrey from the University of Southampton commented: “These analyses indicate that new initiatives to promote physical activity must consider the lower activity levels in girls and at weekends. The time when children transition into formal schooling is an important opportunity to ensure a much higher proportion achieve recommended levels of activity.”

While the study is based on detailed data collected up to 2012, evidence from national questionnaire-based surveys suggests that children's patterns of activity changed little in the years leading up to the COVID-19 pandemic, and it is widely recognised that rates of meeting the Chief Medical Officer's guidelines were even lower during the pandemic.

The work was largely supported by Wellcome and the Medical Research Council.

Reference
Hesketh, KR et al. Activity behaviours in British 6-year-olds: cross-sectional associations and longitudinal change during the school transition. Journal of Physical Activity & Health; 11 Aug 2022; DOI: 10.1123/jpah.2021-0718

All averages quoted are means.

Fifty-three percent of six-year-olds met the recommended daily guidelines for moderate-to-vigorous physical activity in a study carried out pre-pandemic by researchers at the universities of Cambridge and Southampton.



Algorithm learns to correct 3D printing errors for different parts, materials and systems

Example image of the 3D printer nozzle used by the machine learning algorithm to detect and correct errors in real time.

The engineers, from the University of Cambridge, developed a machine learning algorithm that can detect and correct a wide variety of different errors in real time, and can be easily added to new or existing machines to enhance their capabilities. 3D printers using the algorithm could also learn how to print new materials by themselves. Details of their low-cost approach are reported in the journal Nature Communications.

3D printing has the potential to revolutionise the production of complex and customised parts, such as aircraft components, personalised medical implants, or even intricate sweets, and could also transform manufacturing supply chains. However, it is also vulnerable to production errors, from small-scale inaccuracies and mechanical weaknesses through to total build failures.

Currently, the way to prevent or correct these errors is for a skilled worker to observe the process. The worker must recognise an error (a challenge even for the trained eye), stop the print, remove the part, and adjust settings for a new part. If a new material or printer is used, the process takes more time as the worker learns the new setup. Even then, errors may be missed as workers cannot continuously observe multiple printers at the same time, especially for long prints.

“3D printing is challenging because there's a lot that can go wrong, and so quite often 3D prints will fail,” said Dr Sebastian Pattinson from Cambridge’s Department of Engineering, the paper’s senior author. “When that happens, all of the material and time and energy that you used is lost.”

Engineers have been developing automated 3D printing monitoring, but existing systems can only detect a limited range of errors in one part, one material and one printing system.

“What’s really needed is a ‘driverless car’ system for 3D printing,” said first author Douglas Brion, also from the Department of Engineering. “A driverless car would be useless if it only worked on one road or in one town – it needs to learn to generalise across different environments, cities, and even countries. Similarly, a ‘driverless’ printer must work for multiple parts, materials, and printing conditions.”

Brion and Pattinson say the algorithm they’ve developed could be the ‘driverless car’ engineers have been looking for.

“What this means is that you could have an algorithm that can look at all of the different printers that you're operating, constantly monitoring and making changes as needed – basically doing what a human can't do,” said Pattinson.

The researchers trained a deep learning computer vision model by showing it around 950,000 images captured automatically during the production of 192 printed objects. Each of the images was labelled with the printer’s settings, such as the speed and temperature of the printing nozzle and flow rate of the printing material. The model also received information about how far those settings were from good values, allowing the algorithm to learn how errors arise.

“Once trained, the algorithm can figure out just by looking at an image which setting is correct and which is wrong – is a particular setting too high or too low, for example, and then apply the appropriate correction,” said Pattinson. “And the cool thing is that printers that use this approach could be continuously gathering data, so the algorithm could be continually improving as well.”
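As a rough illustration of the idea – not the authors' code – a multi-head network shares one image "backbone" and attaches a separate classification head per printer setting, each predicting too low / good / too high. The sketch below uses random, untrained weights purely to show the structure:

```python
import numpy as np

rng = np.random.default_rng(0)

SETTINGS = ["flow_rate", "temperature", "speed"]
N_CLASSES = 3  # too low / good / too high

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MultiHeadClassifier:
    """Shared image embedding feeding one classification head per setting."""

    def __init__(self, n_pixels=64 * 64, n_features=32):
        # Stand-in for a trained CNN backbone: a fixed random projection.
        self.backbone = rng.normal(size=(n_pixels, n_features))
        self.heads = {s: rng.normal(size=(n_features, N_CLASSES)) for s in SETTINGS}

    def predict(self, image):
        features = np.tanh(image.flatten() @ self.backbone)  # shared embedding
        return {s: softmax(features @ W) for s, W in self.heads.items()}

model = MultiHeadClassifier()
probs = model.predict(rng.normal(size=(64, 64)))
# Each head yields a probability distribution over {too low, good, too high}.
```

In the real system, each head's verdict would then drive a corrective adjustment to the corresponding printer setting.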

Using this approach, Brion and Pattinson were able to make an algorithm that is generalisable – in other words, it can be applied to identify and correct errors in unfamiliar objects or materials, or even in new printing systems.

“When you’re printing with a nozzle, then no matter the material you’re using – polymers, concrete, ketchup, or whatever – you can get similar errors,” said Brion. “For example, if the nozzle is moving too fast, you often end up with blobs of material, or if you’re pushing out too much material, then the printed lines will overlap forming creases.

“Errors that arise from similar settings will have similar features, no matter what part is being printed or what material is being used. Because our algorithm learned general features shared across different materials, it could say ‘Oh, the printed lines are forming creases, therefore we are likely pushing out too much material’.”

As a result, an algorithm trained using only one kind of material and printing system was able to detect and correct errors in different materials – from engineering polymers to ketchup and mayonnaise – on a different kind of printing system.

In future, the trained algorithm could be more efficient and reliable than a human operator at spotting errors. This could be important for quality control in applications where component failure could have serious consequences.

With the support of Cambridge Enterprise, the University’s commercialisation arm, Brion has formed Matta, a spin-out company that will develop the technology for commercial applications.

“We’re turning our attention to how this might work in high-value industries such as the aerospace, energy, and automotive sectors, where 3D printing technologies are used to manufacture high-performance and expensive parts,” said Brion. “It might take days or weeks to complete a single component at a cost of thousands of pounds. An error that occurs at the start might not be detected until the part is completed and inspected. Our approach would spot the error in real time, significantly improving manufacturing productivity.”

The research was supported by the Engineering and Physical Sciences Research Council, Royal Society, Academy of Medical Sciences, and the Isaac Newton Trust.

The full dataset used to train the AI is freely available online. 

Reference:
Douglas A. J. Brion & Sebastian W. Pattinson. ‘Generalisable 3D printing error detection and correction via multi-head neural networks.’ Nature Communications (2022). DOI: 10.1038/s41467-022-31985-y

Engineers have created intelligent 3D printers that can quickly detect and correct errors, even in previously unseen designs, or unfamiliar materials like ketchup and mayonnaise, by learning from the experiences of other machines.

Once trained, the algorithm can figure out just by looking at an image which setting is correct and which is wrong
Sebastian Pattinson
Example image of the 3D printer nozzle used by the machine learning algorithm to detect and correct errors in real time.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

Risk of volcano catastrophe ‘a roll of the dice’, say experts

The world is “woefully underprepared” for a massive volcanic eruption and the likely repercussions on global supply chains, climate and food, according to experts from the University of Cambridge’s Centre for the Study of Existential Risk (CSER).

In an article published in the journal Nature, they say there is a “broad misconception” that risks of major eruptions are low, and describe current lack of governmental investment in monitoring and responding to potential volcano disasters as “reckless”. 

However, the researchers argue that steps can be taken to protect against volcanic devastation – from improved surveillance to increased public education and magma manipulation – and the resources needed to do so are long overdue.

“Data gathered from ice cores on the frequency of eruptions over deep time suggests there is a one-in-six chance of a magnitude seven explosion in the next one hundred years. That’s a roll of the dice,” said article co-author and CSER researcher Dr Lara Mani, an expert in global risk. 

“Such gigantic eruptions have caused abrupt climate change and collapse of civilisations in the distant past.”

Mani compares the risk of a giant eruption to that of a 1km-wide asteroid crashing into Earth. Such events would have similar climatic consequences, but the likelihood of a volcanic catastrophe is hundreds of times higher than the combined chances of an asteroid or comet collision.

“Hundreds of millions of dollars are pumped into asteroid threats every year, yet there is a severe lack of global financing and coordination for volcano preparedness,” Mani said. “This urgently needs to change. We are completely underestimating the risk to our societies that volcanoes pose.”

An eruption in Tonga in January was the largest ever instrumentally recorded. The researchers argue that if it had gone on longer, released more ash and gas, or occurred in an area full of critical infrastructure – such as the Mediterranean – then global shock waves could have been devastating.

“The Tonga eruption was the volcanic equivalent of an asteroid just missing the Earth, and needs to be treated as a wake-up call,” said Mani. 

The CSER experts cite recent research estimating the frequency of major eruptions by analysing traces of sulphur spikes in ancient ice samples. An eruption ten to a hundred times larger than the Tonga blast occurs roughly once every 625 years – twice as often as had been previously thought.
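The headline one-in-six figure can be loosely sanity-checked with a simple Poisson model, assuming such eruptions occur independently at the ice-core rate of roughly one per 625 years (an illustration only, not the authors' calculation):

```python
import math

# Ice-core estimate: roughly one magnitude-7-scale eruption per 625 years.
rate_per_year = 1 / 625
window_years = 100

# Probability of at least one such eruption in the window,
# assuming eruptions follow a Poisson process.
p_at_least_one = 1 - math.exp(-rate_per_year * window_years)
print(f"{p_at_least_one:.3f}")  # ≈ 0.148
```

The result, roughly 15%, is broadly consistent with the quoted "one-in-six chance" over a century.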

“The last magnitude seven eruption was in 1815 in Indonesia,” said co-author Dr Mike Cassidy, a volcano expert and visiting CSER researcher, now based at the University of Birmingham.

“An estimated 100,000 people died locally, and global temperatures dropped by a degree on average, causing mass crop failures that led to famine, violent uprisings and epidemics in what was known as the year without a summer,” he said.

“We now live in a world with eight times the population and over forty times the level of trade. Our complex global networks could make us even more vulnerable to the shocks of a major eruption.”

Financial losses from a large magnitude eruption would be in the multi-trillions, and on a comparable scale to the pandemic, say the experts.

Mani and Cassidy outline steps they say need to be taken to help forecast and manage the possibility of a planet-altering eruption, and help mitigate damage from smaller, more frequent eruptions.

These include more accurate pinpointing of risks. We only know the locations of a handful of the 97 eruptions classed as large magnitude on the Volcanic Explosivity Index over the last 60,000 years. This means there could be dozens of dangerous volcanoes dotted the world over with the potential for extreme destruction, about which humanity has no clue.

“We may not know about even relatively recent eruptions due to a lack of research into marine and lake cores, particularly in neglected regions such as Southeast Asia,” said Cassidy. “Volcanoes can lie dormant for a long time, but still be capable of sudden and extraordinary destruction.”

Monitoring must be improved, say the CSER experts. Only 27% of eruptions since 1950 have had a seismometer anywhere near them, and only a third of those again have had their data fed into the global database for “volcanic unrest”.

“Volcanologists have been calling for a dedicated volcano-monitoring satellite for over twenty years,” said Mani. “Sometimes we have to rely on the generosity of private satellite companies for rapid imagery.”

The experts also call for increased research into volcano “geoengineering”. This includes the need to study means of countering aerosols released by a massive eruption, which could lead to a “volcanic winter”. They also say that work to investigate manipulating pockets of magma beneath active volcanoes should be undertaken.

Added Mani: “Directly affecting volcanic behaviour may seem inconceivable, but so did the deflection of asteroids until the formation of the NASA Planetary Defense Coordination Office in 2016. The risk of a massive eruption that devastates global society is significant. The current underinvestment in responding to this risk is simply reckless.”

While funding is pumped into preventing low-probability scenarios such as asteroid collision, the far more likely threat of a large volcanic eruption is all but ignored – despite much that could be done to reduce the risks, say researchers.

The risk of a massive eruption that devastates global society is significant
Lara Mani
Mount Rinjani in Indonesia, which had one of the largest eruptions in the last millennium in 1257 (magnitude 7).

Medieval monks were ‘riddled with worms’, study finds

A new analysis of remains from medieval Cambridge shows that local Augustinian friars were almost twice as likely as the city’s general population to be infected by intestinal parasites.

This is despite most Augustinian monasteries of the period having latrine blocks and hand-washing facilities, unlike the houses of ordinary working people.

Researchers from the University of Cambridge’s Department of Archaeology say the difference in parasitic infection may be down to monks manuring crops in friary gardens with their own faeces, or purchasing fertiliser containing human or pig excrement.

The study, published today in the International Journal of Paleopathology, is the first to compare parasite prevalence in people from the same medieval community who were living different lifestyles, and so might have differed in their infection risk. 

The population of medieval Cambridge consisted of residents of monasteries, friaries and nunneries of various major Christian orders, along with merchants, traders, craftsmen, labourers, farmers, and staff and students at the early university.

Cambridge archaeologists investigated samples of soil taken from around the pelvises of adult remains from the former cemetery of All Saints by the Castle parish church, as well as from the grounds where the city’s Augustinian Friary once stood.

Most of the parish church burials date from the 12th–14th centuries, and those interred were primarily of lower socio-economic status, mainly agricultural workers.

The Augustinian friary in Cambridge was an international study house, known as a studium generale, where clergy from across Britain and Europe would come to read manuscripts. It was founded in the 1280s and lasted until 1538 before suffering the fate of most English monasteries: closed or destroyed as part of Henry VIII’s break with the Roman Church.  

The researchers tested 19 monks from the friary grounds and 25 locals from All Saints cemetery, and found that 11 of the friars (58%) were infected by worms, compared with just eight of the general townspeople (32%).
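The "almost twice as likely" comparison follows directly from the reported counts, as this small arithmetic check shows:

```python
# Counts reported in the study.
friars_infected, friars_total = 11, 19
town_infected, town_total = 8, 25

friar_prev = friars_infected / friars_total  # ≈ 0.58 (58%)
town_prev = town_infected / town_total       # = 0.32 (32%)

prevalence_ratio = friar_prev / town_prev
print(round(prevalence_ratio, 2))  # ≈ 1.81 – "almost twice as likely"
```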

They say these rates are likely the minimum, and that the actual number of infections would have been higher, since some traces of worm eggs in the pelvic sediment would have been destroyed over time by fungi and insects.

The 32% prevalence of parasites among townspeople is in line with studies of medieval burials in other European countries, suggesting that this figure was not unusually low – rather, the infection rate at the friary was remarkably high.

“The friars of medieval Cambridge appear to have been riddled with parasites,” said study lead author Dr Piers Mitchell from Cambridge’s Department of Archaeology. “This is the first time anyone has attempted to work out how common parasites were in people following different lifestyles in the same medieval town.”

Cambridge researcher Tianyi Wang, who did the microscopy to spot the parasite eggs, said: “Roundworm was the most common infection, but we found evidence for whipworm infection as well. These are both spread by poor sanitation.”

Standard sanitation in medieval towns relied on the cesspit toilet: holes in the ground used for faeces and household waste. In monasteries, however, running water systems were a common feature – including to rinse out the latrine – although that has yet to be confirmed at the Cambridge site, which is only partly excavated. 

Not all people buried in Augustinian friaries were actually clergy, as wealthy people from the town could pay to be interred there. However, the team could tell which graves belonged to friars from the remains of their clothing.

“The friars were buried wearing the belts they wore as standard clothing of the order, and we could see the metal buckles at excavation,” said Craig Cessford of the Cambridge Archaeological Unit.

As roundworm and whipworm are spread by poor sanitation, researchers argue that the difference in infection rates between the friars and the general population must have been due to how each group dealt with their human waste.

“One possibility is that the friars manured their vegetable gardens with human faeces, not unusual in the medieval period, and this may have led to repeated infection with the worms,” said Mitchell.

Medieval records reveal how Cambridge residents may have understood parasites such as roundworm and whipworm. John Stockton, a medical practitioner in Cambridge who died in 1361, left a manuscript to Peterhouse college that included a section on De Lumbricis (‘on worms’).

It notes that intestinal worms are generated by excess of various kinds of mucus: “Long round worms form from an excess of salt phlegm, short round worms from sour phlegm, while short and broad worms came from natural or sweet phlegm.”

The text prescribes “bitter medicinal plants” such as aloe and wormwood, but recommends they are disguised with “honey or other sweet things” to help the medicine go down.

Another text – Tabula medicine – found favour with leading Cambridge doctors of the 15th century, and suggests remedies as recommended by individual Franciscan monks, such as Symon Welles, who advocated mixing a powder made from moles into a curative drink.

Overall, those buried in medieval England’s monasteries had lived longer than those in parish cemeteries, according to previous research, perhaps due to a more nourishing diet, a luxury of wealth. 

Research examining traces of parasites in the remains of medieval Cambridge residents suggests that local friars were almost twice as likely as ordinary working townspeople to have intestinal worms – despite monasteries of the period having far more sanitary facilities.  

One possibility is that the friars manured their vegetable gardens with human faeces
Piers Mitchell
Augustinian friars being excavated by the Cambridge Archaeological Unit.

Pheasant meat sold for food found to contain many tiny shards of toxic lead

A study has found that pheasants killed by lead shot contain many fragments of lead too small to detect by eye or touch, and too distant from the shot to be removed without throwing away a large proportion of otherwise useable meat.

Lead fragments often form when lead shotgun pellets hit the bodies of gamebirds. The fragments become lodged deep within the meat.

Researchers examined the carcasses of eight wild-shot common pheasants, killed on a farmland shoot using lead shotgun ammunition and on sale in a UK butcher’s shop. They found small lead fragments embedded in every pheasant, in addition to lead shotgun pellets in seven of them.

The researchers found up to 10mg of tiny lead shards per pheasant, all of which were much too small to be detected by eye or by touch.

Lead is toxic to humans when absorbed by the body – there is no known safe level of exposure. Lead accumulates in the body over time and can cause long-term harm, including increased risk of cardiovascular disease and kidney damage in adults. It is known to lower IQ in young children, and affect the neurological development of unborn babies.

“While lead gunshot continues to be used for hunting, people who eat pheasants and other similar gamebirds are very likely to be also consuming a lot of tiny lead fragments,” said Professor Rhys Green in the University of Cambridge’s Department of Zoology, and first author of the study.

An earlier study in rats showed that when consumed, more lead is absorbed into the body from smaller fragments than from larger ones.

“It seems to have been widely assumed in the past that a lead shot embedded in a pheasant carcass remained intact, and could be removed cleanly before the pheasant was eaten – removing any health risk. Our study has shown the extent to which this is really not the case,” said Green.

He added: “By eating pheasant, people are also unwittingly eating lead, which is toxic.”

“One pheasant is a reasonable meal for two or three people. Consuming this much lead occasionally wouldn’t be a great cause for concern – but we know that there are thousands of people in the UK who eat game meat, often pheasant, every week.”
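Taken together, the reported figures imply a rough upper bound on per-serving exposure from the tiny fragments alone – a back-of-envelope illustration, not a formal risk assessment:

```python
# Upper figure reported: up to 10 mg of tiny lead fragments per pheasant,
# shared between the "two or three people" one bird reasonably feeds.
max_fragment_lead_mg = 10.0
per_person_mg = {n: max_fragment_lead_mg / n for n in (2, 3)}

for n, mg in per_person_mg.items():
    print(f"{n} diners: up to {mg:.1f} mg of lead fragments each")
```

Several milligrams per serving, potentially every week for regular game eaters, is the exposure pattern the researchers highlight.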

Around 11,000 tonnes of meat from wild-shot gamebirds, mostly pheasant, are eaten in the UK every year. Virtually all pheasants shot in the UK for human consumption are killed using lead shot.

The researchers used a high-resolution CT (computerised tomography) scanner to locate the lead fragments in the pheasant meat in three dimensions, and measure their size and weight. The meat was then dissolved, allowing the larger fragments to be extracted and analysed further to confirm they were lead.

An average of 3.5 lead pellets and 39 lead fragments less than 1mm wide were detected per pheasant. The smallest fragments were 0.07mm wide – at the limit of resolution of the CT scanner for specimens of this size – and the researchers say it is likely that even smaller fragments were also present.

The lead pieces were widely distributed within the birds’ tissues and some of the small fragments were over 50mm from the nearest lead shot pellet.

The results are published today in the journal PLOS ONE.

“It’s rare for people eating game meat to accidentally eat a whole lead shot, because they’re cautious about damaging their teeth and know to check for lead shotgun pellets in the meat. But the lead fragments we found in pheasant carcasses were so tiny and widely distributed that it’s very unlikely they would be detected and removed,” said Green.

There are no UK or EU regulations about the maximum allowable levels of lead in human food from wild-shot game animals. This is in contrast with strict maximum levels for lead in many other foods including meat from cattle, sheep, pigs and poultry, and shellfish harvested from the wild.

Steel shotgun pellets are a practical alternative to lead, and their use in place of lead for hunting is recommended by UK shooting organisations. But there is very little evidence of a voluntary switch away from lead being made. The UK Health & Safety Executive is currently preparing a case for banning the use of lead ammunition for hunting in the UK, and the European Chemicals Agency is doing the same for Europe.

Other game including partridge, grouse and rabbit is also mainly shot using lead shotgun pellets, and wild deer are shot using lead bullets. Hunters often remove the guts of deer carcasses to make them lighter to carry, and the discarded guts – which often contain many bullet fragments – are eaten by wildlife, which then also suffer the harmful effects of consuming lead.

This research was funded by The Royal Society for the Protection of Birds.

Reference

Green, R.E. et al. ‘Implications for food safety of the size and location of fragments of lead shotgun pellets embedded in hunted carcasses of small game animals intended for human consumption.’ PLOS ONE, August 2022. DOI: 10.1371/journal.pone.0268089

Eating pheasant killed using lead shot is likely to expose consumers to raised levels of lead in their diet, even if the meat is carefully prepared to remove the shotgun pellets and the most damaged tissue.

By eating pheasant, people are also unwittingly eating lead, which is toxic.
Professor Rhys Green
Pheasant

Machine learning algorithm predicts how to get the most out of electric vehicle batteries

People charging their electric cars at charging station

The researchers, from the University of Cambridge, say their algorithm could help drivers, manufacturers and businesses get the most out of the batteries that power electric vehicles by suggesting routes and driving patterns that minimise battery degradation and charging times.

The team developed a non-invasive way to probe batteries and get a holistic view of battery health. These results were then fed into a machine learning algorithm that can predict how different driving patterns will affect the future health of the battery.

If developed commercially, the algorithm could be used to recommend routes that get drivers from point to point in the shortest time without degrading the battery, for example, or recommend the fastest way to charge the battery without causing it to degrade. The results are reported in the journal Nature Communications.

The health of a battery, whether it’s in a smartphone or a car, is far more complex than a single number on a screen. “Battery health, like human health, is a multi-dimensional thing, and it can degrade in lots of different ways,” said first author Penelope Jones, from Cambridge’s Cavendish Laboratory. “Most methods of monitoring battery health assume that a battery is always used in the same way. But that’s not how we use batteries in real life. If I’m streaming a TV show on my phone, it’s going to run down the battery a whole lot faster than if I’m using it for messaging. It’s the same with electric cars – how you drive will affect how the battery degrades.”

“Most of us will replace our phones well before the battery degrades to the point that it’s unusable, but for cars, the batteries need to last for five, ten years or more,” said Dr Alpha Lee, who led the research. “Battery capacity can change drastically over that time, so we wanted to come up with a better way of checking battery health.”

The researchers developed a non-invasive probe that sends high-dimensional electrical pulses into a battery and measures the response, providing a series of ‘biomarkers’ of battery health. This method is gentle on the battery and doesn’t cause it to degrade any further.

The electrical signals from the battery were converted into a description of the battery’s state, which was fed into a machine learning algorithm. The algorithm was able to predict how the battery would respond in the next charge-discharge cycle, depending on how quickly the battery was charged and how fast the car would be going the next time it was on the road. Tests with 88 commercial batteries showed that the algorithm did not require any information about previous usage of the battery to make an accurate prediction.
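A minimal sketch of the modelling idea – "biomarker" features extracted from the probe response mapped to next-cycle behaviour – is shown below, with synthetic data and a plain linear least-squares fit standing in for the paper's richer model and real measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the measured pulse responses: each row is a
# vector of impedance "biomarkers" for one cell (88 cells, as tested).
n_cells, n_biomarkers = 88, 8
X = rng.normal(size=(n_cells, n_biomarkers))

# Hypothetical ground truth: next-cycle capacity depends linearly on the
# biomarkers plus a little measurement noise (purely illustrative).
true_w = rng.normal(size=n_biomarkers)
y = X @ true_w + 0.01 * rng.normal(size=n_cells)

# Fit by least squares. Note no usage history is needed – only the
# current probe response, mirroring the paper's key result.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ w
rms_error = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Because the features summarise the battery's present state, the fitted model can forecast the next charge-discharge cycle without knowing how the cell was used before.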

The experiment focused on lithium cobalt oxide (LCO) cells, which are widely used in rechargeable batteries, but the method is generalisable across the different types of battery chemistries used in electric vehicles today.

“This method could unlock value in so many parts of the supply chain, whether you’re a manufacturer, an end user, or a recycler, because it allows us to capture the health of the battery beyond a single number, and because it’s predictive,” said Lee. “It could reduce the time it takes to develop new types of batteries, because we’ll be able to predict how they will degrade under different operating conditions.”

The researchers say that in addition to manufacturers and drivers, their method could be useful for businesses that operate large fleets of electric vehicles, such as logistics companies. “The framework we’ve developed could help companies optimise how they use their vehicles to improve the overall battery life of the fleet,” said Lee. “There’s so much potential with a framework like this.”

“It’s been such an exciting framework to build because it could solve so many of the challenges in the battery field today,” said Jones. “It’s a great time to be involved in the field of battery research, which is so important in helping address climate change by transitioning away from fossil fuels.”

The researchers are now working with battery manufacturers to accelerate the development of safer, longer-lasting next-generation batteries. They are also exploring how their framework could be used to develop optimal fast charging protocols to reduce electric vehicle charging times without causing degradation.

The research was supported by the Winton Programme for the Physics of Sustainability, the Ernest Oppenheimer Fund, The Alan Turing Institute and the Royal Society.


Reference:
Penelope K. Jones, Ulrich Stimming & Alpha A. Lee. ‘Impedance-based forecasting of lithium-ion battery performance amid uneven usage.’ Nature Communications (2022). DOI: 10.1038/s41467-022-32422-w

Researchers have developed a machine learning algorithm that could help reduce charging times and prolong battery life in electric vehicles by predicting how different driving patterns affect battery performance, improving safety and reliability.

This method could unlock value in so many parts of the supply chain, whether you’re a manufacturer, an end user, or a recycler, because it allows us to capture the health of the battery beyond a single number
Alpha Lee
People charging their electric cars at charging station in York

Scientists develop new method to assess ozone layer recovery

View of Earth from 40,000 feet

Published in the journal Nature, their method – the Integrated Ozone Depletion (IOD) metric – provides a useful tool for policymakers and scientists.

The IOD has been designed to provide a straightforward way to measure the effects of unregulated emissions of substances that deplete the ozone layer, and evaluate how effective ozone layer protection measures are.

The ozone layer is found in a region of the Earth’s atmosphere known as the stratosphere, and acts as an important protective barrier against most of the sun’s harmful ultraviolet rays.

Ozone-depleting gases such as chlorofluorocarbons, better known as CFCs, have been phased out under the Montreal Protocol – an international treaty agreed to protect the ozone layer.

The Montreal Protocol has been largely successful, but illegal breaches are jeopardising its efficacy.

The IOD indicates the impact of any new emissions on the ozone layer by considering three things: the strength of the emission, how long it will remain in the atmosphere, and how much ozone is chemically destroyed by it. 

For environmental protection and human health policies, the IOD represents a simple means of calculating the impact of any given emission scenario on ozone recovery. 

This new metric has been developed by researchers at the National Centre for Atmospheric Science at the University of Cambridge and the National Centre for Earth Observation at the University of Leeds.

Professor John Pyle, from the National Centre for Atmospheric Science and the University of Cambridge, has dedicated his career to studying the depletion of ozone in the stratosphere and helping develop the Montreal Protocol. He is the lead author of the current Nature paper.

“Following the Montreal Protocol, we are now in a new phase - assessing the recovery of the ozone layer,” said Pyle, from Cambridge’s Yusuf Hamied Department of Chemistry. “This new phase calls for new metrics, like the Integrated Ozone Depletion - which we refer to as the IOD. Our new metric can measure the impact of emissions - regardless of their size. Using an atmospheric chemistry computer model, we have been able to demonstrate a simple linear relationship between the IOD, the size of the emissions and the chemical lifetimes. So, with knowledge of the lifetimes, it is a simple matter to calculate the IOD, making this an excellent metric both for science and policy.”
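The linear relationship Pyle describes can be shown schematically: in the sketch below, IOD simply scales with emission size and chemical lifetime, with an arbitrary constant rather than a value calibrated against the UKCA model.

```python
# Schematic only: the paper reports a simple linear relationship between
# IOD, emission size and chemical lifetime. The constant k here is
# arbitrary and illustrative, not a calibrated value.
def iod(emission, lifetime, k=1.0):
    """Integrated ozone depletion for a given emission and lifetime."""
    return k * emission * lifetime

baseline = iod(emission=10.0, lifetime=50.0)
doubled_emission = iod(emission=20.0, lifetime=50.0)
doubled_lifetime = iod(emission=10.0, lifetime=100.0)
# Doubling either the emission or the lifetime doubles the IOD.
```

This linearity is what makes the metric convenient for policy use: once a substance's lifetime is known, the ozone impact of any emission scenario follows by simple scaling.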

“The Montreal Protocol is successfully protecting the ozone layer, but there is increasing evidence to suggest the ozone hole is recovering slower than expected. The IOD will be very useful for monitoring ozone recovery, and especially relevant to regulators who need to phase out substances with the potential to chemically destroy ozone.”

The IOD metric has been created using a computer model of the atmosphere, called the UK Chemistry and Aerosols model (UKCA). The National Centre for Atmospheric Science and the Met Office developed the UKCA model to calculate future projections of important chemicals, such as ozone in the stratosphere.

“We have used the UKCA model to develop the IOD metric, which will enable us to estimate the effect of any new illegal or unregulated emissions on the ozone layer. In the UKCA model we can perform experiments with different types and concentrations of CFCs, and other ozone-depleting substances,” said co-author Dr Luke Abraham, also from the University of Cambridge. “We can estimate how chemicals in the atmosphere will change in the future, and assess their impact on the ozone layer over the coming century.”

Reference:
John A. Pyle et al. ‘Integrated ozone depletion as a metric for ozone recovery.’ Nature (2022). DOI: 10.1038/s41586-022-04968-8

Researchers have developed a new method for assessing the impacts of ozone-destroying substances that threaten the recovery of the ozone layer. 

The Montreal Protocol is successfully protecting the ozone layer, but there is increasing evidence to suggest the ozone hole is recovering slower than expected
John Pyle
View of Earth from 40,000 feet

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Robots can be used to assess children’s mental wellbeing, study suggests

Robot shaking hands with Dr Micol Spitale

A team of roboticists, computer scientists and psychiatrists from the University of Cambridge carried out a study with 28 children between the ages of eight and 13, and had a child-sized humanoid robot administer a series of standard psychological questionnaires to assess the mental wellbeing of each participant.

The children were willing to confide in the robot, in some cases sharing information with the robot that they had not yet shared via the standard assessment method of online or in-person questionnaires. This is the first time that robots have been used to assess mental wellbeing in children.

The researchers say that robots could be a useful addition to traditional methods of mental health assessment, although they are not intended to be a substitute for professional mental health support. The results will be presented today at the 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) in Naples, Italy.

During the COVID-19 pandemic, home schooling, financial pressures, and isolation from peers and friends impacted the mental health of many children. Even before the pandemic, however, anxiety and depression among children in the UK had been on the rise, but the resources and support to address mental wellbeing are severely limited.

Professor Hatice Gunes, who leads the Affective Intelligence and Robotics Laboratory in Cambridge’s Department of Computer Science and Technology, has been studying how socially-assistive robots (SARs) can be used as mental wellbeing ‘coaches’ for adults, but in recent years has also been studying how they may be beneficial to children.

“After I became a mother, I was much more interested in how children express themselves as they grow, and how that might overlap with my work in robotics,” said Gunes. “Children are quite tactile, and they’re drawn to technology. If they’re using a screen-based tool, they’re withdrawn from the physical world. But robots are perfect because they’re in the physical world – they’re more interactive, so the children are more engaged.”

With colleagues in Cambridge’s Department of Psychiatry, Gunes and her team designed an experiment to see if robots could be a useful tool to assess mental wellbeing in children.

“There are times when traditional methods aren’t able to catch mental wellbeing lapses in children, as sometimes the changes are incredibly subtle,” said Nida Itrat Abbasi, the study’s first author. “We wanted to see whether robots might be able to help with this process.”

For the study, 28 participants between ages eight and 13 each took part in a one-to-one 45-minute session with a Nao robot – a humanoid robot about 60 centimetres tall. A parent or guardian, along with members of the research team, observed from an adjacent room. Prior to each session, children and their parent or guardian completed a standard online questionnaire to assess each child’s mental wellbeing.

During each session, the robot performed four different tasks: 1) asked open-ended questions about happy and sad memories over the last week; 2) administered the Short Mood and Feelings Questionnaire (SMFQ); 3) administered a picture task inspired by the Children’s Apperception Test (CAT), where children are asked to answer questions related to pictures shown; and 4) administered the Revised Children’s Anxiety and Depression Scale (RCADS) for generalised anxiety, panic disorder and low mood.

Children were divided into three different groups following the SMFQ, according to how likely they were to be struggling with their mental wellbeing. Participants interacted with the robot throughout the session by speaking with it, or by touching sensors on the robot’s hands and feet. Additional sensors tracked participants’ heartbeat, head and eye movements during the session.

Study participants all said they enjoyed talking with the robot: some shared information with the robot that they hadn’t shared either in person or on the online questionnaire.

The researchers found that children with varying levels of wellbeing concerns interacted differently with the robot. For children who might not be experiencing mental wellbeing-related problems, the researchers found that interacting with the robot led to more positive response ratings to the questionnaires. However, for children who might be experiencing wellbeing-related concerns, the robot may have enabled them to divulge their true feelings and experiences, leading to more negative response ratings to the questionnaire.

“Since the robot we use is child-sized, and completely non-threatening, children might see the robot as a confidante – they feel like they won’t get into trouble if they share secrets with it,” said Abbasi. “Other researchers have found that children are more likely to divulge private information – like that they’re being bullied, for example – to a robot than they would be to an adult.”

The researchers say that while their results show that robots could be a useful tool for psychological assessment of children, they are not a substitute for human interaction.

“We don’t have any intention of replacing psychologists or other mental health professionals with robots, since their expertise far surpasses anything a robot can do,” said co-author Dr Micol Spitale. “However, our work suggests that robots could be a useful tool in helping children to open up and share things they might not be comfortable sharing at first.”

The researchers say that they hope to expand their survey in future, by including more participants and following them over time. They are also investigating whether similar results could be achieved if children interact with the robot via video chat.

The research was supported in part by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and NIHR Cambridge Biomedical Research Centre. Hatice Gunes is a Fellow of Trinity Hall, Cambridge. 

Reference:
Nida Itrat Abbasi et al. ‘Can Robots Help in the Evaluation of Mental Wellbeing in Children? An Empirical Study.’ Paper presented to the 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Naples, Italy, 29 August – 2 September 2022.

Robots can be better at detecting mental wellbeing issues in children than parent-reported or self-reported testing, a new study suggests.

Children might see the robot as a confidante – they feel like they won’t get into trouble if they share secrets with it
Nida Itrat Abbasi
Nao robot shaking hands with study co-author Dr Micol Spitale


Cannabis users no less likely to be motivated or able to enjoy life’s pleasure

Female hands rolling a marijuana joint

Cannabis users also show no difference in motivation for rewards, pleasure taken from rewards, or the brain’s response when seeking rewards, compared to non-users.

Cannabis is the third most commonly used controlled substance worldwide, after alcohol and nicotine. A 2018 report from the NHS Digital Lifestyles Team stated that almost one in five (19%) of 15-year-olds in England had used cannabis in the previous 12 months, while in 2020 the National Institute on Drug Abuse reported the proportion in the United States to be 28% of 15-16-year-olds.

A common stereotype of cannabis users is the ‘stoner’ – think Jesse Pinkman in Breaking Bad, The Dude in The Big Lebowski, or, more recently, Argyle in Stranger Things. These are individuals who are generally depicted as lazy and apathetic.

At the same time, there has been considerable concern about the potential impact of cannabis use on the developing brain, and that using cannabis during adolescence might have a damaging effect at an important time in an individual’s life.

A team led by scientists at UCL, the University of Cambridge and the Institute of Psychiatry, Psychology & Neuroscience at King’s College London carried out a study examining whether cannabis users show higher levels of apathy (loss of motivation) and anhedonia (loss of interest in or pleasure from rewards) when compared to controls and whether they were less willing to exert physical effort to receive a reward. The research was part of the CannTEEN study.

The results are published in the International Journal of Neuropsychopharmacology.

The team recruited 274 adolescent and adult cannabis users who had used cannabis at least weekly over the past three months, with an average of four days per week, and matched them with non-users of the same age and gender.

Participants completed questionnaires to measure anhedonia, asking them to rate statements such as “I would enjoy being with family or close friends”. They also completed questionnaires to measure their levels of apathy, which asked them to rate characteristics such as how interested they were in learning new things or how likely they were to see a job through to the end.

Cannabis users scored slightly lower than non-users on anhedonia – in other words, they appeared better able to enjoy themselves – but there was no significant difference when it came to apathy. The researchers also found no link between frequency of cannabis use and either apathy or anhedonia in the people who used cannabis.

Martine Skumlien, a PhD candidate in the Department of Psychiatry at the University of Cambridge, said: “We were surprised to see that there was really very little difference between cannabis users and non-users when it came to lack of motivation or lack of enjoyment, even among those who used cannabis every day. This is contrary to the stereotypical portrayal we see on TV and in movies.”

In general, adolescents tended to score higher than adults on anhedonia and apathy in both user and non-user groups, but cannabis use did not augment this difference.

Dr Will Lawn, from the Institute of Psychiatry, Psychology and Neuroscience at King’s College London, said: “There’s been a lot of concern that cannabis use in adolescence might lead to worse outcomes than cannabis use during adulthood. But our study, one of the first to directly compare adolescents and adults who use cannabis, suggests that adolescents are no more vulnerable than adults to the harmful effects of cannabis on motivation, the experience of pleasure, or the brain’s response to reward.

“In fact, it seems cannabis may have no link – or at most only weak associations – with these outcomes in general. However, we need studies that look for these associations over a long period of time to confirm these findings.”

Just over half of participants also carried out a number of behavioural tasks. The first of these assessed physical effort. Participants were given the option to perform button-presses in order to win points, which were later exchanged for chocolates or sweets to take home. There were three difficulty levels and three reward levels; more difficult trials required faster button pressing. On each trial the participant could choose to accept or reject the offer; points were only accrued if the trial was accepted and completed.

In a second task, measuring how much pleasure they received from rewards, participants were first told to estimate how much they wanted to receive each of three rewards (30 seconds of one of their favourite songs, one piece of chocolate or a sweet, and a £1 coin) on a scale from ‘do not want at all’ to ‘intensely want’. They then received each reward in turn and were asked to rate how pleasurable they found them on a scale from ‘do not like at all’ to ‘intensely like’.

The researchers found no difference between users and non-users or between age groups on either the physical effort task or the real reward pleasure task, confirming evidence from other studies that found no, or very little, difference.

Skumlien added: “We’re so used to seeing ‘lazy stoners’ on our screens that we don’t stop to ask whether they’re an accurate representation of cannabis users. Our work implies that this is in itself a lazy stereotype, and that people who use cannabis are no more likely to lack motivation or be lazier than people who don’t.

“Unfair assumptions can be stigmatising and could get in the way of messages around harm reduction. We need to be honest and frank about what are and are not the harmful consequences of drug use.”

Earlier this year, the team published a study that used functional magnetic resonance imaging (fMRI) to look at brain activity in the same participants as they took part in a brain imaging task measuring reward processing. The task involved participants viewing orange or blue squares while in the scanner. The orange squares would lead to a monetary reward, after a delay, if the participant made a response.

The researchers used this set-up to investigate how the brain responds to rewards, focusing in particular on the ventral striatum, a key region in the brain’s reward system. They found no relationship between activity in this region and cannabis use, suggesting that cannabis users had similar reward systems as non-users.

Professor Barbara Sahakian, from the Department of Psychiatry at the University of Cambridge, said: “Our evidence indicates that cannabis use does not appear to have an effect on motivation for recreational users. The participants in our study included users who took cannabis on average four days a week and they were no more likely to lack motivation. However, we cannot rule out the possibility that greater use, as seen in some people with cannabis-use disorder, has an effect.

“Until we have future research studies that follow adolescent users, starting from onset through to young adulthood, and which combine measures of motivation and brain imaging, we cannot determine for certain that regular cannabis use won’t negatively impact motivation and the developing brain.”

This research was funded by the Medical Research Council with additional support from the Aker Foundation, National Institute for Health Research and Wellcome.

References

Skumlien, M, et al. Anhedonia, apathy, pleasure, and effort-based decision-making in adult and adolescent cannabis users and controls. IJNP; 24 Aug 2022; DOI: 10.1093/ijnp/pyac056

Skumlien, M, et al. Neural responses to reward anticipation and feedback in adult and adolescent cannabis users and controls. Neuropsychopharmacology; 6 April 2022; DOI: 10.1038/s41386-022-01316-2

Adult and adolescent cannabis users are no more likely than non-users to lack motivation or be unable to enjoy life’s pleasure, new research has shown, suggesting there is no scientific basis for the stereotype often portrayed in the media.

We’re so used to seeing ‘lazy stoners’ on our screens that we don’t stop to ask whether they’re an accurate representation of cannabis users. Our work implies that this is in itself a lazy stereotype
Martine Skumlien
Female hands rolling a marijuana joint


Cambridge Biomedical Campus celebrates 60 years with £2bn boost to UK economy

Aerial shot of Cambridge Biomedical Campus

The key findings of the report are:

  • The campus supported an aggregate economic footprint of £2.2 billion worth of Gross Value Added to the UK economy. As well as being the largest employment site in Cambridge, it supports over 15,000 additional roles across the regional supply chain and local businesses.
  • For every 10 jobs directly generated by organisations on the CBC, a further 2.7 jobs are supported within Cambridge City and South Cambridgeshire; one in every six jobs in the local authority areas are either directly or indirectly supported by the campus.
  • Employment on site is growing much faster than in the rest of the UK, and £721m is spent by employees across the regional economy.

Looking at the wider economic picture, the research highlights that in 2021 the site reported a collaborative operating income of £1.9 billion, as well as contributing £291 million to the Exchequer through tax revenues.

In addition to the new report, a series of events are planned to celebrate the success of the campus spanning 60 years and to tell more stories about the globally significant research that goes on.

Dr Kristin-Anne Rutter, Executive Director at Cambridge Biomedical Campus, said: “The economic impact report for the first time demonstrates the importance of the Cambridge Biomedical Campus to the region, and to the thousands of people who work here and rely on the organisations, whether it’s as a patient or someone working on the site. The success we have on the site is not just limited to improved healthcare and treatments for patients – we generate jobs and income for businesses across Cambridge and the East of England. We do this through collaboration, with research, industry and the NHS working together to drive innovation which is then shared.

“The report is an important milestone, so too is our 60th anniversary, and throughout September we’ll be highlighting some of the amazing developments and ideas which have happened since Addenbrooke’s Hospital and the MRC Laboratory of Molecular Biology arrived on the Hills Road site. We’ll be sharing how the campus has grown and how science is taken from laboratories into hospitals, to diagnose and treat patients with world-leading innovative healthcare.”

Alongside the major economic impact of CBC, the research that takes place on the campus has very real and direct healthcare benefits, fuelled by innovation and discoveries that sit at the very forefront of life sciences technology and knowhow.

Read the report

A new economic impact report details the financial contributions of the Cambridge Biomedical Campus (CBC), which celebrates its 60th anniversary this autumn. The independent report by the Centre for Economics and Business Research (Cebr) for the first time calculates the economic benefits of CBC and also highlights the health and research benefits for the region.

The success we have on the site is not just limited to improved healthcare and treatments for patients – we generate jobs and income for businesses across Cambridge and the East of England
Kristin-Anne Rutter
Aerial shot of Cambridge Biomedical Campus
Life-saving treatments

Below are some case studies of life-saving treatments, discovered and developed at CBC, from which patients have already benefited or are set to benefit – treatments with the potential to change the lives of people across the world.

Cytosponge: A ‘sponge on a string’ test to detect oesophageal cancer

Around 9,100 people are diagnosed with oesophageal cancer each year in the UK. A big challenge with this type of cancer is that many people don’t realise there’s a problem until they start to have trouble swallowing. Often, these symptoms aren’t recognisable until a later stage in the disease.

But there may be an opportunity to detect the disease earlier. Some people first develop a condition – called Barrett’s oesophagus – that can progress into cancer. Barrett’s oesophagus is much more common than oesophageal cancer, and although it will only become cancer in a handful of cases, it presents an opportunity for doctors to spot a problem early and intervene before cancer develops. But the typical test for Barrett’s oesophagus, endoscopy, is both invasive and expensive.

Enter the Cytosponge. The Cytosponge-TFF3 test is a ‘sponge on a string’ device coupled with a laboratory test called TFF3, developed by scientists funded by the Medical Research Council (MRC) and Cancer Research UK – a simple, quick and affordable test for Barrett’s oesophagus that can be done in a GP surgery.

Read more

Ethanol breath biopsy clinical trial for early lung cancer detection

A new clinical trial has launched at Royal Papworth Hospital in Cambridge which is using ethanol (an alcohol) detected in exhaled breath as a potential tool to diagnose lung cancer earlier. The EVOLUTION trial is recruiting patients who definitely have lung cancer and healthy volunteers who definitely do not.

A liquid solution containing a metabolic probe is administered intravenously and travels around the body; when it reacts with a lung tumour, it causes the release of ethanol. After a set amount of time, patients breathe at regular intervals into a special mask, which collects the ethanol for analysis in the laboratory. The eVOC (Exogenous Volatile Organic Compound) probe has been developed by Cambridge company Owlstone Medical, who have collaborated with Royal Papworth Hospital’s thoracic oncology research team on previous breath biopsy studies.

Changing the future of ovarian cancer

Each year, about 7,500 women in the UK are diagnosed with ovarian cancer, and around 5,000 will have the most aggressive form of the disease. The cure rate for women with ovarian cancer is very low, despite new medicines coming into the clinic. Only 43% of women in England survive five years beyond their ovarian cancer diagnosis, compared with more than 80% of people for more common cancers, such as breast (85%) and prostate (87%). This is because the disease is often diagnosed late, treatment options are limited, and many women develop resistance to current therapies. Research by Professors James Brenton and Evis Sala, at the Cancer Research UK Cambridge Centre, aims to address this.

Read more

Life-changing artificial pancreas

An artificial pancreas developed by Cambridge researchers is helping protect very young children with type 1 diabetes at a particularly vulnerable time of their lives.

The artificial pancreas uses an algorithm - CamAPS FX – to determine the amount of insulin administered by a device worn by the child. It is available through a number of NHS trusts across the UK, including Cambridge University Hospitals NHS Foundation Trust, and the team hope it will soon be available even more widely.

Read more

Adapted from a press release by Cambridge University Health Partners
