
    A second study from the team suggests that a drug used to treat paracetamol overdose may be able to help individuals who want to break their addiction and stop their damaging cocaine seeking habits.

    Although both studies were carried out in rats, the researchers believe the findings will be relevant to humans.

    Cocaine is a stimulant drug that can lead to addiction when taken repeatedly. Quitting can be extremely difficult for some people: around four in ten individuals who relapse report having experienced a craving for the drug – but this also means that six in ten relapse for reasons other than ‘needing’ it.

    “Most people who use cocaine do so initially in search of a hedonic ‘high’,” explains Dr David Belin from the Department of Pharmacology at the University of Cambridge. “In some individuals, though, frequent use leads to addiction, where use of the drug is no longer voluntary, but ultimately becomes a compulsion. We wanted to understand why this should be the case.”

    Drug-taking triggers the release in the brain of the chemical dopamine, which helps provide the ‘high’ experienced by the user. Initially the drug-taking is volitional – in other words, it is the individual’s choice to take the drug – but over time it becomes habitual, beyond their control.

    Previous research by Professor Barry Everitt from the Department of Psychology at Cambridge showed that when rats were allowed to self-administer cocaine, dopamine-related activity occurred initially in an area of the brain known as the nucleus accumbens, which plays a significant role driving ‘goal-directed’ behaviour, as the rats sought out the drug. However, if the rats were given cocaine over an extended period, this activity transferred to the dorsolateral striatum, which plays an important role in habitual behaviour, suggesting that the rats were no longer in control, but rather were responding automatically, having developed a drug-taking habit.

    The brain mechanisms underlying the balance between goal-directed and habitual behaviour involve the prefrontal cortex, the brain region that orchestrates our behaviour. It was previously thought that this region was overwhelmed by stimuli associated with the drugs, or by the craving experienced during withdrawal; however, this does not easily explain why the majority of individuals relapsing to drug use do not experience any craving.

    Chronic exposure to drugs alters not only the prefrontal cortex but also an area of the brain called the basolateral amygdala, which is associated with the link between a stimulus and an emotion. The basolateral amygdala stores the pleasurable memories associated with cocaine, but the prefrontal cortex manipulates this information, helping an individual weigh up whether or not to take the drug: if an addicted individual takes the drug, this activates mechanisms in the dorsal striatum.

    However, in a study published today in the journal Nature Communications, Dr Belin and Professor Everitt studied the brains of rats addicted to cocaine through self-administration of the drug and identified a previously unknown pathway within the brain that links impulse with habits.

    The pathway links the basolateral amygdala indirectly with the dorsolateral striatum, circumventing the prefrontal cortex. This means that an addicted individual would not necessarily be aware of their desire to take the drug.

    “We’ve always assumed that addiction occurs through a failure of our self-control, but now we know this is not necessarily the case,” explains Dr Belin. “We’ve found a back door directly to habitual behaviour.

    “Drug addiction is mainly viewed as a psychiatric disorder, with treatments such as cognitive behavioural therapy focused on restoring the ability of the prefrontal cortex to control the otherwise maladaptive drug use. But we’ve shown that the prefrontal cortex is not always aware of what is happening, suggesting these treatments may not always be effective.”

    In a second study, published in the journal Biological Psychiatry, Dr Belin and colleagues showed that a drug used to treat paracetamol overdose may be able to help individuals addicted to cocaine overcome their addiction – provided the individual wants to quit.

    The drug, N-acetylcysteine, had previously been shown in rat studies to prevent relapse. It later failed human clinical trials, though analysis suggested that while it did not lead addicted individuals to stop using cocaine, it did help those who were trying to abstain to refrain from taking the drug.

    Dr Belin and colleagues used an experiment in which rats compulsively self-administered cocaine. They found that rats given N-acetylcysteine lost the motivation to self-administer cocaine more quickly than rats given a placebo. In fact, when they had stopped working for cocaine, they tended to relapse at a lower rate. N-acetylcysteine also increased the activity in the brain of a particular gene associated with plasticity – the ability of the brain to adapt and learn new skills.

    “A hallmark of addiction is that the user continues to take the drug even in the face of negative consequences – such as on their health, their family and friends, their job, and so on,” says co-author Mickael Puaud from the Department of Pharmacology of the University of Cambridge. “Our study suggests that N-acetylcysteine, a drug that we know is well tolerated and safe, may help individuals who want to quit to do so.”

    Murray, JE et al. Basolateral and central amygdala differentially recruit and maintain dorsolateral striatum-dependent cocaine-seeking habits. Nature Communications; 16 December 2015

    Ducret, E et al. N-acetylcysteine facilitates self-imposed abstinence after escalation of cocaine intake. Biological Psychiatry; 7 October 2015

    Individuals addicted to cocaine may have difficulty in controlling their addiction because of a previously-unknown ‘back door’ into the brain, circumventing their self-control, suggests a new study led by the University of Cambridge.


    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    Archaeologists have revealed exceptionally well-preserved Bronze Age dwellings during an excavation at Must Farm quarry in the East Anglian fens that is providing an extraordinary insight into domestic life 3,000 years ago. The settlement, dating to the end of the Bronze Age (1200-800 BC), would have been home to several families who lived in a number of wooden houses on stilts above water.

    The settlement was destroyed by fire that caused the dwellings to collapse into the river, preserving the contents in situ. The result is an extraordinary time capsule containing exceptional textiles made from plant fibres such as lime tree bark, rare small cups, bowls and jars complete with past meals still inside. Also found are exotic glass beads forming part of an elaborate necklace, hinting at a sophistication not usually associated with the British Bronze Age.

    The exposed structures are believed to be the best-preserved Bronze Age dwellings ever found in Britain and the finds, taken together, provide a fuller picture of prehistoric life than we have ever had before.

    The major excavation is happening because of concern about the long-term preservation of this unique Bronze Age site with its extraordinary remains. The Cambridge Archaeological Unit (CAU) is carrying out the excavation of 1,100 square metres of the Must Farm site in Cambridgeshire, and is now halfway through the project.

    The excavation site is two metres below the modern ground surface, as levels have risen over thousands of years, and archaeologists have now reached the river bed as it was in 1000-800 BC. Clearly visible are the well-preserved charred roof timbers of one of the roundhouses, timbers with tool marks, and a perimeter of wooden posts known as a palisade which once enclosed the site.

    It is possible that those living in the settlement were forced to leave everything behind when it caught fire. Such is the level of preservation in the deep waterlogged sediments of the Fens that even the footprints of those who once lived there were found. The finds suggest there is much more to be discovered in the rest of the settlement as the excavation continues over the coming months.

    CAU’s Mark Knight, Site Director of the excavation, said: “Must Farm is the first large-scale investigation of the deeply buried sediments of the fens, and we are uncovering the perfectly preserved remains of a prehistoric settlement. Everything suggests the site is not a one-off but in fact presents a template of an undiscovered community that thrived 3,000 years ago ‘beneath’ Britain’s largest wetland.”

    The £1.1 million four-year project has been funded by heritage organisation Historic England and the building firm Forterra. Duncan Wilson, Chief Executive of Historic England, said: “A dramatic fire 3,000 years ago combined with subsequent waterlogged preservation has left to us a frozen moment in time, which gives us a graphic picture of life in the Bronze Age.”

    After the excavation is complete, the team will take all the finds for further analysis and conservation. Eventually they will be displayed at Peterborough Museum and at other local venues. The end of the four year project will see a major publication about Must Farm and an online resource detailing the finds.

    The site, now a clay quarry owned by Forterra, is close to Whittlesey, Cambridgeshire and sits astride a prehistoric watercourse inside the Flag Fen basin. The site has produced large quantities of Bronze Age metalwork, including a rapier and sword in 1969, and more recently the discovery of nine pristinely preserved log boats in 2011.

    Archaeologists say these discoveries place Must Farm alongside similar European prehistoric wetland sites: the ancient loch-side dwellings known as crannogs in Scotland and Ireland; stilt houses, also known as pile dwellings, around the Alpine lakes; and the terps of Friesland, manmade hill dwellings in the Netherlands.

    David Gibson, Archaeological Manager at CAU, added: “Usually at a Later Bronze Age period site you get pits, post-holes and maybe one or two really exciting metal finds. Convincing people that such places were once thriving settlements takes some imagination.

    “But this time so much more has been preserved – we can actually see everyday life during the Bronze Age in the round. It’s prehistoric archaeology in 3D with an unsurpassed finds assemblage both in terms of range and quantity,” he said. 

    For a more detailed summary of the Must Farm discoveries, visit the project archive here: 

    Large circular wooden houses built on stilts collapsed in a dramatic fire 3,000 years ago and plunged into a river, preserving their contents in astonishing detail. Archaeologists say the excavations have revealed the best-preserved Bronze Age dwellings ever found in Britain.  




    While efforts to limit emissions of greenhouse gases, including ozone, tend to focus on industrial activities and the burning of fossil fuels, a new study suggests that future regulations may need to address the burning of forests and vegetation. The study, published in the journal Nature Communications, indicates that ‘biomass burning’ may play a larger role in climate change than previously realised.

    Based on observations from two aircraft missions, satellite data and a variety of models, an international research team showed that fires burning in tropical Africa and Southeast Asia caused pockets of high ozone and low water in the lower atmosphere above Guam – a remote island in the Pacific Ocean 1,700 miles east of Taiwan.

    “We were very surprised to find high concentrations of ozone and chemicals that we know are only emitted by fires in the air around Guam,” said the study’s lead author Daniel Anderson, a graduate student at the University of Maryland. “We didn’t make specific flights to target high-ozone areas – they were so omnipresent that no matter where we flew, we found them.”

    For the study, two research planes on complementary missions flew over Guam measuring the levels of dozens of chemicals in the atmosphere in January and February 2014. One aircraft flew up to 24,000 feet above the ocean surface during the UK Natural Environment Research Council’s Coordinated Airborne Studies in the Tropics (CAST) mission. The other flew up to 48,000 feet above the ocean surface during the CONvective Transport of Active Species in the Tropics (CONTRAST) mission.

    “International collaboration is essential for studying global environmental issues these days,” said CAST Principal Investigator Neil Harris, of Cambridge’s Department of Chemistry. “This US/UK-led campaign over the western Pacific was the first of its kind in this region and collected a unique data set. The measurements are now starting to produce insight into how the composition of the remote tropical atmosphere is affected by human activities occurring nearly halfway around the world.”

    Researchers examined 17 CAST and 11 CONTRAST flights and compiled over 3,000 samples from high-ozone, low-water air parcels for the study. In the samples, the team detected high concentrations of chemicals associated with biomass burning: hydrogen cyanide, acetonitrile, benzene and ethyne.

    “Hydrogen cyanide and acetonitrile were the smoking guns because they are emitted almost exclusively by biomass burning. High levels of the other chemicals simply added further weight to the findings,” said study co-author Julie Nicely, a graduate student from the University of Maryland.

    Next, the researchers traced the polluted air parcels backward 10 days, using the National Oceanic and Atmospheric Administration (NOAA) Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model and precipitation data, to determine where they came from. Overlaying fire data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Terra satellite, the researchers connected nearly all of the high-ozone, low-water structures to regions with active biomass burning in tropical Africa and Southeast Asia.

    “The investigation utilised a variety of models, including the NCAR CAM-Chem model to forecast and later analyse chemical and dynamical conditions near Guam, as well as satellite data from numerous instruments that augmented the interpretation of the aircraft observations,” said study co-author Douglas Kinnison, a project scientist at the University Corporation for Atmospheric Research.

    In the paper, the researchers also offer a new explanation for the dry nature of the polluted air parcels.

    “Our results challenge the explanation atmospheric scientists commonly offer for pockets of high ozone and low water: that these zones result from the air having descended from the stratosphere where air is colder and dryer than elsewhere,” said University of Maryland Professor Ross Salawitch, the study’s senior author and principal investigator of CONTRAST.

    “We know that the polluted air did not mix with air in the stratosphere to dry out because we found combined elevated levels of carbon monoxide, nitric oxide and ozone in our air samples, but air in the higher stratosphere does not contain much naturally occurring carbon monoxide,” said Anderson.

    The researchers found that the polluted air that reached Guam never entered the stratosphere and instead simply dried out during its descent within the lower atmosphere. While textbooks show air moving upward in the tropics, according to Salawitch, this represents the net motion of air. Because this upward motion happens mostly within small storm systems, it must be balanced by air slowly descending, such as with these polluted parcels released from fires.

    Based on the results of this study, global climate models may need to be reassessed to include and correctly represent the impacts of biomass burning, deforestation and reforestation, according to Salawitch. Also, future studies such as NASA’s upcoming Atmospheric Tomography Mission will add to the data collected by CAST and CONTRAST to help obtain a clearer picture of our changing environment.

    In addition to those mentioned above, the study’s authors included UMD Department of Atmospheric and Oceanic Science Professor Russell Dickerson and Assistant Research Professor Timothy Canty; CAST co-principal investigator James Lee of the University of York; CONTRAST co-principal investigator Elliott Atlas of the University of Miami; and additional researchers from NASA; NOAA; the University of California, Irvine; the California Institute of Technology; the University of Manchester; the Institute of Physical Chemistry Rocasolano; and the National Research Council in Argentina.

    This research was supported by the Natural Environment Research Council, National Science Foundation, NASA, and National Oceanic and Atmospheric Administration.

    Daniel C. Anderson et al. ‘A pervasive role for biomass burning in tropical high ozone/low water structures’ Nature Communications (2016). DOI: 10.1038/ncomms10267. 

    Inset image: Air Tracking. Credit: Daniel Anderson

    Adapted from a University of Maryland press release. 

    Study indicates ‘biomass burning’ may play a larger role in climate change than previously realised.




    At a special Congregation in the Senate House, the Vice-Chancellor, Professor Sir Leszek Borysiewicz, will admit His Excellency, the Secretary-General of the United Nations, Ban Ki-moon, to an honorary doctorate.

    The honorary degree of Doctor of Law will be conferred on the Secretary-General in recognition of his humanitarian work, support for women’s rights and achievements in pursuit of global peace and security.

    After admission to his degree on 3 February, the Secretary-General will deliver a lecture to members of the University community.

    The honour comes in Mr Ban’s last year as Secretary-General of the United Nations, a position he has held since 1 January 2007.

    He took his bachelor's degree in international relations at Seoul National University before earning a master's degree in public administration at the Kennedy School of Government at Harvard University.

    At the time of his election as Secretary-General, Mr Ban was South Korea's Minister of Foreign Affairs and Trade.

    His 37 years of service with that Ministry included postings in New Delhi, Washington DC and Vienna, and a variety of portfolios, including foreign policy and national security. Mr Ban’s ties to the UN date back to 1975, when he worked for the Ministry's UN Division.

    His extensive diplomatic efforts have helped to put climate change at the forefront of the global agenda, while his notable support of women’s rights, including backing for many high-profile campaigns, has seen the creation of the agency UN Women, plus the establishment of a new Special Representative on Sexual Violence in Conflict.

    Throughout his time at the UN, he has driven forward improvements to the UN peacekeeping operations. Accountability for violations of human rights has received high-level attention through inquiries related to Gaza, Guinea, Pakistan and Sri Lanka, legal processes in Lebanon and Cambodia, and advocacy for the "responsibility to protect," the new UN norm aimed at preventing or halting genocide and other grave crimes.

    He has also worked to strengthen humanitarian response in the aftermath of disasters in Myanmar (2008), Haiti (2010) and Pakistan (2010), and mobilised UN support for the democratic transitions in North Africa and the Middle East. Mr Ban has also increased efforts to rejuvenate the disarmament agenda.

    The honorary degree Congregation will be declared a 'scarlet day', when those holding doctorates wear their 'festal' gowns and all University members attending will wear academical dress.

    University and College buildings will fly flags to mark the occasion while the bells of the University Church ring out.

    The University has been conferring honorary degrees for some 500 years. One of the earliest recorded ceremonies was in 1493, when the University honoured the poet John Skelton. An honorary degree is the highest accolade the University can bestow.

    University’s highest honour conferred for ‘humanitarian work, support for women’s rights and achievements in pursuit of global peace and security’.



    High in the Italian Alps, thousands of stick-like images of people and animals, carved into rock surfaces, offer a tantalising window into the past. Archaeologists believe that the earliest of these 150,000 images date from the Neolithic but that most originate from the Iron Age. The UNESCO-protected ‘Pitoti’ (little puppets) of the Valcamonica valley extend over an area of some three square kilometres and have been described as one of the world’s largest pieces of anonymous art.

    An event taking place next Monday (18 January 2016) at Downing College, Cambridge, will give the public an opportunity to learn more about a fascinating project to explore and re-animate the Pitoti of Valcamonica. Displays and hands-on activities staged by seven of the institutions involved in the EU/European Research Council-funded ‘3D Pitoti’ digital heritage project will show visitors how archaeologists and film-makers have used the latest digital technology to explore an art form often portrayed as simplistic or primitive.

    The exhibitors from Austria, Italy, Germany and the UK will show that the thousands of Pitoti can be seen as “one big picture” as dozens of artists, over a period of some 4,000 years, added narratives to the giant ‘canvases’ formed by sandstone rocks scraped clean by the movement of glaciers across the landscape. The images are etched into the rock surfaces so that, as the sun rises and then falls in the sky, the figures can be seen to gain a sense of movement.

    Displays will introduce visitors to the scanning, machine learning and interactive 3D-visualisation technologies used by Bauhaus Weimar, Technical University Graz, and St Pölten University of Applied Sciences to record, analyse and breathe life into the Pitoti. Cambridge archaeologists Craig Alexander, Giovanna Bellandi and Christopher Chippindale have worked with Alberto Marretta and Markus Seidl to create Pitoti databases using Arctron’s Aspect 3D system.

    The scanned images of the Pitoti are stored in the rock-art research institute in Valcamonica, Centro Camuno di Studi Preistorici, and have given the project’s team an unprecedentedly rich resource to play with in exploring the power of graphic art in combination with other media.

    The 3D Pitoti team members attending next week’s event will engage with visitors, who will be given the chance to experience the scanner, UAV (unmanned aerial vehicle), computer sectioning, and the Pitoti Oculus Rift virtual reality experience, made possible by advanced imaging systems that are creating a new generation of ‘real’ images. The live demonstration of the interactive 3D Pitoti children’s app, developed by Archeocammuni and Nottingham University, is likely to prove popular with younger visitors, who will have the chance to handle the technology and ask questions.

    Also taking part in the event will be the renowned craftsperson Lida Cardozo Kindersley who will demonstrate the art of letter cutting as an intensely physical process.

    Archaeologists increasingly believe that the Valcamonica images may have been one element in a kind of ‘proto-cinema’ that might have involved other ‘special effects’. “When I first saw the Pitoti, my immediate thought was that these are frames for a film. Initially I envisaged an animated film but over time I’ve come to realise that the quality of colour, the play of light and shadow, and the texture of the rocks, make the Pitoti much more sophisticated than 2D animated graphics. That’s why we need to work in 3D,” says Cambridge archaeologist and film-maker Dr Frederick Baker, one of the founding participants in the project.

    “Many of the images at Valcamonica are contemporary with classical Greek art but are an under appreciated form of art. I believe that the Pitoti are an example of minimalism, an early precursor to work by Alberto Giacometti and Pablo Picasso. They can be just as powerful as the classical art of Athens and Rome in their own way. By showcasing our project in the neo-classical setting of Downing College, we are highlighting this clash of visual cultures and using the digital to raise the appreciation of what has been seen as ‘barbarian’ or ‘tribal’ art.”

    Members of the 3D Pitoti team captured thousands of images of people, sheep, deer, horses and dogs found on the Valcamonica rocks. The digitised images gave the project a ‘casting directory’ of thousands of ‘characters’ from which to create imagined narratives. The creation of moving images using pixels, or dots, echoes the making of the Pitoti, which were pecked out of the rock by people striking the surface with repeated blows to produce lines and shapes.

    Dr Sue Cobb, from The University of Nottingham, who led the international team of scientists, said: “Thanks to the 3D Pitoti project, archaeological sites and artefacts can be rendered in stunningly realistic computer-generated models and even 3D printed for posterity. Our tools will give more people online access to culturally-important heritage sites and negate the need to travel to the locations, which can be inaccessible or vulnerable to damage.

    “We overcame a number of technical challenges to innovate the technology, including developing a weatherproof, portable laser scanner to take detailed images of the Pitoti in situ in harsh, rugged terrain; using both a UAV and a glider to take aerial shots of the valley for the computer model; and processing huge masses of data to recreate an immersive, film-quality version of the site in 3D.

    “With our new story-telling app, users can scan and animate 3D Pitoti images to construct their own rock art stories from the thousands of fascinating human and animal figures discovered so far. The aim is to show public audiences that with archaeology there isn’t a single answer to the art’s meaning – there are theories and interpretations – and to teach the importance of the rock art as a biographical record of European history.”

    Next Monday’s event will include a test screening of a 15-minute 3D generated film called ‘Pitoti Prometheus’ which reimagines the story of Prometheus (who, according to legend, created men from clay) by animating digital images captured in Valcamonica. The fully finished film will be launched later in the year.

    The film’s 3D engineer Marcel Karnapke and film-maker Fred Baker (contributing via Skype) will take part in a discussion at the end of the day, enabling the audience to ask questions about the film and the unfolding of an ambitious project which breaks new boundaries in terms of European cross-disciplinary collaboration.

    “We use the word ‘pipeline’ to describe the process by which we’ve scanned and channelled the rock art images through time and space to bring them to mass audiences,” says Baker. “It’s a pipeline which stretches well beyond what we’ve produced and future technologies will undoubtedly open up new understandings of art forms that communicate so much about humanity and our relationships with each other, with the environment, and with imagined worlds.”

    Next Tuesday morning (19 January 2016), a series of talks and workshops, aimed primarily at academics, will take place at the McDonald Institute for Archaeological Research. The two days of events are the official culmination of the 3D Pitoti project. For details of Monday’s event, which is free of charge, go to

    Inset images: Michael Holzapfel (left) and Martin Schaich (right) (ArcTron/3-D Pitoti with permission of Marc Steinmetz/VISUM); Pitoti Prometheus event poster.

    An event next Monday (18 January 2016) will give the public a chance to experience at first hand the technologies that have enabled archaeologists to create 3D visualisations of images etched into rock thousands of years ago. The day-long event is free and open to all.




    On Wednesday 6 February, groups of enthusiastic children aged 4-11 at Icknield Primary School were taken to infinity and beyond during a day of classes celebrating all things space. The day included workshops about space and astronauts, games, a hands-on demonstration of a real space suit (including how to go to the toilet in it), making models of space suits and helmets, and building model spaceships.

    The Icknield space day, which is just the beginning of a programme of space-themed events at the School, is linked to the Space to Earth Challenge and the Mission X: Train Like an Astronaut challenge – a call to action for schools to align their teaching activities with the 400km distance between the International Space Station (ISS) and Earth, and between the Earth and the Moon – as part of the UK Space Agency’s package of activities for the Principia mission.

    Over the next few months, teachers at the School will incorporate space-themed physics, technology, mathematics, PE and design activities into daily teaching, coinciding with the six-month European Space Agency (ESA) mission to the ISS by British astronaut Tim Peake. Tim is the first British ESA astronaut to live and work on the ISS. The mission is intended to inspire children and young people to explore the world around them, and to engage more fully with science, technology, engineering and maths (STEM) subjects.

    Dr Helen Mason OBE, from the University of Cambridge's Department of Applied Mathematics and Theoretical Physics and a Fellow of St Edmund's College, and Heather MacRae (venturethinking), who is leading the Space to Earth Challenge, spent the day at Icknield School. Throughout the day, Helen and Heather participated in every lesson, taking the excited children on a journey through space and answering questions from the ever-inquisitive young minds, such as:

    “What are the chances of there being life on other planets?”

    “Could a black hole suck up the space station?”

    “What would happen if the space suit had a hole while the astronaut was space walking?”

    Speaking about the day, Helen said: “We need to get children interested and excited about science at an early age and Tim Peake’s mission is a great opportunity to do this. The mission is called Principia, named after the book by Sir Isaac Newton who studied at the University of Cambridge. So this is a fantastic chance for local schools to get involved in a range of fantastic activities.”

    Eleven-year-old Karla Bolton, who wore the space suit, said: “It’s a lot heavier than it looks and you get really warm inside it. It makes you realise how hard it must be to wear it a lot. The best bit of the whole day was trying on the suit. I also found it very interesting learning how small the Earth is compared to the Sun. I’m now really looking forward to watching more stories about Tim in the news.”

    Gregor James, also 11 years old, agreed: “I’m really excited to watch more about Tim’s mission too. We’ve done lots of things on space today and on other days, asking lots of questions. It’s made me more interested in doing science when I go to high school.”

    Icknield School Deputy Head, Tom Snowdon, who is coordinating activities at the School, has noticed the children’s heightened enthusiasm for science over the last couple of months since they started the Space to Earth Challenge.

    He said, “We are part of the Tim Peake Primary Project supported by the European Space Education Resource Office.  We’ve been working with Heather and Helen since November and the children are really starting to connect with the ideas, and to think much more about space and science.

    “Over the Christmas holidays, lots of the children designed and made space helmets that we have been sharing and using for the basis of science lesson work. These have clearly made the children consider what the environment would be like in space and how they would survive there.

    “As a result, they’re now switching on to scientific concepts, as demonstrated by some of their questions that really showed their depth of thought.  It’s opened up their eyes to new possibilities, especially all the different types of careers involved with science and space.”

    Parents of children at Icknield School have also noticed a difference. Dr Matt Davey, a Senior Research Associate at the University of Cambridge Plant Sciences Department, and a research associate of Corpus Christi College, is a parent of pupils at Icknield Primary and Pippins pre-school.

    He said: “My children had an amazing day – they really enjoyed touching the space suit and it was great to hear them talking about it in such detail at home. I even learnt a lot about space suits and living in space, such as how each suit and seat is made for each astronaut. They’re now looking forward to the other planned space events this term at the school, such as docking and working in their classroom version of the International Space Station.”

    Icknield School has further events planned during the next few months, including a visit from Spectrum Drama Company who will perform a dramatisation about Yuri Gagarin, the first man in space. In addition, an astronomer will be visiting the School to give a presentation to Year 6.  

    A primary school in Sawston spent the day learning all about space and one lucky 11-year-old girl had the chance to try on a real space suit. 

    We need to get children interested and excited about science at an early age and Tim Peake’s mission is a great opportunity to do this
    Helen Mason OBE


    A detailed new study of the chequered currency-trading record of John Maynard Keynes might make today’s overconfident currency speculators think twice.

    While Keynes was one of the most famous economists in history, and his stock-picking record as an asset manager was outstanding, a forensic analysis of his personal currency trades found that his record was pedestrian by comparison.

    The findings are forthcoming in the Journal of Economic History, in a study co-authored by Olivier Accominotti from the London School of Economics and Political Science, and David Chambers of the University of Cambridge Judge Business School.

    “Unlike his stock investing, Keynes found currency investing a lot tougher, despite the fact that he was at the centre of the world of international finance throughout the time he traded currencies,” said Chambers. To be sure, Keynes made money from speculating in currencies in the 1920s and 1930s and his profits arose from more than pure chance. “Directionally, he called currencies more or less correctly but he really struggled with timing his trades. One main message for investors today is that if someone as economically literate and well-connected as Keynes found it difficult to time currencies, then the rest of us should think twice before believing we can do any better.”

    In his currency trading, Keynes relied heavily on his own analysis of fundamental economic factors such as inflation, trade balance, capital flows and political developments.

    Such ‘fundamentals-based’ strategy differs from ‘technical’ strategies that follow simple mechanical trading rules but seek profits by identifying market anomalies – typically through the carry trade (betting on high-interest currencies versus low-interest rate currencies) and momentum (betting on currencies which have recently appreciated versus those which have depreciated). Both fundamentals-based and technical trading styles are observed among modern-day currency managers.
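    The two mechanical rules described above can be sketched in a few lines of code. This is an illustrative toy, not the study's methodology: the currencies, interest rates and returns below are hypothetical.

    ```python
    # Toy sketch of the two 'technical' currency trading rules:
    # carry (bet on interest-rate differentials) and momentum (bet on
    # recent appreciation). All figures are hypothetical illustrations.

    def carry_signal(foreign_rate, home_rate):
        """Carry trade: go long currencies with higher interest rates than home."""
        return 1 if foreign_rate > home_rate else -1

    def momentum_signal(recent_return):
        """Momentum: go long currencies that have recently appreciated."""
        return 1 if recent_return > 0 else -1

    gbp_rate = 0.04  # hypothetical home (GBP) interest rate
    currencies = {
        # currency: (hypothetical interest rate, hypothetical recent return vs GBP)
        "USD": (0.05, 0.02),
        "FRF": (0.08, -0.04),
        "DEM": (0.03, 0.01),
    }

    for ccy, (rate, ret) in currencies.items():
        print(ccy, carry_signal(rate, gbp_rate), momentum_signal(ret))
    ```

    A real implementation would size positions by the magnitude of the rate differential or past return, but the sign of the bet is the essence of both rules.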

    But Keynes produced unremarkable results at the dawn of the modern foreign-exchange market, when dealings were transformed by telegraphic transfer and the emergence of a forward exchange market.

    The period during which he traded was marked by considerable foreign exchange volatility and large deviations of exchange rates from their fundamental values which appear obvious to investors today. However, trading these deviations in real time was hazardous. “Implementing a currency trading strategy based on the analysis of macroeconomic fundamentals was challenging (even) for John Maynard Keynes,” said the research paper.

    This was particularly the case in the 1920s. Currency traders can be judged in terms of the return generated per unit of risk, also known as the Sharpe Ratio. While Keynes generated a Sharpe Ratio of approximately 0.2 (assuming his trading equity was fixed), the same ratio for an equal-weighted blend of the carry and momentum strategies was substantially higher, at close to 1.0 after transaction costs. When he resumed currency trading in 1932, after a five-year break coinciding with the return to the gold standard, Keynes outperformed the carry strategy (whose mean return in the 1930s was negative) but still underperformed a simple momentum strategy.
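    The Sharpe Ratio itself is simple to compute: mean excess return divided by the standard deviation of returns. A minimal sketch, using made-up monthly returns rather than Keynes's actual figures:

    ```python
    import statistics

    def sharpe_ratio(returns, risk_free=0.0):
        """Mean excess return per unit of volatility (annualisation omitted)."""
        excess = [r - risk_free for r in returns]
        return statistics.mean(excess) / statistics.stdev(excess)

    # Hypothetical monthly returns, for illustration only.
    returns = [0.04, -0.03, 0.02, -0.02, 0.03, -0.01]
    print(round(sharpe_ratio(returns), 2))
    ```

    On this measure, a strategy with a small positive average return but large swings (like the 0.2 attributed to Keynes) scores far below a steadier one earning the same average.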

    The study also found that Keynes “experienced periods of considerable losses in both the 1920s and 1930s. Indeed, he was close to being technically bankrupt in 1920 and could only stay trading thanks to his ability to borrow funds from his social circle.”

    The research is based on a detailed dataset of 354 personal currency trades made by Keynes between 1919 and 1939 (mostly in five currencies against the British pound – the US dollar, French franc, Deutsche Mark, Italian lira and Dutch guilder).

    Details of the trades were contained in ledgers kept in the archives at King’s College, Cambridge, where Keynes managed the college endowment fund for decades, famously shifting its portfolio from property to stocks. His investment writings, based on his very successful strategy at King’s, later became a source of inspiration for David Swensen, the architect of the influential “Yale model” for managing university endowments in the US today.

    Originally published on the Cambridge Judge Business School website.

    John Maynard Keynes struggled as a foreign-exchange trader, finds the first detailed study of the famous economist as currency speculator.

    If someone as economically literate and well-connected as Keynes found it difficult to time currencies, then the rest of us should think twice before believing we can do any better.
    David Chambers
    John Maynard Keynes


    Currently, patients due to undergo surgery are given a dose of anaesthetic based on the so-called ‘Marsh model’, which uses factors such as an individual’s body weight to predict the amount of drug needed. As patients ‘go under’, their levels of awareness are monitored in a relatively crude way. If they are still deemed awake, they are simply given more anaesthetic. However, general anaesthetics can carry risks, particularly if an individual has an underlying health condition such as a heart disorder.

    As areas of the brain communicate with each other, they give off tell-tale signals that can give an indication of how conscious an individual is. These ‘networks’ of brain activity can be measured using an EEG (electroencephalogram), which measures electric signals as brain cells talk to each other. Cambridge researchers have previously shown that these network signatures can even be seen in some people in a vegetative state and may help doctors identify patients who are aware despite being unable to communicate. These findings build upon advances in the science of networks to tackle the challenge of understanding and measuring human consciousness.

    In a study published today in the open access journal PLOS Computational Biology, funded by the Wellcome Trust, the researchers studied how these signals changed in healthy volunteers as they received an infusion of propofol, a commonly used anaesthetic.

    Twenty individuals (9 male, 11 female) received a steadily increasing dose of propofol – all up to the same limit – while undergoing a task that involved pressing one button if they heard a ‘ping’ and a different button if they heard a ‘pong’. At the same time, the researchers tracked their brain network activity using an EEG.

    By the time the subjects had reached the maximum dose, some individuals were still awake and able to carry out the task, while others were unconscious. As the researchers analysed the EEG readings, they found clear differences between those who were responding to the anaesthetic and those who remained able to carry on with the task. This ‘brain signature’ was evident in the network of communications between brain areas carried by alpha waves (brain cell oscillations in the frequency range of 7.5–12.5 Hz), the normal range of electrical activity of the brain when conscious and relaxed.
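    The network analysis in the study is considerably more sophisticated, but the starting point – how much of an EEG signal's power falls in the 7.5–12.5 Hz alpha band – can be sketched with a simple Fourier transform. The synthetic "EEG" below is an assumption for illustration, not real recording data.

    ```python
    import numpy as np

    def alpha_band_fraction(signal, fs, low=7.5, high=12.5):
        """Fraction of total spectral power lying in the alpha band."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        power = np.abs(np.fft.rfft(signal)) ** 2
        in_band = (freqs >= low) & (freqs <= high)
        return power[in_band].sum() / power.sum()

    # Synthetic 'EEG': a 10 Hz alpha oscillation plus noise, sampled at 250 Hz.
    fs = 250
    t = np.arange(0, 4, 1.0 / fs)
    rng = np.random.default_rng(0)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

    print(alpha_band_fraction(eeg, fs))  # expect most of the power in the alpha band
    ```

    Measures of connectivity between brain areas, as used in the paper, build on this kind of band-limited activity computed per EEG channel.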

    In fact, when the researchers looked at the baseline EEG readings before any drug was given, they already saw differences between those who would later succumb to the drug and those who were less responsive to its effects. Dividing the subjects into two groups based on their EEG readings – those with lots of brain network activity at baseline and those with less – the researchers were able to predict who would be more responsive to the drug and who would be less.

    The researchers also measured levels of propofol in the blood to see if this could be used as a measure of how conscious an individual was. Although they found little correlation with the alpha wave readings in general, they did find a correlation with a specific form of brain network activity known as delta-alpha coupling. This may be able to provide a useful, non-invasive measure of the level of drug in the blood.

    “A very good way of predicting how an individual responds to our anaesthetic was the state of their brain network activity at the start of the procedure,” says Dr Srivas Chennu from the Department of Clinical Neurosciences, University of Cambridge. “The greater the network activity at the start, the more anaesthetic they are likely to need to put them under.”

    Dr Tristan Bekinschtein, senior author from the Department of Psychology, adds: “EEG machines are commonplace in hospitals and relatively inexpensive. With some engineering and further testing, we expect they could be adapted to help doctors optimise the amount of drug an individual needs to receive to become unconscious without increasing their risk of complications.”

    Srivas Chennu will be speaking at the Cambridge Science Festival on Wednesday 16 March. During the event, ‘Brain, body and mind: new directions in the neuroscience and philosophy of consciousness’, he will be examining what it means to be conscious.


    Chennu, S et al. Brain connectivity dissociates responsiveness from drug exposure during propofol induced transitions of consciousness. PLOS Computational Biology; 14 Jan 2016

    Brain networks during the transition to unconsciousness during propofol sedation (drug infusion timeline shown in red). Participants with robust networks at baseline (left panel) remained resistant to the sedative, while others showed characteristically different, weaker networks during unconsciousness (middle). All participants regained similar networks when the sedative wore off (right).

    The complex pattern of ‘chatter’ between different areas of an individual’s brain while they are awake could help doctors better track and even predict their response to general anaesthesia – and better identify the amount of anaesthetic necessary – according to new research from the University of Cambridge.

    A very good way of predicting how an individual responds to our anaesthetic was the state of their brain network activity at the start of the procedure
    Srivas Chennu
    Brain networks during the transition to unconsciousness during propofol sedation


    New research on butterfly genomes has revealed that the genetic components that produce different splotches of colour on wings can be mixed up between species by interbreeding to create new patterns, like a "genetic paint box".

    Research on Amazonian Heliconius butterflies has shown that two of the most common colour patterns, found in combination on the wings of many Heliconius species – the dennis red patch on the base of the forewing, and the ray red streaks that fan out across the hindwing – are controlled by separate genetic switches that arose in completely different species.

    A team of researchers has traced the merging of these two wing pattern elements to interbreeding between butterfly species that occurred almost two million years ago.

    It has been known for some time that exchange of genes between species can be important for evolution: humans have exchanged genes with our now extinct relatives which may help survival at high altitudes, and Darwin's Finches have exchanged a gene that influences beak shape. In butterflies, the swapping of wing pattern elements allows different species to share common warning signs that ward off predators – a phenomenon known as mimicry.

    However, the new study, published today in the journal PLOS Biology, is the first to show such mixing of genetic material can produce entirely new wing patterns, by generating new combinations of genes.

    "We found that different colour patches on the wings are controlled by different genetic switches that can be turned on and off independently. As these switches were shared between species they got jumbled up into different combinations, making new wing patterns," said senior author Professor Chris Jiggins, from Cambridge University's Department of Zoology.

    The researchers sequenced the genomes from 142 individual butterflies across 17 Heliconius species and compared the DNA data, focusing on the regions associated with the two red colour patterns of dennis and ray on the forewing and hindwing. "In each butterfly genome, we narrowed down around 300 million base pairs of DNA to just a few thousand," said Jiggins.

    They found that the genetic switches for these distinct wing splotches operated independently, despite being located next to each other in the genome. The sequencing revealed that the switch for each colour splotch had evolved just once, and in separate species, but had been repeatedly shared across all the Heliconius species at occasional points of interbreeding dating back almost two million years.

    "By identifying the genetic switches associated with bits of wing pattern, when they evolved and how they diverged, we can actually map onto the species tree how these little regions of colour have jumped between species - and we can see they are jumping about all over the place," said Jiggins.

    The key to this evolutionary butterfly painting is the independence of each genetic switch. "The gene that these switches are controlling is identical in all these butterflies, it is coding for the same protein each time. That can't change as the gene is doing other important things," said lead author Dr Richard Wallbank, also from Cambridge's Department of Zoology.

    "It is the switches that are independent, which is much more subtle and powerful, allowing evolutionary tinkering with the wing pattern without affecting parts of the genetic software that control the brain or eyes.

    "This modularity means switching on a tiny piece of the gene's DNA produces one piece of pattern or another on the wings – like a genetic paint box," Wallbank said.


    Research finds independent genetic switches control different splotches of colour and pattern on Heliconius butterfly wings, and that these switches have been shared between species over millions of years, becoming “jumbled up” to create new and diverse wing displays.

    We can actually map onto the species tree how these little regions of colour have jumped between species
    Chris Jiggins
    A range of wing patterns across Heleconius butterfly species.


    The University will host an IdeasLab looking at how breakthroughs in carbon reduction technologies will transform industries. IdeasLabs are quick-fire visual presentations followed by workgroup discussion, and have proved a successful format for engaging various communities in academic thinking.

    Carbon Reduction Technologies: The University of Cambridge IdeasLab

    Wednesday 20 January 16:15 - 17:30

    Sir Leszek Borysiewicz, Vice-Chancellor, will introduce this event, which will look at how research by Cambridge academics has led to breakthroughs in carbon reduction technologies that will transform a range of industries. Ideas to be discussed include:

    • Decarbonizing industrial-scale processes using virtual avatars
    • Self-healing concrete for low-carbon infrastructure
    • Improving solar materials efficiency using quantum mechanics
    • Quantum materials for zero-loss transmission of electricity

    The event is supported by Energy@Cambridge, a Strategic Research Initiative that brings together the activities of over 250 world-leading academics working in all aspects of energy-related research, covering energy supply, conversion and demand, across a wide range of departments.

    The speakers, all members of Energy@Cambridge, are:

    Professor Abir Al-Tabbaa, Department of Engineering

    Professor Al-Tabbaa is a Director of the Centre for Doctoral Training in Future Infrastructure and Built Environment. She leads international work on sustainable and innovative materials for construction and the environment. Her particular expertise relates to low-carbon and self-healing construction materials, ground improvement, soil mix technology and contaminated land remediation.


    Professor Sir Richard Friend, Department of Physics

    Professor Friend is the Director of the Maxwell Centre and the Winton Fund for the Physics of Sustainability. He is the lead academic on one of Energy@Cambridge’s newest Grand Challenges – Materials for Energy Efficient Information Communications Technology.

    Professor Friend’s research encompasses the physics, materials science and engineering of semiconductor devices made with carbon-based semiconductors, particularly polymers. His research group was the first to demonstrate efficient operation of polymer field-effect transistors and light-emitting diodes. These advances revealed that the semiconductor properties of this broad class of materials are unexpectedly clean, so that such devices can reveal novel semiconductor physics, including efficient photovoltaic diodes, optically-pumped lasing, directly-printed polymer transistor circuits and light-emitting transistors.


    Professor Markus Kraft, Department of Chemical Engineering and Biotechnology

    Professor Kraft is the director of the Singapore-Cambridge CREATE Research Centre and a principal investigator of the Cambridge Centre for Carbon Reduction in Chemical Technology (C4T), one of Energy@Cambridge’s Grand Challenges. C4T is a world-leading partnership between Cambridge and Singapore, set up to tackle the environmentally relevant and complex problem of assessing and reducing the carbon footprint of the integrated petro-chemical plants and electrical network on Jurong Island in Singapore.

    Professor Kraft has contributed to the detailed modelling of combustion synthesis of organic and inorganic nanoparticles. He has worked on fluidization, spray drying and granulation of fine powders. His interests include computational modelling and optimization targeted towards developing carbon abatement and emissions reduction technologies.

    Dr Suchitra Sebastian, Department of Physics

    Dr Sebastian creates and studies interesting quantum materials - often under extreme conditions such as very high magnetic and electric fields, enormous pressures, and very low temperatures - with a view to discovering unusual phases of matter. Among these are the family of superconductors - which have the exciting property of transporting electricity with no energy loss - and hence hold great promise for energy saving applications. One of her research programmes is to create a new generation of superconductors that operate at accessible temperatures, thus providing energy transmission and storage solutions of the future.


    Energy@Cambridge is working to develop new technologies to reduce the carbon footprint of industrial processes, energy generation and transmission, and building construction. Its aims include leveraging the University’s expertise to tackle grand technical and intellectual challenges in energy, integrating science, technology and policy research.

    The initiative has four Grand Challenges, focused on developing and delivering new large-scale collaborative activities, facilities, centres and research directions by bringing together academics and external partners to work on future energy challenges where we believe we can make a significant impact.

    Will Science Save Us?

    Friday 22 January

    The Vice Chancellor and Dr Suchitra Sebastian will take part in a lunchtime discussion entitled Will Science Save Us?, which will look at how we accelerate scientific breakthroughs that address society's greatest challenges.

    * * *

    The World Economic Forum is an independent international organisation engaging business, political, academic and other leaders of society to shape global, regional and industry agendas; this year’s theme is The Reshaping of the World: Consequences for Society, Politics and Business.

    The Forum will provide an opportunity for the Cambridge researchers to engage with decision-makers in business, NGOs and in public policy, and to highlight new ideas from Cambridge in responding to global challenges.

    For further information or to contact any of the speakers, please contact the team at Energy@Cambridge.

    The Vice Chancellor of the University of Cambridge is to lead a delegation of academics to the annual meeting of the World Economic Forum at Davos, Switzerland, in January 2016, to explore issues including carbon reduction technologies and how science and engineering can best address society's greatest challenges.

    Coal Fired Power Station (cropped)


    E-cigarettes are now the most commonly consumed nicotine product amongst children in countries with strong tobacco control policies. In the USA, the 2014 National Youth Tobacco Survey found that e-cigarette use tripled from 2013 to 2014 amongst high schoolers, rising from 4.5% to over 13%, and amongst middle school students increasing from 1% to 4%. These figures are mirrored in England, where e-cigarette use has risen from 5% in 2013 to 8% in 2014 amongst 11-18 year olds.

    As e-cigarette use rises amongst children and adolescents, there are concerns that it could lead to tobacco smoking, say researchers from the Behaviour and Health Research Unit (BHRU) at the University of Cambridge. The BHRU is based in the Department of Public Health and Primary Care and funded by the UK Department of Health Policy Research Programme.

    E-cigarettes are currently marketed in around 8,000 different flavours. Internal tobacco industry documents show that young people find tobacco products with candy-like flavours more appealing than those without. Candy- and liqueur-flavoured tobacco products were heavily marketed to young people from the 1970s until 2009, when regulations were imposed.

    In a study funded by the Department of Health, researchers at Cambridge assigned 598 school children to one of three groups: one group was shown adverts for candy-like flavoured e-cigarettes; a second group adverts for non-flavoured e-cigarettes; and a third, control group, in which the children saw no adverts.

    The school children were then asked questions to gauge issues such as the appeal of using e-cigarettes and tobacco smoking (did the children think e-cigarettes or tobacco were ‘attractive’, ‘fun’ or ‘cool’?), the perceived harm of smoking, how much they liked the ads  and how interested they might be in buying and trying e-cigarettes.

    The children shown the ads for candy-flavoured e-cigarettes liked these ads more and expressed a greater interest in buying and trying e-cigarettes than their peers. However, showing the ads made no significant difference to the overall appeal of tobacco smoking or of using e-cigarettes – in other words, how attractive, fun or cool they considered the activities.

    “We’re cautiously optimistic from our results that e-cigarette ads don’t make tobacco smoking more attractive, but we’re concerned that ads for e-cigarettes with flavours that might appeal to school children could encourage them to try the products,” says Dr Milica Vasiljevic from the Department of Public Health and Primary Care at the University of Cambridge.

    Currently across Europe and the USA, marketing and advertising of e-cigarettes is virtually unregulated. In the UK, for example, the Committee on Advertising Practice has issued rules for the advertising of e-cigarettes: adverts must not be likely to appeal to people under 18, or to non-smokers and non-nicotine users, and the models in these adverts must not appear to be younger than 25. However, the rules do not explicitly prohibit the advertising of candy-like flavours designed to appeal to children.

    The results of the current study support the imminent changes in EU regulations surrounding the marketing of e-cigarettes, but raise questions about the need for further regulation regarding the content of products with high appeal to children. More research is needed to examine both the short- and long-term impact of e-cigarette advertising, as well as the link between e-cigarette use and tobacco smoking.

    Vasiljevic, M, Petrescu, DC, Marteau, TM. Impact of advertisements promoting candy-like flavoured e-cigarettes on appeal of tobacco smoking amongst children: an experimental study. Tobacco Control; 18 Jan 2016

    Advertisements featuring e-cigarettes with flavours such as chocolate and bubble gum are more likely to attract school children to buy and try e-cigarettes than those featuring non-flavoured e-cigarettes, according to new research published in the journal Tobacco Control.

    We’re cautiously optimistic from our results that e-cigarette ads don’t make tobacco smoking more attractive, but we’re concerned that ads for e-cigarettes with flavours that might appeal to school children could encourage them to try the products
    Milica Vasiljevic
    E-Cigarette/Electronic Cigarette/E-Cigs/E-Liquid/Vaping/Cloud Chasing


    It’s common to first see exciting new technologies in science fiction, but less so in stories about wizards and dragons. Yet one of the most interesting bits of kit on display at this year’s Consumer Electronics Show (CES) in Las Vegas was reminiscent of the magical Daily Prophet newspaper in the Harry Potter series.

    Thin, flexible screens such as the one showcased by LG could allow the creation of newspapers that change daily, display video like a tablet computer, but that can still be rolled up and put in your pocket. These plastic electronic displays could also provide smartphones with shatterproof displays (good news for anyone who’s inadvertently tried drop-testing their phone onto the pavement) and lead to the next generation of flexible wearable technology.

    But LG’s announcement is not the first time that flexible displays have been demonstrated at CES. We’ve seen similar technologies every year for some time now, and LG itself unveiled another prototype in a press release 18 months ago. Yet only a handful of products featuring flexible displays have come to market, and those have the displays mounted in a rigid holder rather than free for the user to bend. So why is this technology taking so long to reach our homes?

    How displays work


    Magnified LCD screen. Akpch/Wikimedia Commons


    Take a look at your computer screen through a magnifying glass and you’ll see the individual pixels, each made up of three subpixels – red, green, and blue light sources. Each of these subpixels is connected via a grid of wires that criss-cross the back of the display to another circuit called a display driver. This translates incoming video data into signals that turn each subpixel on and off.
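    The display driver's job at each pixel can be sketched in a few lines: it takes an incoming colour value and turns it into drive levels for the three subpixels. This is a deliberately simplified illustration (real drivers also handle gamma correction and timing, which are omitted here).

    ```python
    # Simplified sketch: splitting one 8-bit RGB pixel into the drive
    # levels of its three subpixels, as a display driver conceptually does.

    def pixel_to_subpixels(rgb):
        """Map an 8-bit (R, G, B) pixel to per-subpixel drive levels in 0.0-1.0."""
        r, g, b = rgb
        return {"red": r / 255, "green": g / 255, "blue": b / 255}

    # White drives all three subpixels fully on; yellow leaves blue off.
    print(pixel_to_subpixels((255, 255, 255)))
    print(pixel_to_subpixels((255, 255, 0)))
    ```

    Viewed through the magnifying glass, a "yellow" region of the screen is therefore just red and green subpixels lit with the blue ones dark.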

    How each pixel generates light varies depending on the technology used. Two of the most common seen today are liquid crystal displays (LCDs) and organic light emitting diodes (OLEDs). LCDs use a white light at the back of the display that passes through red, green and blue colour filters. Each subpixel uses a combination of liquid crystals and polarising filters that act like tiny shutters, either letting light through or blocking it.

    OLEDs, on the other hand, are mini light sources that directly generate light when turned on. This removes the need for the white light behind the display, reducing its overall thickness, and is one of the driving factors behind the growing uptake of OLED technology.



    The challenges

    Whatever technology is used, there are many individual components crammed into a relatively small space. Many smartphone displays contain more than three million subpixels, for example. Bending these components introduces strain, which can tear electrical connections and peel apart layers. Current displays use a rigid piece of glass to protect the display from the mechanical strains of the outside world, something that, by design, is not an option in flexible displays.

    Organic semiconductors – the chemicals that directly produce light in OLED displays – have the additional problem of being highly sensitive to both water vapour and oxygen, gases that can pass relatively easily through thin plastic films. This can result in faded and dead pixels, leaving a less than desirable-looking result.

    There’s also the challenge of the large-scale manufacturing of these circuits. Plastics can be tricky materials to work with. They often swell and shrink in response to water and heat, and it can be difficult to persuade other materials to bond to them. In a manufacturing environment, where precise alignment and high-temperature processing are critical, this can cause major issues.

    Finally, it’s not just flexible displays that need to be developed. The components needed to power and operate the display also need to be incorporated into any overall design, placing constraints on the kinds of shape and size currently achievable.

    What next?


    Circuits patterned on a plastic substrate. Stuart Higgins


    Scientists in Japan have demonstrated how to make electrical circuits on plastic thinner than the width of a human hair in an attempt to reduce the impact of bending on circuit performance. And research into flexible batteries is becoming more prevalent, too.

    Developing solutions to these problems is part of a broader area of active research, as the science and technology underlying flexible displays is also applicable to many other fields, such as biomedical devices and solar energy. While the challenges remain, the technology edges closer to the point where devices such as flexible displays will become ubiquitous in our everyday lives.


    Stuart Higgins, Postdoctoral Research Associate in Optoelectronics, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

    Stuart Higgins (Cavendish Laboratory) discusses the technology being developed to create flexible displays.

    Circuits patterned on a plastic substrate


    A new study, published today in PNAS, shows that in climbing animals ranging in size from mites to geckos, the percentage of body surface covered by adhesive footpads increases as body size increases, setting a limit to the size of animal using this strategy because larger animals would require impossibly big feet.

    Dr David Labonte and his colleagues in the University of Cambridge’s Department of Zoology found that tiny mites use approximately 200 times less of their body surface area for adhesive pads than geckos, nature's largest adhesion-based climbers. And humans? We’d need as much as 40% of our total body surface, or roughly 80% of our front, to be covered in sticky footpads if we wanted to do a convincing Spiderman impression.

    Once an animal is so big that a substantial fraction of its body surface would need to be sticky footpads, the necessary morphological changes would make the evolution of this trait impractical, suggests Labonte.

    “If a human, for example, wanted to climb up a wall the way a gecko does, we’d need impractically large sticky feet – and shoes in European size 145 or US size 114,” says Walter Federle, senior author also from Cambridge’s Department of Zoology.

    “As animals increase in size, the amount of body surface area per volume decreases – an ant has a lot of surface area and very little volume, and an elephant is mostly volume with not much surface area,” explains Labonte.

    “This poses a problem for larger climbing animals because, when they are bigger and heavier, they need more sticking power, but they have comparatively less body surface available for sticky footpads. This implies that there is a maximum size for animals climbing with sticky footpads – and that turns out to be about the size of a gecko.”
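The geometric argument in the quotes above can be made concrete with a back-of-the-envelope calculation. Under isometric scaling, surface area grows as mass^(2/3) while the weight to be supported grows in proportion to mass, so the fraction of body surface needed for adhesive pads grows as mass^(1/3). The reference values below (a 50 g gecko devoting about 4% of its surface to pads) are illustrative assumptions, not figures taken from the study.

```python
def required_pad_fraction(mass_kg: float, ref_mass_kg: float = 0.05,
                          ref_fraction: float = 0.04) -> float:
    """Fraction of body surface needed for adhesive pads, assuming
    isometric scaling: pad area must track weight (~mass^1) while total
    surface area only grows as mass^(2/3), so the required fraction
    grows as mass^(1/3) from a chosen reference animal."""
    return ref_fraction * (mass_kg / ref_mass_kg) ** (1 / 3)

# A 70 kg human under these toy assumptions needs pads on roughly 45%
# of their body surface - the same ballpark as the study's ~40% estimate.
print(round(required_pad_fraction(70.0), 2))
```

The cube-root growth is the key point: a thousandfold increase in mass requires a tenfold increase in the sticky fraction, which is why the strategy runs out of skin at roughly gecko size.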

    The researchers compared the weight and footpad size of 225 climbing animal species including insects, frogs, spiders, lizards and even a mammal.

    “We covered a range of more than seven orders of magnitude in body weight, which is roughly the same weight difference as between a cockroach and Big Ben,” says Labonte.

     “Although we were looking at vastly different animals – a spider and a gecko are about as different as a human is to an ant – their sticky feet are remarkably similar,” says Labonte.

    “Adhesive pads of climbing animals are a prime example of convergent evolution – where multiple species have independently, through very different evolutionary histories, arrived at the same solution to a problem. When this happens, it’s a clear sign that it must be a very good solution.”

    There is one other possible solution to the problem of how to stick when you’re a large animal, and that’s to make your sticky footpads even stickier.

    “We noticed that within some groups of closely related species pad size was not increasing fast enough to match body size yet these animals could still stick to walls,” says Christofer Clemente, a co-author from the University of the Sunshine Coast.

    “We found that tree frogs have switched to this second option of making pads stickier rather than bigger. It’s remarkable that we see two different evolutionary solutions to the problem of getting big and sticking to walls,” says Clemente.

    “Across all species the problem is solved by evolving relatively bigger pads, but this does not seem possible within closely related species, probably since the required morphological changes would be too large. Instead within these closely related groups, the pads get stickier in larger animals, but the underlying mechanisms are still unclear. This is a great example of evolutionary constraint and innovation.”

    The researchers say that these insights into the size limits of sticky footpads could have profound implications for developing large-scale bio-inspired adhesives, which are currently only effective on very small areas.

    “Our study emphasises the importance of scaling for animal adhesion, and scaling is also essential for improving the performance of adhesives over much larger areas. There is a lot of interesting work still to be done looking into the strategies that animals use to make their footpads stickier - these would likely have very useful applications in the development of large-scale, powerful yet controllable adhesives,” says Labonte.

    This study was supported by research grants from the UK Biotechnology and Biological Sciences Research Council (BB/I008667/1), the Human Frontier Science Programme (RGP0034/2012), the Denman Baynes Senior Research Fellowship, and a Discovery Early Career Research Fellowship (DE120101503).


    Labonte, D et al. "Extreme positive allometry of animal adhesive pads and the size limits of adhesion-based climbing." PNAS, 18 January 2016. DOI:

    Inset images: Vallgatan 21D, Gothenburg, Sweden (photo by Gudbjörn Valgeirsson, footprints added by Cedric Bousquet, University of Cambridge); How sticky footpad area changes with size (David Labonte); Diversity of sticky footpads (David Labonte).

    Latest research reveals why geckos are the largest animals able to scale smooth vertical walls – even larger climbers would require unmanageably large sticky footpads. Scientists estimate that a human would need adhesive pads covering 40% of their body surface in order to walk up a wall like Spiderman, and believe their insights have implications for the feasibility of large-scale, gecko-like adhesives.

    If a human wanted to climb up a wall the way a gecko does, we’d need impractically large sticky feet – and shoes in European size 145
    Walter Federle
    Gecko and ant


    Ask most people what the hardest material on Earth is and they will probably answer “diamond”. Its name comes from the Greek word ἀδάμας (adámas) meaning “unbreakable” or “invincible” and is from where we get the word “adamant”. Diamond’s hardness gives it incredible cutting abilities that – along with its beauty – have kept it in high demand for thousands of years.

    Modern scientists have spent decades looking for cheaper, harder and more practical alternatives and every few years the news heralds the creation of a new “world’s hardest material”. But are any of these challengers really up to scratch?

    Despite its unique allure, diamond is simply a special form, or “allotrope”, of carbon. There are several allotropes in the carbon family including carbon nanotubes, amorphous carbon, diamond and graphite. All are made up of carbon atoms, but the types of atomic bonds between them differ which gives rise to different material structures and properties.

    The outermost shell of each carbon atom has four electrons. In diamond, these electrons are shared with four other carbon atoms to form very strong chemical bonds resulting in an extremely rigid tetrahedral crystal. It is this simple, tightly-bonded arrangement that makes diamond one of the hardest substances on Earth.

    How hard?


    Vickers test anvil. R Tanaka, CC BY


    Hardness is an important property of materials and often determines what they can be used for, but it is also quite difficult to define. For minerals, scratch hardness is a measure of how resistant one mineral is to being scratched by another.

    There are several ways of measuring hardness but typically an instrument is used to make a dent in the material’s surface. The ratio between the force used to make the indentation and its surface area produces a hardness value: the harder the material, the larger the value. The Vickers hardness test uses a square-based pyramid diamond tip to make the indent.
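As a rough illustration of the arithmetic, the Vickers number divides the applied force by the sloped surface area of the square indent left by the 136° pyramid tip. The sketch below uses the standard Vickers formula; the load and diagonal values are chosen purely for illustration.

```python
import math

def vickers_hardness_gpa(force_newtons: float, diagonal_mm: float) -> float:
    """Vickers hardness from the applied force and the mean diagonal of
    the square indent: HV = 2 F sin(136 deg / 2) / d^2, converted here
    from N/mm^2 (MPa) to GPa to match the figures in the text."""
    sloped_area = diagonal_mm ** 2 / (2 * math.sin(math.radians(136 / 2)))
    return force_newtons / sloped_area / 1000.0  # MPa -> GPa

# A hypothetical 98.07 N (10 kgf) load leaving a 0.14 mm diagonal:
print(round(vickers_hardness_gpa(98.07, 0.14), 1))  # prints 9.3
```

That result sits right at the ~9 GPa quoted for mild steel below; note that Vickers numbers are traditionally reported in kgf/mm², with the GPa conversion used here matching this article's units.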

    Mild steel has a Vickers hardness value of around 9 GPa while diamond has a Vickers hardness value of around 70–100 GPa. Diamond’s resistance against wear is legendary and today 70% of the world’s natural diamonds are found in wear-resistant coatings for tools used in cutting, drilling and grinding, or as additives to abrasives.

    The problem with diamond is that, while it may be very hard, it is also surprisingly unstable. When diamond is heated above 800℃ in air its chemical properties change, affecting its strength and enabling it to react with iron, which makes it unsuitable for machining steel.

    These limits on its use have led to a growing focus on developing new, chemically-stable, superhard materials as a replacement. Better wear-resistant coatings allow industrial tools to last longer between replacing worn parts and reduce the need for potentially environmentally-hazardous coolants. Scientists have so far managed to come up with several potential rivals to diamond.

    Boron nitride


    Microscopic BN crystal. NIMSoffice/Wikimedia Commons


    The synthetic material boron nitride, first produced in 1957, is similar to carbon in that it has several allotropes. In its cubic form (c-BN) it shares the same crystalline structure as diamond, but instead of carbon atoms is made up of alternately-bonded atoms of boron and nitrogen. c-BN is chemically and thermally stable, and is commonly used today as a superhard machine tool coating in the automotive and aerospace industries.

    But cubic boron nitride is still, at best, just the world’s second hardest material with a Vickers hardness of around 50 GPa. Its wurtzite form (w-BN) was initially reported to be even harder, but these results were based upon theoretical simulations that predicted an indentation strength 18% higher than diamond. Unfortunately w-BN is extremely rare in nature and difficult to produce in sufficient quantities to properly test this claim by experiment.

    Synthetic diamond


    Synthetic diamond closeup. Instytut Fizyki Uniwersytet Kazimierza Wielkiego, CC BY


    Synthetic diamond has also been around since the 1950s and is often reported to be harder than natural diamond because of its different crystal structure. It can be produced by applying high pressure and temperature to graphite to force its structure to rearrange into the tetrahedral diamond, but this is slow and expensive. Another method is to effectively build it up with carbon atoms taken from heated hydrocarbon gases but the types of substrate material you can use are limited.

    Producing diamonds synthetically creates stones that are polycrystalline, made up of aggregates of much smaller crystallites or “grains” ranging from a few microns down to a few nanometres in size. This contrasts with the large monocrystals of most natural diamonds used for jewellery. The smaller the grain size, the more grain boundaries and the harder the material. Recent research on some synthetic diamond has shown it to have a Vickers hardness of up to 200 GPa.
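The grain-size effect mentioned above is commonly described by the Hall–Petch relation, in which hardness rises with the inverse square root of grain size because grain boundaries impede dislocation motion. The coefficients below are placeholders chosen for illustration, not fitted data for diamond.

```python
def hall_petch(grain_size_um: float, h0_gpa: float = 60.0,
               k: float = 2.0) -> float:
    """Hall-Petch relation: hardness = h0 + k / sqrt(grain size).
    h0_gpa (coarse-grained hardness) and k (strengthening coefficient,
    in GPa * um^0.5) are illustrative placeholders, not measured values."""
    return h0_gpa + k / grain_size_um ** 0.5

for d_um in (10.0, 1.0, 0.01):  # microns: finer grains, harder material
    print(round(hall_petch(d_um), 1))
```

Shrinking the grains from 10 µm to 10 nm raises the modelled hardness substantially, which is the qualitative trend behind nanocrystalline synthetic diamond outperforming natural monocrystals.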

    Q-carbon


    Q-Carbon closeup. North Carolina State University


    More recently, researchers at North Carolina State University created what they described as a new form of carbon, distinct from other allotropes, and reported to be harder than diamond. This new form was made by heating non-crystalline carbon with a high-powered fast laser pulse to 3,700 °C then quickly cooling or “quenching” it – hence the name “Q-carbon” – to form micron-sized diamonds.

    The scientists found Q-carbon to be 60% harder than diamond-like carbon (a type of amorphous carbon with similar properties to diamond). This has led them to expect Q-carbon to be harder than diamond itself, although this remains to be proven experimentally. Q-carbon also has the unusual properties of being magnetic and glowing when exposed to light. But so far its main use has been as an intermediate step in producing tiny synthetic diamond particles at room temperature and pressure. These nanodiamonds are too small for jewellery but ideal as a cheap coating material for cutting and polishing tools.


    Paul Coxon, Postdoctoral research associate, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

    Paul Coxon (Department of Materials Science and Metallurgy) discusses the materials that have each been heralded as the new “world’s hardest material”.



    These new materials offer the possibility of either significantly improving the efficiency of current high-temperature fuel cell systems, or achieving the same performance levels at much lower temperatures. Either approach could substantially reduce fuel consumption and wasted energy. The material was co-invented by Professor Judith Driscoll of the Department of Materials Science and Metallurgy and her colleague Dr Shinbuhm Lee, with support from collaborators at Imperial College and at three different labs in the US.

    Solid oxide fuel cells consist of a positive electrode (cathode) and a negative electrode (anode), with an electrolyte material sandwiched between them. The electrolyte transports oxygen ions from the cathode to the anode, generating an electric current. Compared to conventional batteries, fuel cells can run indefinitely if supplied with a source of fuel, such as hydrogen or a hydrocarbon, and a source of oxygen.

    By using thin-film electrolyte layers, micro solid oxide fuel cells offer a concentrated energy source, with potential applications in portable power sources for electronic consumer or medical devices, or those that need uninterruptable power supplies such as those used by the military or in recreational vehicles.

    “With low power requirements and low levels of polluting emissions, these fuel cells offer an environmentally attractive solution for many power source applications,” said Dr Charlanne Ward of Cambridge Enterprise, the University’s commercialisation arm, which is managing the patent that was filed in the US. “This opportunity has the potential to revolutionise the power supply problem of portable electronics, by improving both the energy available from the power source and safety, compared with today’s battery solutions.”

    In addition to providing significantly improved conductivity, the new electrolyte material offers:

    • minimal heat loss and short circuiting due to low electronic conductivity
    • minimal cracking under heat cycling stress due to small feature size in the construction
    • high density, reducing the risk of fuel leaks
    • simple fabrication using standard epitaxial growth and self-assembly techniques

    “The ability to precisely engineer and tune highly crystalline materials at the nanoscale is absolutely key for next-generation power generation and storage of many different kinds,” said Driscoll. “Our new methods and understanding have allowed us to exploit the very special properties of nanomaterials in a practical and stable thin-film configuration, resulting in a much improved oxygen ion conducting material.”

    In October, a paper on the enhancement of oxygen ion conductivity in oxides was published in Nature Communications. It is this enhancement that improves efficiency and enables low-temperature operation of fuel cells. As a result of the reported advantages, the novel electrolyte material can also potentially be used in the fabrication of improved electrochemical gas sensors and oxygen separation membranes (to extract oxygen molecules from the air). The inventors have also published two other papers showing the enhanced ionic conduction in different materials systems, one in Nano Letters and one in Advanced Functional Materials.

    Cambridge Enterprise is working with Driscoll to take the technology to market, seeking to collaborate with a fuel cell manufacturer with expertise in thin-film techniques to validate the new material.

    A new thin-film electrolyte material that helps solid oxide fuel cells operate more efficiently and cheaply than those composed of conventional materials, and has potential applications for portable power sources, has been developed at the University of Cambridge. 

    The ability to precisely engineer and tune highly crystalline materials at the nanoscale is absolutely key for next-generation power generation and storage of many different kinds.
    Judith Driscoll
    Bloom Energy Fuel Cell


    The fossilised bones of a group of prehistoric hunter-gatherers who were massacred around 10,000 years ago have been unearthed 30km west of Lake Turkana, Kenya, at a place called Nataruk.

    Researchers from Cambridge University’s Leverhulme Centre for Human Evolutionary Studies (LCHES) found the partial remains of 27 individuals, including at least eight women and six children.

    Twelve skeletons were in a relatively complete state, and ten of these showed clear signs of a violent death, including extreme blunt-force trauma to crania and cheekbones, broken hands, knees and ribs, arrow lesions to the neck, and stone projectile tips lodged in the skull and thorax of two men.

    Several of the skeletons were found face down; most had severe cranial fractures. Among the in situ skeletons, at least five showed “sharp-force trauma”, some suggestive of arrow wounds. Four were discovered in a position indicating their hands had probably been bound, including a woman in the last stages of pregnancy. Foetal bones were uncovered.

    The bodies were not buried. Some had fallen into a lagoon that has long since dried, where the bones were preserved in sediment.

    The findings suggest these hunter-gatherers, perhaps members of an extended family, were attacked and killed by a rival group of prehistoric foragers. Researchers believe it is the earliest scientifically-dated historical evidence of human conflict – an ancient precursor to what we call warfare.

    The origins of warfare are controversial: whether the capacity for organised violence occurs deep in the evolutionary history of our species, or is a symptom of the idea of ownership that came with the settling of land and agriculture.

    The Nataruk massacre is the earliest record of inter-group violence among prehistoric hunter-gatherers who were largely nomadic. The only comparable evidence, discovered in Sudan in the 1960s, is undated, although often quoted as of similar age. It consists of cemetery burials, suggesting a settled lifestyle.   

    “The deaths at Nataruk are testimony to the antiquity of inter-group violence and war,” said Dr Marta Mirazón Lahr, from Cambridge’s LCHES, who directs the ERC-funded IN-AFRICA Project and led the Nataruk study, published today in the journal Nature.

    “These human remains record the intentional killing of a small band of foragers with no deliberate burial, and provide unique evidence that warfare was part of the repertoire of inter-group relations among some prehistoric hunter-gatherers,” she said.

    The site was first discovered in 2012. Following careful excavation, the researchers used radiocarbon and other dating techniques on the skeletons – as well as on samples of shell and sediment surrounding the remains – to place Nataruk in time. They estimate the event occurred between 9,500 to 10,500 years ago, around the start of the Holocene: the geological epoch that followed the last Ice Age.

    Now scrubland, 10,000 years ago the area around Nataruk was a fertile lakeshore sustaining a substantial population of hunter-gatherers. The site would have been the edge of a lagoon near the shores of a much larger Lake Turkana, likely covered in marshland and bordered by forest and wooded corridors.  

    This lagoon-side location may have been an ideal place for prehistoric foragers to inhabit, with easy access to drinking water and fishing – and consequently, perhaps, a location coveted by others. The presence of pottery suggests the storage of foraged food.

    “The Nataruk massacre may have resulted from an attempt to seize resources – territory, women, children, food stored in pots – whose value was similar to those of later food-producing agricultural societies, among whom violent attacks on settlements became part of life,” said Mirazón Lahr.

    “This would extend the history of the same underlying socio-economic conditions that characterise other instances of early warfare: a more settled, materially richer way of life. However, Nataruk may simply be evidence of a standard antagonistic response to an encounter between two social groups at that time.”   

    Antagonism between hunter-gatherer groups in recent history often resulted in men being killed, with women and children subsumed into the victorious group. At Nataruk, however, it seems few, if any, were spared.

    Of the 27 individuals recorded, 21 were adults: eight males, eight females, and five unknown. Partial remains of six children were found co-mingled or in close proximity to the remains of four adult women and of two fragmentary adults of unknown sex.

    No children were found near or with any of the men. All except one of the juvenile remains are children under the age of six; the exception is a young teenager, aged 12-15 years dentally, but whose bones are noticeably small for his or her age. 

    Ten skeletons show evidence of major lesions likely to have been immediately lethal. As well as five – possibly six – cases of trauma associated with arrow wounds, five cases of extreme blunt-force to the head can be seen, possibly caused by a wooden club. Other recorded traumas include fractured knees, hands and ribs.   

    Three artefacts were found within two of the bodies, likely the remains of arrow or spear tips. Two of these are made from obsidian: a black volcanic rock easily worked to razor-like sharpness. “Obsidian is rare in other late Stone Age sites of this area in West Turkana, which may suggest that the two groups confronted at Nataruk had different home ranges,” said Mirazón Lahr. 

    One adult male skeleton had an obsidian ‘bladelet’ still embedded in his skull. It didn’t perforate the bone, but another lesion suggests a second weapon did, crushing the entire right-front part of the head and face. “The man appears to have been hit in the head by at least two projectiles and in the knees by a blunt instrument, falling face down into the lagoon’s shallow water,” said Mirazón Lahr.

    Another adult male took two blows to the head – one above the right eye, the other on the left side of the skull – both crushing his skull at the point of impact, causing it to crack in different directions.

    The remains of a six-to-nine month-old foetus were recovered from within the abdominal cavity of one of the women, who was discovered in an unusual sitting position – her broken knees protruding from the earth were all Mirazón Lahr and colleagues could see when they found her. The position of the body suggests that her hands and feet may have been bound.

    The Nataruk remains are now housed at the Turkana Basin Institute, Turkwell Station, for the National Museums of Kenya.  

    While we will never know why these people were so violently killed, Nataruk is one of the clearest cases of inter-group violence among prehistoric hunter-gatherers, says Mirazón Lahr, and evidence for the presence of small-scale warfare among foraging societies.

    For study co-author Professor Robert Foley, also from Cambridge’s LCHES, the findings at Nataruk are an echo of human violence as ancient, perhaps, as the altruism that has led us to be the most cooperative species on the planet.

    “I’ve no doubt it is in our biology to be aggressive and lethal, just as it is to be deeply caring and loving. A lot of what we understand about human evolutionary biology suggests these are two sides of the same coin,” Foley said.    

    Skeletal remains of a group of foragers massacred around 10,000 years ago on the shores of a lagoon is unique evidence of a violent encounter between clashing groups of ancient hunter-gatherers, and suggests the “presence of warfare” in late Stone Age foraging societies.

    The deaths at Nataruk are testimony to the antiquity of inter-group violence and war
    Marta Mirazón Lahr
    Left: Skull of a man found lying prone in the lagoons sediments. The skull has multiple lesions consistent with wounds from a blunt implement. Right: The skull in situ.


    The area surrounding Lake Turkana in Kenya was lush and fertile 10,000 years ago, with thousands of animals – including elephants, giraffes and zebras – roaming around alongside groups of hunter-gatherers. But it also had a dark side. We have discovered that the oldest known case of violence between two groups of hunter-gatherers took place there, with ten excavated skeletons showing evidence of having been killed with both sharp and blunt weapons.

    The findings, published in Nature, are important because they challenge our understanding of the roots of conflict and suggest warfare may have a much older history than many researchers believe.

    Shocking finding

    Our journey started in 2012, when Pedro Ebeya, one of our Turkana field assistants, reported seeing fragments of human bones on the surface at Nataruk. Located just south of Lake Turkana, Nataruk is today a barren desert, but 10,000 years ago was a temporary camp set up by a band of hunter-gatherers next to a lagoon. I led a team of researchers, as part of the In-Africa project, which has been working in the area since 2009. We excavated the remains of 27 people – six young children, one teenager and 20 adults. Twelve of these – both men and women – were found as they had died, unburied, and later covered by the shallow water of the lagoon.

    Ten of the 12 skeletons show lesions caused by violence to the parts of the body most commonly involved in cases of violence. These include one where the projectile was still embedded in the side of the skull; two cases of sharp-force trauma to the neck; seven cases of blunt and/or sharp-force trauma to the head; two cases of blunt-force trauma to the knees and one to the ribs. There were also two cases of fractures to the hands, possibly caused while parrying a blow.

    There must have been at least three types of weapon involved in these murders – projectiles (stone-tipped as well as sharpened arrows), something similar to a club, and something close to a wooden handle with hafted sharp stone blades that caused deep cuts. Two individuals have no lesions in the preserved parts of the skeleton, but the position of their hands suggests they may have been bound, including a young woman who was heavily pregnant at the time.


    Me and my colleague, Justus Edung, during the excavations. Credit: Robert Foley


    We dated the remains and the site to between 10,500 and 9,500 years ago, making them the earliest scientifically dated case of a conflict between two groups of hunter-gatherers. Stones in the weapons include obsidian, a rare stone in the Nataruk area, suggesting the attackers came from a different place.

    The (pre)history of warfare

    Today we think of warfare, or inter-group conflict, as something that happens when one group of people wants the territory, resources or power held by another. But prehistoric societies were usually small groups of nomads moving from place to place – meaning they didn’t own land or have significant possessions. They typically didn’t have strong social hierarchies either. Therefore, many scholars have argued that warfare must have emerged after farming and more complex political systems arose.


    Man with an obsidian bladelet embedded into the left side of his skull, and a projectile lesion (possibly of a sharpened arrow shaft) on the right side of the skull. Marta Mirazon Lahr


    Nataruk therefore challenges our views about what the causes of conflict are. It is possible that prehistoric human societies simply responded antagonistically to chance encounters with another group. But this is not what seems to have happened at Nataruk. The group which attacked was carrying weapons that would not normally be carried while hunting and fishing. In addition, the lesions show that clubs of at least two sizes were used, making it likely that several of the attackers carried them.

    The fact that the attack combined long-distance weapons such as arrows and close-proximity weaponry such as clubs suggests it was planned. There are also other, isolated examples of violent trauma in this area from this period – one discovered in the 1970s about 20km north of Nataruk, and two discovered by our project at a nearby site. All three involved projectiles, one of the hallmarks of inter-group conflict. The projectiles found embedded in the bones at Nataruk, as well as those in two of the other cases, were made of obsidian. This tells us that such attacks happened multiple times, and were part of the life of the hunter-gatherer communities at the time.

    So why were the people of Nataruk attacked? We have to conclude that they had valuable resources that were worth fighting for – water, meat, fish, nuts, or indeed women and children. This suggests that two of the conditions associated with warfare among settled societies – territory and resources – were probably common among these hunter-gatherers, and that we have underestimated their role so far.

    Evolution is about survival, and our species is no different from others in this respect. The injuries suffered by the people of Nataruk are merciless and shocking, but no different from those suffered in wars throughout much of our history – sadly even today. It may be human nature, but we should not forget that extraordinary acts of altruism, compassion and caring are also unique parts of who we are.

    The Conversation

    Marta Mirazon Lahr, Reader in Human Evolutionary Biology & Director of the Duckworth Collection, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

    Marta Mirazon Lahr (Leverhulme Centre for Human Evolutionary Studies) discusses the discovery, made by her and her team, of the oldest known case of violence between two groups of hunter-gatherers.

    Skull of a man with multiple lesions on the side, probably caused by a club.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    Full programme now online | Bookings open Monday 8 February

    Will artificial intelligence be superior to or as creative as the human brain? Are we letting machines take over and give rise to mass unemployment or worse? Should we be worried about quantum computing and the impact it will have on the way we work, communicate and live in the future? Or should we harness rather than hate the digital deluge?

    As we teeter on the brink of total machine dependency and epoch-making technological developments that cross all areas of our lives, Cambridge Science Festival asks these and many other critical questions.

    The annual Festival presents an impressive line-up of who’s who from the science world and beyond, including Professors Sir John Gurdon (Nobel Laureate), Sir David Spiegelhalter, Richard Gilbertson, Raymond Laflamme, Didier Queloz, Meg Urry, and Tony Purnell, Head of Technical Development for British Cycling. Other speakers include Dr Ewan Birney, Director of the European Bioinformatics Institute; Angus Thirlwell, CEO and Co-founder of Hotel Chocolat; Dr Hermann Hauser, eminent technology entrepreneur; comedian Robin Ince; Charles Simonyi, high-tech pioneer and space traveller; and writer Simon Guerrier (Dr Who).

    At the core of this year’s Festival is a series of events that explores the increasing symbiosis between humans and technology, and the questions this raises for humanity in the coming century. On the first day, a panel of outstanding speakers debate the implications of artificial intelligence. The panel consists of experts from the fields of information technology, robotics and neuroscience, including Dr Hermann Hauser, Dr Mateja Jamnik, Professor Trevor Robbins and Professor Alan Winfield. This event will be moderated by Tom Feilden, Science Correspondent for the Today Programme on BBC Radio 4.

    Organiser of the event, Professor Barbara Sahakian, University of Cambridge, said: “Artificial intelligence could be of great benefit to society, producing innovative discoveries and providing humans with more leisure time. However, workers are concerned that, more and more, jobs are being taken over by artificial intelligence. We can see this in the context of the current trend for robots to work in car factories and driverless trains, and also in the future movement towards driverless cars.

    “Some people feel this is an inevitable progression into the future due to advances in artificial intelligence, information technology and machine learning. However, others including many neuroscientists are not convinced that computers will ever be able to demonstrate creativity nor fully understand social and emotional interactions.”

    The view that machines are taking over every aspect of our lives and whether this is a positive or negative factor of modern living is further examined in the event, ‘The rise of the humans: at the intersection of society and technology’. Dave Coplin, author and Chief Envisioning Officer for Microsoft UK, discusses the future of the UK’s IT and digital industries and addresses the convergence of society and technology, focussing on the developments that are creating so many new opportunities.

    Coplin, who also has a new book coming out shortly, believes our current relationship with technology is holding us back from using it properly and we should think differently about the potential future uses for technology.

    He said: “We should harness, rather than hate, the digital deluge. Individuals and organisations need to rise up and take back control of the potential that technology offers our society. We need to understand and aspire to greater outcomes from our use of technology”.

    Building further on these issues in the second week of the Festival, Zoubin Ghahramani, Professor of Information Engineering at the University of Cambridge and the Cambridge Liaison Director of the Alan Turing Institute, explores intelligence and learning in brains and machines. He asks, what is intelligence? What is learning? Can we build computers and robots that learn? How much information does the brain store? How does mathematics help us answer these questions?

    Professor Ghahramani highlights some current areas of research at the frontiers of machine learning, including a project to develop an Automatic Statistician, and speculates on some of the future applications of computers that learn.

    For many, quantum computing is the answer to machine learning. Influential pioneer in quantum information theory and the co-founder and current director of the Institute for Quantum Computing at the University of Waterloo, Professor Raymond Laflamme presents the annual Andrew Chamblin Memorial Lecture: ‘harnessing the quantum world’. During his lecture, Professor Laflamme will share the latest breakthroughs and biggest challenges in the quest to build technologies based on quantum properties that will change the ways we work, communicate and live.

    A former PhD student of Professor Stephen Hawking, Professor Laflamme is interested in harnessing the laws of quantum mechanics to develop new technologies that will have extensive societal impact. He believes that the development of quantum computers will allow us to really understand the quantum world and explore it more deeply.

    He said: “This exploration will allow us to navigate in the quantum world, to understand chemistry and physics at the fundamental level and bring us new technologies with applications in health, such as the development of drugs, and to develop new materials with a variety of applications.

    “In the next half decade, we will produce quantum processors with more than 100 quantum bits (qubits). As we pass the count of about 30 qubits (approximately one gigabyte), classical computers can no longer compete and we fully enter the quantum world. That will be very exciting, from then on we do not have the support of classical computers to tell us if the quantum devices behave as expected so we will need to find new ways to learn the reliability of these devices. Once we have 30-50 qubits (approximately one million gigabytes), I believe that we will get an equivalent of Moore's law, but for the increased number of qubits.”

    New technologies also have a major impact on healthcare, which comes under the spotlight during the final weekend of the Festival as it returns for the third year running to the Cambridge Biomedical Campus. During the event ‘How big data analysis is changing how we understand the living world’, Dr Ewan Birney, Fellow of the Royal Society and Director of the EMBL European Bioinformatics Institute, explores the opportunities and challenges of genomics and big data in healthcare, from molecular data to high-resolution imaging.

    These kinds of technological revolutions mean biological data is being collected faster than ever. Dr Shamith Samarajiwa, from the Medical Research Council Cancer Unit, explains how analysing biomedical big data can help us understand different cancers and identify new targets for treatments in ‘Battling cancer with data science’. Meanwhile, Dr Peter Maccallum from Cancer Research UK Cambridge Institute, discusses the challenges of storing and processing the terabytes of data produced every week, more than was previously generated in a decade in the event ‘Big data from small sources: computing demands in molecular and cell biology’.

    Speaking ahead of this year’s Science Festival, Coordinator, Dr Lucinda Spokes said, “Using the theme of big data and knowledge, we are addressing important questions about the kinds of technology that affect, or will affect, not only every aspect of science, from astronomy to zoology, but every area of our lives; health, work, relationships and even what we think we know.

    “Through a vast range of debates, talks, demonstrations and performances, some of the most crucial issues of our time and uncertainties about our future as a species will be explored during these packed two weeks.”

    The full programme also includes events on neuroscience, healthcare, sports science, psychology, zoology and an adults-only hands-on session amongst many others.


    Twitter: @camscience #csf2016

    The annual two-week Festival, which runs from 7 – 20 March and stages more than 300 events, examines the growing interaction between humans and technology.

    Circuit city
    Cambridge Science Festival

    Since its launch in 1994, the Cambridge Science Festival has inspired thousands of young researchers, and visitor numbers continue to rise; last year, the Festival attracted well over 45,000 visitors. The Festival, one of the largest and most respected of its kind, brings science, technology, engineering, maths and medicine to an audience of all ages through demonstrations, talks, performances and debates. It draws together a diverse range of independent organisations in addition to many University departments, centres and museums.

    This year’s Festival sponsors and partners are Cambridge University Press, AstraZeneca, MedImmune, Illumina, TTP Group, Science AAAS, BlueBridge Education, Siemens, ARM, Microsoft Research, Redgate, Linguamatics, FameLab, Babraham Institute, Wellcome Genome Campus, Napp, The Institute of Engineering and Technology, St Mary’s School, Anglia Ruskin University, Cambridge Junction, Addenbrooke’s Hospital, Addenbrooke’s Charitable Trust, James Dyson Foundation, Naked Scientists, Hills Road Sixth Form College, UTC Cambridge, British Science Week, Alzheimer’s Research UK, Royal Society of Chemistry, Cambridge Science Centre, Cambridge Live, and BBC Cambridgeshire.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    The World Economic Forum (WEF) published its annual Global Risks Report in the run up to its annual meeting in Davos. Food and water crises, energy price shocks, biodiversity loss and ecosystem collapse, extreme weather events and failure of climate change mitigation and adaptation, it said, are the biggest threats facing society.

    Three of the top five global risks in terms of likelihood and three of the top five global risks in terms of impact have links to the environment. Of even greater concern, however, are the linkages between these systems, and the trade-offs associated with decisions in one area affecting another.

    This growing recognition of environmental risks for business, and their interconnections, reflects what is emerging as “nexus” thinking in the natural and social sciences.

    Those with long memories will recall that these issues have been high on the Davos agenda for much of the past decade and, therefore, discussed by the great and the good of corporate and political life. So why has significant business action not necessarily followed?

    Making connections

    Five years ago the WEF launched a report on the “Water-Energy-Food-Climate Change nexus”. It was a recognition that water concerns were closely linked to issues such as inequality, terrorism, famine, poverty and disease. This set the stage for business to consider a rounded approach to addressing the intimately interwoven threats from water scarcity, energy and food security and climate change. While there has been some progress, there is little evidence of a step change in attitudes and practices commensurate with the scale of the challenges.




    One reason for this inaction is what the Bank of England’s governor, Mark Carney, called the “tragedy of the horizon” in his speech to the insurance industry in September 2015.

    The impacts of many of these interconnected environmental risks fall outside the traditional decision-making horizons of most of those involved. Current decision makers have little incentive to fix the problem, even if they acknowledge and understand the risks.

    This is illustrated in the latest Global Risks Report, which highlights an alarming finding:

    … the relative absence of environmental risks and, more generally, of long-term issues among the top concerns of business leaders in their respective countries.

    Myopic visions

    Of more than 13,000 business executives in more than 140 economies whose views were sought in the WEF’s Executive Opinion Survey, none identified environmental risks among their top risks for doing business, in terms of either impact or likelihood.

    Similarly, there is a stark contrast in the report’s identification of the top five global risks of highest concern over longer and shorter time frames. The four most important risks over a ten-year period are all environment-related (water, climate change, extreme weather events and food crises), but none of these feature in the 18-month time horizon.

    Responding to potential environmental risks seems to always be just beyond the current decision horizon – important, but not requiring immediate action. We hear much about long-term planning, but it’s about time that environment risks were brought into the here and now.

    To do that we need to understand why there has been a lacklustre response from the global community. One possibility is that key people and institutions – from business, academia and politics – are not yet efficiently working together to create solutions, despite meetings such as those that are taking place this week at Davos.


    Drought is part of wider problems that affect business. Credit: EPA/Barbara Walton


    Co-creating responses, now

    The Global Risks Report highlights the need to recognise joint interests and bring people together across shared priorities, but we still lack some tangible way to bring these common agendas together.

    The time is ripe for business leaders to shape the research that will enable them to better respond to major challenges across the nexus and empower them to act sooner rather than later. Instead of a reactive stance, responding when threats become immediate and unavoidable, there is an opportunity to shift to being proactive and collaborative.

    As part of the Nexus2020 project the University of Cambridge Institute for Sustainability Leadership recently convened academics and business leaders to collectively prioritise key issues that need to be addressed. We identified how to help companies manage their dependencies and impacts upon food, energy, water and the environment.

    Those who are gathering at Davos need to seize the opportunity to overcome the tragedy of their short time horizons and work together to identify key questions and possible solutions. Otherwise, as Mark Carney has warned, by the time a problem becomes high on the agenda, it is often too late to respond. Moreover, these interconnected challenges will be harder and more costly to solve if action is delayed. The WEF presents a unique opportunity to co-create responses to the issues that are highlighted in this year’s Global Risks Report. Putting this off till the next meeting should not be an option.

    The Conversation

    Bhaskar Vira, Reader in Political Economy at the Department of Geography and Fellow of Fitzwilliam College; Director, University of Cambridge Conservation Research Institute, University of Cambridge; Gemma Cranston, Senior Programme Manager, Natural Capital Leaders Programme, Cambridge Institute for Sustainability Leadership (CISL), University of Cambridge, and Jonathan Green, Postdoctoral research associate, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    Find out more about the University of Cambridge's activities at the World Economic Forum 2016 here.

    Bhaskar Vira (Department of Geography), Gemma Cranston (Cambridge Institute for Sustainability Leadership) and Jonathan Green (Department of Geography) discuss what global powers need to do to tackle some of the biggest threats facing society.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.



    SPOILER ALERT: This article contains spoilers for early episodes of Deutschland 83

    One of the first things that happens in the first episode of Deutschland 83, the riveting German Cold War spy drama on Channel 4, is that Lenora Rauch – the brilliant, manipulative and obviously high-ranking Stasi officer stationed in West Germany – rushes home to East Berlin. After watching Reagan’s “Evil Empire” speech on West German television, she is convinced that NATO is preparing for an imminent nuclear attack. She calls her boss, switches off the telly and leaves her apartment.

    But why on earth would she take a jar of freeze-dried coffee with her? Here is a bit of realism: many consumer goods, especially “luxury” items such as coffee, chocolate, trainers and VCRs were in desperately short supply in the East, available only at huge expense in the specialised “Intershop”, “Delikat” or “Exquisit” outlets.



    This control of supply in turn reflected another desperate shortage: hard currency. The GDR, while well off in the context of the Eastern bloc, was perennially strapped for convertible (Western) currency and resorted to all manner of tricks and blackmail to get its hands on it.

    West Germans could purchase gifts, for example, from cassette recorders to prefab houses, for relatives in the East or friends using the Genex catalogue, a state monopoly operating through a Swiss intermediary. In Deutschland 83, it is poignant that Lenora takes the precious coffee not to her boss (as per usual), but to her sister. She needs it to soften the blow of poaching her border guard son Martin for a crucial Stasi mission in the West.

    The theme of consumer goods shortage is carried through consistently, with an eye for historical detail and a tongue in cheek. Martin Rauch infiltrates the FRG as an army officer called Moritz Stamm and is overwhelmed by the choice of fresh produce in a West German supermarket and nonplussed in an upmarket restaurant when the waitress asks him what kind of steak he would like – “From the cow”, he replies. So full points for realism on consumer goods and their potential leverage.

    Less realistic is the scene in which Martin, working as a border guard prior to his recruitment by the Stasi, accosts two would-be smugglers. Together with a colleague, he taunts them for their individualism, greed, and naivety: did they really think they’d get away with it? So far, so plausible – but then he lets them go, squirrels away the contraband, winks at his colleague and they share a laugh.

    In reality, a border guard conscript such as Martin would scarcely have taken such risks. Who is to say that his colleague, for all the playful elbow-jabbing, is not reporting back to their superior, or worse, the Stasi? In a country of 16m people there were over 90,000 permanent Ministry of State Security employees (admittedly including spies in foreign lands, cleaners, clerical staff etc – but still a large number) and a staggering 180,000 Inoffizielle Mitarbeiter – unofficial collaborators or informants, such as the unfortunate actress wife of the dissident author in Das Leben der Anderen (The Lives of Others).

    In any case, the playfulness, complicity and laxity of the two border guards in letting off the two “parasites” is deeply misleading. In reality the smugglers would probably have faced prison, possibly re-education and certainly official ostracism – even though all they carried was an edition of Shakespeare and one of Marx.

    Trabants and tatty clothes

    In terms of sets and props, production values trump the likely reality of the clothes, fittings, dwellings and so on. Sure, the shape of the beer bottles, the cut of clothes and the penchant for skinny-dipping are all well researched, but in the TV series, it all looks rather stylish. Clothes fit and interiors are well put together – if occasionally odd or austere. The GDR, however, was a place where everything from nails to shirts, paint to nappies, curtains to cars could be hard to come by.


    There was an 18-year waiting list for a Trabant in East Germany. Credit: Fsopolonezcaro via Wikimedia Commons, CC BY


    The waiting list for a Trabant, a two-stroke, minuscule car produced virtually unchanged from the 1950s to the 1980s, was 18 years; the price prohibitive. People wore horrible spectacles (there were only a handful of models to choose from), ill-fitting clothes – and many appeared in Christmas and wedding photographs wearing the same outfit year after year.

    But if the GDR of Deutschland 83 appears less run-down and shabby than it really was, the same is true of Mad Men’s 1960s Manhattan – and if the Elastoplast glossiness makes us more likely to take in this engrossing Cold War spyfest, who cares? For while the plot takes some liberties, it faithfully sticks to the overall facts, the poetic truth, of GDR life.

    That Martin has no idea of his father’s identity or whereabouts is realistic – single parenthood was normal, carried no stigma and could count on comparatively generous state support. That Martin would know the score of the West German football cup final is also realistic – most GDR citizens had access to West German TV and watched it despite official strictures not to do so. The protocol of a regional committee of the ruling Socialist Unity Party that we teach as part of our course on the reconstruction of Germany notes that moving the regular meeting slot is all but inevitable, as otherwise members would leave “in order to catch the seven o’clock news on West German television”.

    Welcome to Stasiland

    Other examples of realism abound. That advanced medication like the immunosuppressant crucial to Martin’s mother’s kidney transplant would be hard to come by (partly because its purchase diminished hard currency reserves)? Realistic. That such treatment was de facto the privilege of the elites in the “state of the workers and peasants”? Realistic. That, partly as a consequence of such perks, the Stasi attracted some of the best and brightest? Realistic. Stasi officers such as Lenora or her boss, the awesomely named Schweppenstette, could well have been razor sharp, flexible, if necessary charming and generally very good at their jobs.


    Young East German border guard Martin Rauch is coerced into spying for the Stasi. Credit: Channel 4


    That they rode roughshod over the private lives of citizens? Realistic, too. In an understated but chilling scene, Lenora, Schweppenstette and a sidekick come to the Rauch family home to interview and recruit Martin. They don’t so much as knock. Martin is not awake, but summoned to the kitchen in his pyjamas. The nonchalance of this invasion and its air of brusque, unquestioning and unquestioned power conveys a GDR reality, just as the fact that Lenora will exploit family ties for her political ends does. Of course the TV series presents a condensed, dramatised account. But ideology did trump family loyalty, in official policy and often enough in the reality of GDR citizens.

    Fond memories

    It is less paradoxical than it may seem that, nonetheless, most people were happy in the GDR, most of the time. This, too, Deutschland 83 gets right. It was not just the stability and social security, nor the fact that the state provided public goods free of charge that inspired such identification and for some even patriotism.

    As long as you heeded the rules, which for the majority of Germans under Communist rule meant no conscious blinkering, more a routine of moving within the parameters set by the state, there was no particular reason not to be happy. In the GDR, people made friends, fell in love, argued with their parents, fretted about wedding arrangements, moved to a different city to study, had favourite movies and songs.

    That much of this normalcy was questioned and to an extent invalidated by reunification inspired a good deal of the longing that Germans call “Ostalgie”. And that Westerners frowned upon such regrets – “But … it was a dictatorship!” – only cemented it. Ossis didn’t hanker after the Stasi – how could they? – but they did mourn the loss of the everyday GDR that framed their lives and which reunification had swept away, from the layout of traffic signs to the old brands of chocolate and gherkin.

    Even after the decommissioning, to all intents and purposes, of the Sonderweg thesis, the Third Reich remains the reference point of German history, implicitly or explicitly. While that is warranted, it has some problematic side effects, not least that in comparison to the Holocaust and Hitler, almost any other kind of state crime looks relatively benign and explicable.

    It is here that Deutschland 83 seems to me particularly successful: it is even-handed and almost sympathetic in suggesting that the GDR’s intrusive and cynical policies, vis-à-vis the West and its own citizens, were motivated by the perceived threat of nuclear annihilation.

    At the same time, it makes crystal clear that the operation of a successful secret intelligence network of the kind that places and directs Martin Rauch depends on exploiting precisely the kind of liberties and legal safeguards that the GDR denied its own citizens, or routinely flouted. There are many lessons in that, not least concerning our own attitudes to the reach of the security organisations we rely on, but whose remit we should not fail to monitor.

    The Conversation

    Henning Grunwald, Lecturer in History, University of Cambridge

    This article was originally published on The Conversation. Read the original article.

    The opinions expressed in this article are those of the individual author(s) and do not represent the views of the University of Cambridge.

    Henning Grunwald (Faculty of History) discusses how accurate the representation of life in Cold War era East Germany is in Channel 4 drama Deutschland 83.

    The Wall behind the Reichstag, (East) Berlin, Germany (1989/312)

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

