
    When the first galaxies started to form a few hundred million years after the Big Bang, the Universe was full of a fog of hydrogen gas. But as more and more brilliant sources — both stars and quasars powered by huge black holes — started to shine they cleared away the mist and made the Universe transparent to ultraviolet light. Astronomers call this the epoch of reionisation, but little is known about these first galaxies, and up to now they have just been seen as very faint blobs. But now new observations using the Atacama Large Millimetre/submillimetre Array (ALMA) are starting to change this.

    A team of astronomers led by Roberto Maiolino from the University’s Cavendish Laboratory and Kavli Institute for Cosmology trained ALMA on galaxies that were known to be seen only about 800 million years after the Big Bang. The astronomers were not looking for the light from stars, but instead for the faint glow of ionised carbon coming from the clouds of gas from which the stars were forming. They wanted to study the interaction between a young generation of stars and the cold clumps that were assembling into these first galaxies.

    They were also not looking for the extremely brilliant rare objects — such as quasars and galaxies with very high rates of star formation — that had been seen up to now. Instead they concentrated on rather less dramatic, but much more common, galaxies that reionised the Universe and went on to turn into the bulk of the galaxies that we see around us now.

    From one of the galaxies — given the label BDF 3299 — ALMA could pick up a faint but clear signal from the glowing carbon. However, this glow wasn’t coming from the centre of the galaxy, but rather from one side.

    “These observations enable an unprecedented understanding of the assembly process of the first galaxies formed in the Universe – for the first time we can observe and disentangle the different components contributing to the earliest phases of galaxy formation,” said Maiolino. “These observations have enabled us to test with unprecedented detail theories of galaxy formation in the early Universe.”

    The astronomers think that the off-centre location of the glow is because the central clouds are being disrupted by the harsh environment created by the newly formed stars — both their intense radiation and the effects of supernova explosions — while the carbon glow is tracing fresh cold gas that is being accreted from the intergalactic medium.

    By combining the new ALMA observations with computer simulations, it has been possible to understand in detail key processes occurring within the first galaxies. The effects of the radiation from stars, the survival of molecular clouds, the escape of ionising radiation and the complex structure of the interstellar medium can now be calculated and compared with observation. BDF 3299 is likely to be a typical example of the galaxies responsible for reionisation.

    “We have been trying to understand the interstellar medium and the formation of the reionisation sources for many years. Finally to be able to test predictions and hypotheses on real data from ALMA is an exciting moment and opens up a new set of questions. This type of observation will clarify many of the thorny problems we have with the formation of the first stars and galaxies in the Universe,” said co-author Andrea Ferrara, from the Scuola Normale Superiore in Pisa, Italy.

    “This study would have simply been impossible without ALMA, as no other instrument could reach the sensitivity and spatial resolution required,” said Maiolino. “Although this is one of the deepest ALMA observations so far it is still far from achieving its ultimate capabilities. In future ALMA will image the fine structure of primordial galaxies and trace in detail the build-up of the very first galaxies.”

    The results are reported in the journal Monthly Notices of the Royal Astronomical Society.

    Reference:
    R. Maiolino et al., “The assembly of ‘normal’ galaxies at z∼7 probed by ALMA,” Monthly Notices of the Royal Astronomical Society (2015).

    Adapted from an ESO press release

    An international team of astronomers led by the University of Cambridge have detected the most distant clouds of star-forming gas yet found in normal galaxies in the early Universe – less than one billion years after the Big Bang. The new observations will allow astronomers to start to see how the first galaxies were built up and how they cleared the cosmic fog during the era of reionisation. This is the first time that such galaxies have been seen as more than just faint blobs.

    For the first time we can observe and disentangle the different components contributing to the earliest phases of galaxy formation
    Roberto Maiolino
    The central object is a very distant galaxy, labelled BDF 3299. The bright red cloud just to the lower left is the ALMA detection of a vast cloud of material that is in the process of assembling the very young galaxy

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    The treasures of Cambridge University Library’s Chinese collections are the latest addition to the Digital Library website (http://cudl.lib.cam.ac.uk/collections/chinese) which already hosts the works of Charles Darwin, Isaac Newton and Siegfried Sassoon, as well as unique collections on the Board of Longitude and the Royal Commonwealth Society.

    The oracle bones (ox shoulder blades and turtle shells) are one of the Library’s most important collections and are the earliest surviving examples of Chinese writing anywhere in the world. They are the oldest form of documents owned by the Library and record questions to which answers were sought by divination at the court of the royal house of Shang, which ruled central China between the 16th and 11th centuries BCE. (http://bit.ly/1RJkZEG).

    As the earliest known specimens of the Chinese script, the oracle bone inscriptions are of fundamental importance for Chinese palaeography and our understanding of ancient Chinese society. The bones record information on a wide range of matters including warfare, agriculture, hunting and medical problems, as well as genealogical, meteorological and astronomical data, such as the earliest records of eclipses and comets.

    Never before displayed, three of the 800 oracle bones held in the Library can now be viewed in exquisite detail, alongside a 17th-century book which has been described as 'perhaps the most beautiful set of prints ever made' (http://bit.ly/1fMfAf3). Estimated to be worth millions on the open market, the ‘Manual of Calligraphy and Painting’ was made in 1633 by the Ten Bamboo Studio in Nanjing.

    Charles Aylmer, Head of the Chinese Department at Cambridge University Library, said: “This is the earliest and finest example of multi-colour printing anywhere in the world, comprising 138 paintings and sketches with associated texts by fifty different artists and calligraphers. Although reprinted many times, complete sets of early editions in the original binding are extremely rare.

    “The binding is so fragile, and the manual so delicate, that until it was digitized, we have never been able to let anyone look through it or study it – despite its undoubted importance to scholars.”

    Other highlights of the digitisation include one of the world’s earliest printed books (http://bit.ly/1HRsK0k), a Buddhist text dated between 1127 and 1175. The translator (Xuanzang) was famed for the 17-year pilgrimage to India he undertook to collect religious texts and bring them back to China.

    ‘The Manual of Famine Relief’ has also been digitised. This 19th-century manuscript contains instructions for the distribution of emergency rations to famine victims and includes practical advice about foraging for natural substitutes to normal foodstuffs in the event of an emergency.

    Elsewhere, a 14th-century banknote (http://bit.ly/1O8QJwB) is one of the more unusual additions to the Chinese Collections. Paper currency first appeared in China during the 7th century, and was in wide circulation by the 11th century, 500 years before its first use in Europe.

    By the 12th century the central government had realised the benefits of banknotes for purposes of tax collection and financial administration, and by the late 13th century had printed and issued a national paper currency – accounts of it reached Europe through the writings of Marco Polo and others.

    The Library’s banknote, printed on mulberry paper from a cast metal plate, was first issued in 1380. The denomination of the banknote (one thousand cash) is shown by a picture of ten strings of copper cash (10 x 100 = 1000), flanked by a text in seal script which reads: 'Great Ming Paper Currency; Circulating Throughout the World'. The text underneath threatens forgers with decapitation and promises that anyone denouncing or apprehending them will receive not only a reward of 25 ounces of silver but also all the miscreant’s property.

    Huw Jones, part of the digitisation team at Cambridge University Library, said: “The very high quality of the digital images has already led to important discoveries about the material – we have seen where red pigment was used to colour inscriptions on the oracle bones, and seals formerly invisible have been deciphered on several items. We look forward to new insights now that the collection has a truly global audience, and we are already working with an ornithological expert to identify the birds in the Manual of Calligraphy and Painting.”

    Cambridge University Library acquired its first Chinese book in 1632 as part of the collection of the Duke of Buckingham, but the first substantial holdings of Chinese books came with the donation of 4,304 volumes by Sir Thomas Wade (1818–1895), first Professor of Chinese in the University from 1888 until his death.

    The Chinese collections at Cambridge University Library now number about half a million individual titles, including monographs, reprinted materials, archival documents, epigraphical rubbings and 200,000 Chinese e-books (donated by Premier Wen Jiabao in 2009).

    A banknote from 1380 that threatens decapitation, a set of 17th-century prints so delicate they had never been opened, and 3000-year-old ‘oracle bones’ are now freely available for the world to view on the Cambridge Digital Library.

    This is the earliest and finest example of multi-colour printing anywhere in the world.
    Charles Aylmer
    Estimated to be worth millions on the open market, the ‘Manual of Calligraphy and Painting’ was made in 1633 by the Ten Bamboo Studio in Nanjing.


    In a study published today in the journal PLOS ONE, a team of psychologists show that your thinking style – whether you are an ‘empathizer’ who likes to focus on and respond to the emotions of others, or a ‘systemizer’ who likes to analyse rules and patterns in the world – is a predictor of the type of music you like.

    Music is a prominent feature of everyday life, accompanying us nearly everywhere we go. It’s easy for us to know what types of music we like and don’t like. When shuffling songs on an iPod, it takes us only a few seconds to decide whether to listen or skip to the next track. However, little is known about what determines our taste in music.

    Researchers over the past decade have argued that musical preferences reflect explicit characteristics such as age and personality. For example, people who are open to new experiences tend to prefer music from the blues, jazz, classical, and folk genres, and people who are extraverted and ‘agreeable’ tend to prefer music from the pop, soundtrack, religious, soul, funk, electronic, and dance genres.

    Now a team of scientists, led by PhD student David Greenberg, has looked at how our ‘cognitive style’ influences our musical choices. This is measured by looking at whether an individual scores highly on ‘empathy’ (our ability to recognize and react to the thoughts and feelings of others) or on ‘systemizing’ (our interest in understanding the rules underpinning systems such as the weather, music, or car engines) – or whether we have a balance of both.

    “Although people’s music choices fluctuate over time, we’ve discovered that a person’s empathy levels and thinking style predict what kind of music they like,” said David Greenberg from the Department of Psychology. “In fact, their cognitive style – whether they’re strong on empathy or strong on systems – can be a better predictor of what music they like than their personality.”

    The researchers conducted multiple studies with over 4,000 participants, who were recruited mainly through the myPersonality Facebook app. The app asked Facebook users to take a selection of psychology-based questionnaires, the results of which they could place on their profiles for other users to see. At a later date, they were asked to listen to and rate 50 musical pieces. The researchers used library examples of musical stimuli from 26 genres and subgenres, to minimise the chances that participants would have any personal or cultural association with the piece of music.

    People who scored high on empathy tended to prefer mellow music (from R&B, soft rock, and adult contemporary genres), unpretentious music (from country, folk, and singer/songwriter genres) and contemporary music (from electronica, Latin, acid jazz, and Euro pop). They disliked intense music, such as punk and heavy metal. In contrast, people who scored high on systemizing favoured intense music, but disliked mellow and unpretentious musical styles.

    The results proved consistent even within specified genres: empathizers preferred mellow, unpretentious jazz, while systemizers preferred intense, sophisticated (complex and avant-garde) jazz.

    The researchers then looked more in-depth and found those who scored high on empathy preferred music that had low energy (gentle, reflective, sensual, and warm elements), or negative emotions (sad and depressing characteristics), or emotional depth (poetic, relaxing, and thoughtful features). Those who scored high on systemizing preferred music that had high energy (strong, tense, and thrilling elements), or positive emotions (animated and fun features), and which also featured a high degree of cerebral depth and complexity.

    David Greenberg, a trained jazz saxophonist, says the research could have implications for the music industry. “A lot of money is put into algorithms to choose what music you may want to listen to, for example on Spotify and Apple Music. By knowing an individual’s thinking style, such services might in future be able to fine tune their music recommendations to an individual.”
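    As a loose illustration of how a recommendation service might use such scores, the rule below maps a standardized empathy–systemizing difference onto the broad preference classes the study reports. The threshold and the mapping are assumptions made for illustration, not the authors’ published model:

```python
def predict_preference(empathy_z, systemizing_z, threshold=0.5):
    """Illustrative rule based on the study's reported associations.

    empathy_z, systemizing_z: standardized questionnaire scores.
    Returns the broad musical attribute class the study links to each
    cognitive style. The 0.5 threshold and the exact mapping are
    assumptions, not parameters from the paper.
    """
    d = empathy_z - systemizing_z  # positive = empathizer-leaning
    if d > threshold:
        return "mellow/unpretentious (low energy, emotional depth)"
    if d < -threshold:
        return "intense (high energy, cerebral complexity)"
    return "no strong prediction (balanced cognitive style)"
```

    A real service would of course fit such a mapping to listening data rather than hand-pick thresholds; the sketch only shows where a cognitive-style feature could enter the pipeline.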

    Dr Jason Rentfrow, the senior author on the study says: “This line of research highlights how music is a mirror of the self. Music is an expression of who we are emotionally, socially, and cognitively.”

    Professor Simon Baron-Cohen, a member of the team, added: “This new study is a fascinating extension to the ‘empathizing-systemizing’ theory of psychological individual differences. It took a talented PhD student and musician to even think to pose this question. The research may help us understand those at the extremes, such as people with autism, who are strong systemizers.”

    Based on their findings, the following are songs that the researchers believe are likely to fit particular styles:

    High on empathy

    • Hallelujah – Jeff Buckley
    • Come away with me – Norah Jones
    • All of me – Billie Holiday
    • Crazy little thing called love – Queen

    High on systemizing

    • Concerto in C – Antonio Vivaldi
    • Etude Opus 65 No 3 – Alexander Scriabin
    • God save the Queen – The Sex Pistols
    • Enter Sandman – Metallica

     

    David Greenberg was funded by the Cambridge Commonwealth, European and International Trust and the Autism Research Trust during the period of this work.

    Reference
    Greenberg, DM, Baron-Cohen, S, Stillwell, DJ, Kosinski, M, & Rentfrow, PJ. Musical preferences are linked to cognitive styles. PLOS ONE; 22 July 2015

    Do you like your jazz to be Norah Jones or Ornette Coleman, your classical music to be Bach or Stravinsky, or your rock to be Coldplay or Slayer? The answer could give an insight into the way you think, say researchers from the University of Cambridge.

    Although people’s music choices fluctuate over time, we’ve discovered a person’s empathy levels and thinking style predict what kind of music they like
    David Greenberg
    Death Angel (cropped)


    Professor Chinnery is currently Professor of Neurogenetics at Newcastle University, a post he has held since 2004. He is a Wellcome Trust Senior Fellow in Clinical Science and NIHR Senior Investigator, and was elected a Fellow of the Academy of Medical Sciences in 2009. He has been Director of the Newcastle NIHR Biomedical Research Centre since 2008 and was appointed Director of the Institute of Genetic Medicine in Newcastle in 2010.

    His principal research interest is in understanding the role of mitochondria in human disease, and developing new treatments for mitochondrial disorders. His laboratory will also be moving to Cambridge, to join the MRC Mitochondrial Biology Unit.

    “I am honoured to have been elected Professor of Neurology at the University of Cambridge, and excited by the prospect of moving my laboratory to the MRC Mitochondrial Biology Unit on the Cambridge Biomedical Campus,” says Professor Chinnery. “I look forward to developing new research collaborations across the University from my base in the Department of Clinical Neurosciences.”

    His appointment follows the forthcoming retirement of Professor Alastair Compston as Professor of Neurology and Head of the Department of Clinical Neurosciences. Professor Compston was instrumental in the development of the drug Lemtrada, which last year received approval by the National Institute for Health and Care Excellence (NICE) for use in people with relapsing-remitting multiple sclerosis.

    Professor Patrick Maxwell, Regius Professor of Medicine at the University of Cambridge, adds: “I am delighted that Patrick will be joining us later this year. He has built a strong reputation as an expert in mitochondrial diseases and for his leadership of the Newcastle NIHR Biomedical Research Centre. His experience and expertise will be invaluable to Cambridge, and I am sure will provide superb leadership across Clinical Neuroscience.

    “At the same time, we are sorry to be saying goodbye to Alastair Compston. It is no exaggeration to say that his work on the development of Lemtrada has revolutionised the lives of people living with multiple sclerosis. Under his leadership, Clinical Neurosciences has flourished in an extraordinary way and the School is extremely grateful to him.”

    Professor Patrick Chinnery, an expert in diseases that affect mitochondria – the ‘batteries’ that power our cells – has been appointed as Professor of Neurology and Head of the Department of Clinical Neurosciences at the University of Cambridge. He will take up his appointment on 1 October.

    Patrick Chinnery


    Highly-social zebra finches learn foraging skills from their parents. However, new research has found that when juvenile finches are exposed to elevated stress hormones just after hatching, they will later switch strategies and learn only from unrelated adult birds – ignoring their parents’ way of doing things and instead gaining foraging skills from the wider network of other adult finches.    

    Researchers say that spikes in stress during early development may act as a cue that their parents are doing something wrong, triggering the young birds to switch their social learning strategy and disregard parental approaches in favour of acquiring skills exclusively from other birds in the flock.

    This stress cue and subsequent behavioural change would then allow the juveniles to bypass a “potentially maladaptive source of information” – possibly the result of low-quality parental investment or food scarcity at birth – and consequently avoid a “bad start in life”, say the researchers.

    The changes this stress could create in the patterns of individuals' social interactions may impact important population-wide processes, such as migration efficiency and the establishment of animal culture, they say. The new study is published today in the journal Current Biology.

    “These results support the theory that developmental stress may be used as an informative cue about an individual’s environment. If so, it may enable juveniles to avoid becoming trapped in a negative feedback loop provided by a bad start in life – by programming them to adopt alternative, and potentially more adaptive, behaviours that change their developmental trajectories,” said Dr Neeltje Boogert, from Cambridge University’s Department of Zoology, who authored the study with colleagues from the universities of Oxford and St Andrews.

    For the study, the research team took 13 broods of zebra finch hatchlings and fed half of the chicks in each brood with physiologically relevant levels of the stress hormone corticosterone dissolved in peanut oil, and the other half – their control siblings – with just plain peanut oil. The chicks were treated each day for 16 days from the age of 12 days.

    Once the chicks reached nutritional independence, they were released with their families into one of two free-flying aviaries, where researchers tracked their social foraging networks using radio tags called PIT tags (Passive Integrated Transponder), about the size of a grain of rice. Each bird's unique PIT tag was scanned when a bird visited a feeder, allowing the researchers to track exactly who was foraging where, when and with whom.

    Using these feeder visit data, the researchers were able to build finch social foraging networks, as the thirteen zebra finch families in the two aviaries foraged and interacted over the course of 40 days.
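    The construction of such a network from feeder-visit records can be sketched as follows. The record format and the 60-second association window are illustrative assumptions, not details from the study:

```python
from collections import defaultdict
from itertools import combinations

def build_foraging_network(scans, window=60):
    """Build a weighted co-foraging network from PIT-tag scan records.

    scans: iterable of (bird_id, feeder_id, timestamp_in_seconds).
    Two birds are linked each time their scans at the same feeder fall
    within `window` seconds of each other; edge weights count how often
    each pair foraged together.
    """
    # Group scans by feeder, since only same-feeder visits can co-occur.
    by_feeder = defaultdict(list)
    for bird, feeder, t in scans:
        by_feeder[feeder].append((t, bird))

    edges = defaultdict(int)
    for visits in by_feeder.values():
        visits.sort()  # order the scans in time at each feeder
        for (t1, b1), (t2, b2) in combinations(visits, 2):
            if b1 != b2 and abs(t2 - t1) <= window:
                edges[tuple(sorted((b1, b2)))] += 1
    return dict(edges)
```

    On a toy log where birds A and B visit two feeders within seconds of each other while bird C forages alone, the A–B edge accumulates weight 2 and C remains unconnected; measures such as how choosy a bird is about its foraging partners can then be read off the resulting weighted graph.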

    They found that the juveniles administered with the stress hormone were less likely to spend time with their parents, spent more time with other unrelated birds and were far less choosy about which birds they foraged with; whereas the control group stuck more closely to their parents, and foraged more consistently with the same flock mates.

    To test whether these stress-hormone induced differences in social network positions affected who learned from whom, Boogert devised a food puzzle for the birds, and recorded exactly when each bird started solving it.

    In the new test, the birds had to learn to flip the lids from the top of a grid of holes to reach the food reward of spinach underneath. All other feeders were removed from the aviaries, and the researchers filmed a series of nine one-hour trials over three days, monitoring and scoring how each bird learned to get to the bait.

    They found that, while the control group of juvenile finches did also learn from some unrelated adults, they mostly copied their parents to find out how to get the spinach. In sharp contrast, the developmentally-stressed chicks exclusively copied unrelated adults instead – not one looked to a parent to figure out the key to the spinach puzzle.

    In fact, the stressed juveniles actually solved the task sooner than their control siblings, despite not using parents as role models to focus on. Boogert says this may be because they relied more on trial-and-error learning, or that they simply had access to the information sooner because they copied a large number of unrelated adult finches rather than just one of their two parents.   

    "If developmentally stressed birds occupy more central network positions and follow many others around, this might make them especially efficient spreaders of disease, as stressed individuals are also likely to have weakened immune systems," said Boogert.

    "The next step is to explore the implications of our results for important population-level processes, such as the spread of avian pox or flu."

    Inset image: Zebra finches in the ‘food puzzle’ experiment. Credit: Dr Neeltje Boogert 

    Juvenile zebra finches that experience high stress levels will ignore how their own parents forage and instead learn such skills from other, unrelated adults. This may help young birds avoid inheriting a poor skillset from parents – the likely natural cause of their stress – and becoming trapped by a “bad start in life”.

    Developmental stress may be used as an informative cue about an individual’s environment. If so, it may enable juveniles to avoid becoming trapped in a negative feedback loop
    Neeltje Boogert
    Zebra Finches


    For an insect, grooming is a serious business. If the incredibly sensitive hairs on their antennae get too dirty, they are unable to smell food, follow pheromone trails or communicate. So insects spend a significant proportion of their time just keeping themselves clean. Until now, however, no-one has really investigated the mechanics of how they actually go about this.

    In a study published in Open Science, Alexander Hackmann and colleagues from the Department of Zoology have undertaken the first biomechanical investigation of how ants use different types of hairs in their cleaning apparatus to clear away dirt from their antennae.

    “Insects have developed ingenious ways of cleaning very small, sensitive structures, so finding out exactly how they work could have fascinating applications for nanotechnology – where contamination of small things, especially electronic devices, is a big problem. Different insects have all kinds of different cleaning devices, but no-one has really looked at their mechanical function in detail before,” explains Hackmann.

    Camponotus rufifemur ants possess a specialised cleaning structure on their front legs that is actively used to groom their antennae. A notch and spur covered in different types of hairs form a cleaning device similar in shape to a tiny lobster claw. During a cleaning movement, the antenna is pulled through the device which clears away dirt particles using ‘bristles’, a ‘comb’ and a ‘brush’.

    To investigate how the different hairs work, Hackmann painstakingly constructed an experimental mechanism to mimic the ant’s movements and pull antennae through the cleaning structure under a powerful microscope. This allowed him to film the process in extreme close up and to measure the cleaning efficiency of the hairs using fluorescent particles.

    What he discovered was that the three clusters of hairs perform a different function in the cleaning process. The dirty antenna surface first comes into contact with the ‘bristles’ (shown in the image in red) which scratch away the largest particles. It is then drawn past the ‘comb’ (shown in the image in blue) which removes smaller particles that get trapped between the comb hairs. Finally, it is drawn through the ‘brush’ (shown in the image in green) which removes the smallest particles.

    “While the ‘bristles’ and the ‘comb’ scrape off larger particles mechanically, the ‘brush’ seems to attract smaller dirt particles from the antenna by adhesion,” says Hackmann, who works in the laboratory of Dr Walter Federle.

    Where the ‘bristles’ and ‘comb’ are rounded and fairly rigid, the ‘brush’ hairs are flat, bendy and covered in ridges – this increases the surface area for contact with the dirt particles, which stick to the hairs. Researchers do not yet know what makes the ‘brush’ hairs sticky – whether it is due to electrostatic forces, sticky secretions, or a combination of factors.

    “The arrangement of ‘bristles’, ‘combs’ and ‘brush’ lets the cleaning structure work as a particle filter that can clean different sized dirt particles with a single cleaning stroke,” says Hackmann. “Modern nanofabrication techniques face similar problems with surface contamination, and as a result the fabrication of micron-scale devices requires very expensive cleanroom technology. We hope that understanding the biological system will lead to building bioinspired devices for cleaning on micro and nano scales.”
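    The size-cascade Hackmann describes can be illustrated with a toy model: each stage removes the particles too large to pass it, and the adhesive brush collects what remains. The gap sizes and the clean three-way split are assumptions for illustration, not measured values from the study:

```python
def clean_antenna(particles, bristle_gap=10.0, comb_gap=2.0):
    """Toy model of the ant's three-stage antenna-cleaning cascade.

    particles: dirt-particle diameters in arbitrary units. The
    'bristles' scrape off particles wider than bristle_gap, the 'comb'
    traps those still wider than comb_gap, and the adhesive 'brush'
    collects the rest -- so a single stroke removes all three size
    classes, like a staged particle filter.
    """
    bristles = [p for p in particles if p > bristle_gap]
    smaller = [p for p in particles if p <= bristle_gap]
    comb = [p for p in smaller if p > comb_gap]
    brush = [p for p in smaller if p <= comb_gap]
    return {"bristles": bristles, "comb": comb, "brush": brush}
```

    Running the model on a mixed batch of large, medium and small particles assigns each particle to exactly one stage, which is the property that makes a single cleaning stroke sufficient.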

    Dr Federle’s laboratory and, in part, this project receive financial support from the Biotechnology and Biological Sciences Research Council (BBSRC).

    Inset images: Scanning electron micrograph of the antenna clamped by the cleaner (Alexander Hackmann); Scanning electron micrograph of the tarsal notch (Alexander Hackmann).

    Reference:

    Alexander Hackmann, Henry Delacave, Adam Robinson, David Labonte, Walter Federle. Functional morphology and efficiency of the antenna cleaner in Camponotus rufifemur ants. Open Science; 22 July 2015.

    Using unique mechanical experiments and close-up video, Cambridge researchers have shown how ants use microscopic ‘combs’ and ‘brushes’ to keep their antennae clean, which could have applications for developing cleaners for nanotechnology.

    Insects have developed ingenious ways of cleaning very small, sensitive structures, which could have fascinating applications for nanotechnology – where contamination of small things is a big problem
    Alexander Hackmann
    Scanning electron micrograph of the tarsal notch


    The Therapeutics Consortium, announced today, will connect the intellectual know-how of several large academic institutions with the drug-developing potential of the pharmaceutical industry, to deliver better drugs to the clinic.

    From early 2018, the Consortium will form a major constituent of the new Milner Therapeutics Institute, which has been made possible through a £5 million donation from Jonathan Milner and will be located in a new building at the Cambridge Biomedical Campus, the centrepiece of the largest biotech cluster outside the United States.

    The Consortium will connect academic and clinical researchers at the University of Cambridge, the Babraham Institute and the Wellcome Trust Sanger Institute with pharmaceutical companies Astex Pharmaceuticals, AstraZeneca and GlaxoSmithKline (GSK). It will provide researchers with the potential to access novel therapeutic agents (including small molecules and antibodies) across the entire portfolio of drugs being developed by each of the companies, in order to investigate their mechanism, efficacy and potential. The terms of the Consortium allow for fast and easy access to these agents and information.

    Each industry partner within the Therapeutics Consortium has committed funding to spend on collaborative projects and will collectively fund an executive manager to oversee the academic/industry interactions. Collaborative projects are expected to lead to joint publications, supporting a culture of more open innovation.

    Professor Tony Kouzarides from the University of Cambridge, who will head the Therapeutics Consortium and the Milner Institute, is currently deputy director at the Gurdon Institute. He says: “The Milner Institute will act as a ‘match-making’ service through the Therapeutics Consortium, connecting the world-leading research potential of the University of Cambridge and partner institutions with the drug development expertise and resources of the pharmaceutical industry. We hope many more pharmaceutical companies will join our consortium and believe this form of partnership is a model for how academic institutions and industry can work together to deliver better medicines.”

    Dr Harren Jhoti, President and CEO of Cambridge-based company Astex Pharmaceuticals, now part of Japan’s Otsuka Group, said: “As a company that was founded right here in Cambridge we are delighted to support this new Consortium working together with leading Cambridge academic and clinical researchers to help us to research and develop ever better treatments for patients.”

    Mene Pangalos, Executive Vice President, Innovative Medicines & Early Development at AstraZeneca said: “We are pleased to be part of this exciting new consortium that brings together world-leading science and technology into a dedicated multi-disciplinary institute focused on translational research.  The proximity of the Institute to our new R&D centre and global headquarters in Cambridge will ensure our scientists can work closely with those at the Milner Institute.”

    Professor Michael Wakelam, Director of the Babraham Institute, said: “The Institute’s participation in the Therapeutics Consortium provides yet one more channel by which our excellence in basic biological research is built upon in partnership with industry-based collaborators. We know from experience that bringing together the best academics and the best pharmacological research is both efficient and enlightening and we look forward to making joint progress.”

    Dr Rab Prinjha, Head of GSK’s Epigenetics Discovery Performance Unit, said: “Late-stage attrition is too high – very few investigational medicines entering human trials eventually become an approved treatment.  As an industry, we must improve our success rate by understanding our molecules and targets better.  This innovative institute which builds on GSK’s very successful collaboration with the Gurdon Institute and close links with many groups across Cambridge, aims to increase our knowledge of basic biological mechanisms to help us bring the right investigational medicines into human trials and ultimately to patients.”

    The Consortium will initially operate from the Wellcome Trust/Cancer Research UK Gurdon Institute, but will move into the Milner Institute in early 2018.

    The Milner Therapeutics Institute

    One of the major aims of the Institute will be to help understand how drugs work and to push forward new ideas and technologies to improve the development of novel therapies. A major, but not exclusive, focus of the Institute will be cancer.

    It is envisaged that the Milner Institute will be equipped with core facilities, such as high-throughput screening of small molecules against cell lines, organoids (‘mini organs’) and tumour biopsies, as well as bioinformatics support to help scientists deal with large datasets. Its facilities will be available to researchers working on collaborative projects within the Therapeutics Consortium and, capacity permitting, to other scientists and clinicians within the Cambridge community.

    In addition, the Milner Institute will have space for senior and junior scientists to set up independent research groups. There will also be associated faculty positions, which will be taken up by scientists in different departments, whose research and expertise will benefit from a close association with the Milner Institute.

    The Milner Institute will be housed within the new Capella building, alongside the relocated Wellcome Trust/MRC Cambridge Stem Cell Institute, a new Centre for Blood & Leukaemia Research, and a new Centre for Immunology & Immunotherapeutics.

    Jonathan Milner, whose donation has made the Milner Therapeutics Institute possible, is a former member of Tony Kouzarides’ research group and an experienced entrepreneur. In 1998, together with Professor David Cleevely, they founded the leading biotechnology company Abcam, which has gone on to employ over 800 people and supply products to 64% of researchers globally.

    An innovative new Consortium will act as a ‘match-making’ service between pharmaceutical companies and researchers in Cambridge with the aim of developing and studying precision medicines for some of the most globally devastating diseases.

    We believe this form of partnership is a model for how academic institutions and industry can work together to deliver better medicines
    Tony Kouzarides
    Lab tubes (cropped)

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    The results are based on further analysis of survey data from more than 70,000 cancer patients, by Cancer Research UK scientists at the University of Cambridge and University College London, published today in the European Journal of Cancer Care.

    Of the nearly 60,000 survey respondents diagnosed through their GP, almost a quarter (23 per cent) had been seen three or more times before being referred for cancer tests.

    Four in ten (39 per cent) of those who had experienced referral delays were dissatisfied with the support they received from their GP compared to just under three in ten (28 per cent) of those referred after one or two GP visits.

    Overall, patients who had seen their GP three or more times before being referred were more likely to report negative experiences across 10 of 12 different aspects of their care. For example, 18 per cent of these patients were dissatisfied with the way they were told they had cancer, compared to 14 per cent among those who were referred more quickly.

    Four in ten expressed dissatisfaction with how hospital staff and their GP had worked with each other to provide the best possible care, compared to one in three among those referred promptly.

    Dissatisfaction with the overall care received was even higher among the just under one in ten patients (9 per cent) who saw their GP five or more times before being referred.

    Study author Dr Georgios Lyratzopoulos, from the Department of Public Health and Primary Care at the University of Cambridge, said: “This research shows that first impressions go a long way in determining how cancer patients view their experience of cancer treatment. A negative experience of diagnosis can trigger loss of confidence in their care throughout the cancer journey.

    “When they occur, diagnostic delays are largely due to cancer symptoms being extremely hard to distinguish from other diseases, combined with a lack of accurate and easy-to-use tests. New diagnostic tools to help doctors decide which patients need referring are vital to improve the care experience for even more cancer patients.”

    Dr Richard Roope, Cancer Research UK’s GP expert, said: “It’s vital we now step up efforts to ensure potential cancer symptoms can be investigated promptly, such as through the new NICE referral guidelines launched last month to give GPs more freedom to quickly refer patients with worrying symptoms. This will hopefully contribute to improving the patient experience, one of the six strategic priorities recommended by the UK’s Cancer Task Force last week.”

    Reference

    Mendonca S.C. et al. Pre-referral general practitioner consultations and subsequent experience of cancer care: evidence from the English Cancer Patient Experience Survey. European Journal of Cancer Care; 2015.

    Adapted from a press release by Cancer Research UK.

    If it takes more than three trips to the GP to be referred for cancer tests, patients are more likely to be dissatisfied with their overall care, eroding confidence in the doctors and nurses who go on to treat and monitor them, according to new research.

    This research shows that first impressions go a long way in determining how cancer patients view their experience of cancer treatment
    Georgios Lyratzopoulos
    Impatiently Waiting

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.

    I is for Iggy the Iguanodon

    On New Year’s Eve 1853, a group of entrepreneurs dined inside the mould for a giant model Iguanodon and, it is reported, sang a rousing song in praise of dinosaurs. The chorus runs: The jolly old beast/Is not deceased/There’s life in him again! ROAR. The model that provided an unlikely dinner venue that December evening was part of a set of concrete dinosaurs – the world’s first full-size dino-sculptures – made for the Crystal Palace at Sydenham.

    Some 160 years on, Cambridge University has revived this song in celebration of the Iguanodon (nicknamed Iggy) on display at the Sedgwick Museum of Earth Sciences. Barney Brown, the University’s head of digital communications, set the dinosaur lyrics to a bluegrass tune which he sings in a gravelly voice. It’s not known when the song was last heard by the public or what the original musical score was. The lyrics appear in WJT Mitchell's The Last Dinosaur Book (1998)  and the song is discussed in Science in Wonderland (2015) by Melanie Keene (Homerton College, Cambridge).

    A plaster replica of a skeleton found in a mine in 1878, Iggy was given to the Sedgwick Museum by the King of Belgium. The original creature would have measured 11 metres from nose to tail and weighed more than an elephant. Fossilised bones of Iguanodon or its close relatives, which lived between 140 and 120 million years ago in the Cretaceous Period, have also been found in several places in Britain, notably the Isle of Wight, West Sussex, East Sussex, Surrey, Kent, Dorset, Yorkshire and Potton in Bedfordshire.

    Iggy is posed in the ‘kangaroo-style’ posture that was an early interpretation of the creature's stance. Palaeobiologist Dr David Norman, who was director of the Sedgwick Museum from 1991 to 2011, has shown in the course of his work on dinosaurs that this upright posture would not have been possible for an animal like this; it would have spent much of its time browsing and walking on all four legs.

    Research by Norman has shown that three of the fingers of Iguanodon’s ‘hands’ were modified to form a load-bearing foot with toes that ended in broad, flattened hooves. The ‘thumb’ was a ferocious dagger-like spike, while its ‘little finger’ was elongate and prehensile, and could have been used to help grasp clumps of vegetation.

    “The animal's back and tail were stiffened by bundles of bony rods – ossified tendons – that you can see if you look along the spine of the animal. These bony tendons would have stiffened the back while it was held more or less horizontally and the tail, which stuck out at the rear, would have acted as a heavy cantilever (or counterbalance) to the front part of the body,” says Norman.

    “The bony tendons running along the sides of the spine would also have prevented the dinosaur from bending the base of its tail, as seen in the skeleton in the Sedgwick Museum, and adopting such a steeply upright posture.”

    Underneath the skeleton is a fossilised footprint, thought to have been made by an Iguanodon walking on a soft surface. The footprint was found on the seashore, having eroded out of the cliffs near Atherfield Point on the Isle of Wight. Collections manager Dan Pemberton says: “The footprint in the museum is approximately 17 inches, or 43cm, long from the back of the print to the tip of the middle toe. Even bigger footprints can be found in the Cretaceous rocks exposed on the foreshore of the Isle of Wight.”

    Iguanodon was one of the first dinosaurs to be discovered and scientifically described, by the Sussex-based doctor Gideon Mantell (1790-1852). Mantell at first envisaged Iguanodon as a gigantic lizard-like reptile, but later came to think that it might have resembled a giant ground sloth. One of his contemporaries, Richard Owen – who invented the word dinosaur to recognise the existence of a group of stupendously large, extinct reptiles – deduced that it must have been a sort of gigantic reptilian rhinoceros, complete with a horn (a misplaced thumb spike) on the tip of its nose.

    “The discoveries at Bernissart in Belgium helped greatly to clarify our understanding of this animal, since the skeletons were mostly complete and articulated – their bones were preserved in more or less the same arrangement as they had been in life,” says Norman. “Even so, the animal's typical life posture was misinterpreted because the scientist, Louis Dollo, who described them in the 1880s, was convinced that their habits were like those of giraffes, which feed high in the treetops, rather than like rhinoceroses that browse low to the ground.”

    Norman has revealed many new and unexpected aspects of the way of life and general biology of Iguanodon. He has also shown that there was a variety of Iguanodon-like dinosaurs that lived in southern England during the Cretaceous Period including Iguanodon itself, Mantellisaurus, Barilium and Hypselospinus. They were all quite large plant-eaters, and all of them had that very distinctive spiky thumb.

    In her book, Science in Wonderland, Dr Melanie Keene (Homerton College) explores the hugely enthusiastic public response to the creation of one of the world’s first sets of full-sized dinosaur models – including an Iguanodon based on Owen’s version of how it might have looked.

    She says: “The models were made by sculptor Benjamin Waterhouse Hawkins for the landscaped gardens around the Crystal Palace in 1854. An expanded, commercial version of the Great Exhibition that had been held in Hyde Park three years earlier, the Sydenham enterprise was one vast project of visual education. Visitors described its array of artwork, installations, fountains and glasshouse as like a trip to fairyland, and the concrete monsters were the star of the show.”

    Even before the exhibition had opened, the sculpted beasts made it into the newspapers, with coverage of a celebratory meal held in the mould of the Iguanodon model on New Year’s Eve 1853. That evening, investors, men of science, and Hawkins, had dined in style, and had even joined together in song to hymn their achievement: ‘A thousand ages underground,/His skeleton had lain,/But now his body’s big and round/And there’s life in him again!... The jolly old beast/Is not deceased/There’s life in him again!/ROAR’.

    The Iguanodon model has come to be an iconic part of the South East London landscape, and reproduced in many different media. For example, versions of the Crystal Palace monsters have appeared in many children’s books over the past 150 years, from E Nesbit’s Enchanted Castle to Topsy and Tim Meet the Monsters. “Some authors, like the singing diners, favoured a resurrectionary theme, modifying the display’s original didactic intentions and instead converting models such as the Iguanodon into terrifying creatures that came to life at night and menaced the young,” says Keene.

    More recent children’s books have, in line with the increasingly outdated appearance of the creatures, featured the monsters as rather funny-looking friends to children. Ann Coates combined both of these elements in her Dinosaurs Don’t Die (1970), which brought the Iguanodon model back to life as a character, ‘Rock’, who befriended her protagonist, Daniel. In the book, boy and creature cross London together to see a more modern interpretation of the Iguanodon on show at the Natural History Museum, the contrast between the old and new forms brilliantly captured in John Vernon Lord’s evocative illustrations.

    Keene adds: “Surviving the catastrophic fire that destroyed the main building in 1936, these amazing artefacts can still be seen today in Crystal Palace Park, and have most recently been cast as objects in need of conservation. Here in Cambridge, a miniature version of Owen’s rhinoceros-like Iguanodon can be seen beside the full-sized skeleton in the Sedgwick Museum.”

    Next in the Cambridge Animal Alphabet: J is for a creature so clever it has been nicknamed the "feathered ape" by researchers.

    Inset images: Louis Dollo supervising the reconstruction of an Iguanodon (Sedgwick Museum of Earth Sciences); Iggy's skull. The entire skeleton was dismantled, repaired and repainted in 2004 (Sedgwick Museum of Earth Sciences); Crystal Palace Iguanodon (Wikimedia Commons); Woodcut of the famous banquet in Benjamin Waterhouse Hawkins' standing Crystal Palace Iguanodon, New Year's Eve, 1853 (Wikimedia Commons).

    The Cambridge Animal Alphabet series celebrates Cambridge's connections with animals through literature, art, science and society. Here, I is for Iguanodon – a thousand ages underground, his skeleton had lain, but now his body’s big and round, and there’s life in him again!

    Visitors described the Crystal Palace's array of artwork, installations, fountains and glasshouse as like a trip to fairyland, and the concrete monsters were the star of the show
    Melanie Keene
    Woodcut of the famous (crowded) banquet in Benjamin Waterhouse Hawkins' standing Crystal Palace Iguanodon, New Year's Eve, 1853.

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    William Smith’s 1815 Geological Map of England and Wales, which measures 8.5ft x 6ft, depicted the geology of England, Wales and part of Scotland for the first time. It was the culmination of years of work by Smith, who was shunned by the scientific community for many years and ended up in debtors’ prison.

    Today, exactly 200 years since its first publication, a copy of Smith’s map – rediscovered after more than a century in a museum box – will go on public display at the Sedgwick Museum of Earth Sciences. Aside from a copy held at The Geological Society in London, the Cambridge map is believed to be the only such map on public display anywhere in the world.

    The iconic map, which still forms the basis of geological maps today, had a profound influence on the science of geology, inspiring a generation of naturalists and fledgling geologists to establish geology as a coherent, robust and important science. The map was so large that, for practicality's sake, it was often sold as 15 separate sheets, either loose or in a leather travelling case.

    Museum Director Ken McNamara said: “This is the world’s earliest geological map. Smith was working from a position of no knowledge when he began. Nobody had ever attempted this before and it’s really quite staggering what this one man achieved over ten or fifteen years, travelling up and down the country as a canal surveyor.

    “It’s incredibly accurate, even now in 2015. If you compare the current geological map of Great Britain today there are amazing similarities. The British Geological Survey still uses the same colour scheme that Smith devised. Chalk is green. Limestone is yellow and it’s still done like that to this day.”

    “This started geology as a modern science. It’s like the Magna Carta of geology, the beginnings of geology as a modern science and that’s why it’s so important.”

    Smith’s map proudly announced itself to the world as: "A DELINEATION of the STRATA of ENGLAND and WALES with part of SCOTLAND; exhibiting the COLLIERIES and MINES; the MARSHES and FEN LANDS ORIGINALLY OVERFLOWED BY THE SEA; and the VARIETIES of Soil according to the Variations in the Substrata; ILLUSTRATED by the MOST DESCRIPTIVE NAMES".

    How many of Smith's great maps still exist is unclear. Around 70 are thought to remain worldwide. The Sedgwick Museum of Earth Sciences at the University of Cambridge, the oldest geological museum in the world, is lucky enough to have three copies.

    For many years the museum knew that it possessed two of Smith's great maps: one, a set of 15 sheets bound together as a book; the other, beautifully preserved in its leather travelling case. Two years ago, in May 2013, a third copy was rediscovered in the collection. Found folded in a box with some other early geological maps, staff believe it had not seen the light of day since Queen Victoria was on the throne.

    Despite its decades hidden from view, the hand-coloured map had been exposed to harsh light for many years before being packed away. The colours were faded, the paper was stained, and it bore faecal deposits from long-dead spiders and flies.

    The map was then conserved by experts at Duxford, near Cambridge. Nineteenth-century dirt and grime was carefully removed, then the original, faded watercolour paint was given a protective coating and subtly restored to enhance the colour of the rock formations.

    Only 400 copies were ever produced, over a period of at least four years. During that time, Smith continued his geological research and continually made new discoveries, adapting and amending each new edition as he went along. Each individual map took seven or eight days to colour.

    McNamara said: “Smith suffered many deprivations in his life. He went bankrupt and spent time in debtors' prison. Perhaps almost as galling, he was largely ignored by the geological establishment. However, he gained his due recognition from the Geological Society of London later in life when, in 1831, he became the first person to receive the society's most prestigious award, the Wollaston Medal.

    “Appropriately, given the hanging of his map in the Sedgwick Museum, it was Adam Sedgwick who presented Smith with his medal. We are, we think, the only museum, library or art gallery in the world to have one of Smith’s legendary maps on public display – and we want as many people as possible to come and see this enormous, iconic and beautiful map for themselves.”

    One of the most important maps of the UK ever made – described as the ‘Magna Carta of geology’ – is to go on permanent public display in Cambridge after being restored to its former glory.

    This is the world’s earliest geological map.
    Ken McNamara

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    Schizophrenia is a long-term mental health condition that causes psychological symptoms ranging from changes in behaviour to hallucinations and delusions. Psychotic symptoms are reasonably well treated by current medications; however, patients are still left with debilitating cognitive impairments, including problems with memory, and so are frequently unable to return to university or work.

    There are as yet no licensed pharmaceutical treatments to improve cognitive function in people with schizophrenia. However, there is increasing evidence that computer-assisted training and rehabilitation can help people with schizophrenia overcome some of their symptoms, leading to better outcomes in daily functioning and in their lives.

    Schizophrenia is estimated to cost £13.1 billion per year in total in the UK, so even small improvements in cognitive functions could help patients make the transition to independent living and working and could therefore substantially reduce direct and indirect costs, besides improving the wellbeing and health of patients.

    In a study published today in the Philosophical Transactions of the Royal Society B, a team of researchers led by Professor Barbara Sahakian from the Department of Psychiatry at Cambridge describe how they developed and tested Wizard, an iPad game aimed at improving an individual’s episodic memory. Episodic memory is the type of memory required when, for example, you have to remember where you parked your car in a multi-storey car park after shopping for several hours, or where at home you left your keys several hours earlier. It is one of the facets of cognitive functioning affected in patients with schizophrenia.

    The game, Wizard, was the result of a nine-month collaboration between psychologists, neuroscientists, a professional game-developer and people with schizophrenia. It was intended to be fun, attention-grabbing, motivating and easy to understand, whilst at the same time improving the player’s episodic memory. The memory task was woven into a narrative in which the player was allowed to choose their own character and name; the game rewarded progress with additional in-game activities to provide the user with a sense of progression independent of the cognitive training process.

    The researchers assigned twenty-two participants, who had been given a diagnosis of schizophrenia, to either the cognitive training group or a control group at random. Participants in the training group played the memory game for a total of eight hours over a four-week period; participants in the control group continued their treatment as usual. At the end of the four weeks, the researchers tested all participants’ episodic memory using the Cambridge Neuropsychological Test Automated Battery (CANTAB) PAL, as well as their level of enjoyment and motivation, and their score on the Global Assessment of Functioning (GAF) scale, which doctors use to rate the social, occupational, and psychological functioning of adults.

    Professor Sahakian and colleagues found that the patients who had played the memory game made significantly fewer errors and needed significantly fewer attempts to remember the location of different patterns in the CANTAB PAL test relative to the control group. In addition, patients in the cognitive training group saw an increase in their score on the GAF scale.

    Participants in the cognitive training group indicated that they enjoyed the game and were motivated to continue playing across the eight hours of cognitive training. In fact, the researchers found that those who were most motivated also performed best at the game. This is important, as lack of motivation is another common facet of schizophrenia.

    Professor Sahakian says: “We need a way of treating the cognitive symptoms of schizophrenia, such as problems with episodic memory, but slow progress is being made towards developing a drug treatment. So this proof-of-concept study is important because it demonstrates that the memory game can help where drugs have so far failed. Because the game is interesting, even those patients with a general lack of motivation are spurred on to continue the training.”

    Professor Peter Jones adds: “These are promising results and suggest that there may be the potential to use game apps to not only improve a patient’s episodic memory, but also their functioning in activities of daily living. We will need to carry out further studies with larger sample sizes to confirm the current findings, but we hope that, used in conjunction with medication and current psychological therapies, this could help people with schizophrenia minimise the impact of their illness on everyday life.”

    It is not clear exactly how the game also improved the patients’ daily functioning. The researchers suggest that improvements in memory may have had a direct impact on global function, or that the cognitive training may have had an indirect impact by improving general motivation and restoring self-esteem; indeed, both factors may have played a role in the impact of training on functional outcome.

    In April 2015, Professor Sahakian and colleagues began a collaboration with the team behind the popular brain training app Peak to produce scientifically-tested cognitive training modules. The collaboration has resulted in the launch today of the Cambridge University & Peak Advanced Training Plan, a memory game, available within Peak’s iOS app, designed to train visual and episodic memory while promoting learning.

    The training module is based on the Wizard memory game, developed by Professor Sahakian and colleague Tom Piercy at the Department of Psychiatry at the University of Cambridge. Rights to the Wizard game were licensed to Peak by Cambridge Enterprise, the University’s commercialisation company.

    “This new app will allow the Wizard memory game to become widely available, inexpensively. State-of-the-art neuroscience at the University of Cambridge, combined with the innovative approach at Peak, will help bring the games industry to a new level and promote the benefits of cognitive enhancement,” says Professor Sahakian.

    Reference
    Sahakian, BJ et al. The impact of neuroscience on society: Cognitive enhancement in neuropsychiatric disorders and in healthy people. Phil. Trans. R. Soc. B; 3 Aug 2015

    Home page image: Brain Power by Allan Ajifo

    A ‘brain training’ iPad game developed and tested by researchers at the University of Cambridge may improve the memory of patients with schizophrenia, helping them in their daily lives at work and living independently, according to research published today.

    This proof-of-concept study is important because it demonstrates that the memory game can help where drugs have so far failed
    Barbara Sahakian
    Cambridge Advanced Training Programme

    Creative Commons License
    The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


    Researchers led by the University of Cambridge have found the earliest example of reproduction in a complex organism. Their new study shows that some organisms known as rangeomorphs, which lived 565 million years ago, took a joint approach to reproduction: they first sent out an ‘advance party’ to settle in a new area, followed by rapid colonisation of the new neighbourhood. The results, reported today in the journal Nature, could help reveal the origins of our modern marine environment.

    Using statistical techniques to assess the distribution of populations of a type of rangeomorph called Fractofusus, the researchers observed that larger ‘grandparent’ rangeomorphs were randomly distributed in their environment, and were surrounded by distinct patterns of smaller ‘parents’ and ‘children’. These patterns strongly resemble the biological clustering observed in modern plants, and suggest a dual mode of reproduction: the ‘grandparents’ being the product of ejected waterborne propagules, while the ‘parents’ and ‘children’ grew from ‘runners’ sent out by the older generation, like strawberry plants.

    Rangeomorphs were some of the earliest complex organisms on Earth, and have been considered to be some of the first animals – although it’s difficult for scientists to be entirely sure. They thrived in the oceans during the late Ediacaran period, between 580 and 541 million years ago, and could reach up to two metres in length, although most were around ten centimetres. Looking like trees or ferns, they did not appear to have mouths, organs, or means of moving, and probably absorbed nutrients from the water around them.

    Like many of the life forms during the Ediacaran, rangeomorphs mysteriously disappeared at the start of the Cambrian period, which began about 540 million years ago, so it has been difficult to link rangeomorphs to any modern organisms, or to figure out how they lived, what they ate and how they reproduced.

    “Rangeomorphs don’t look like anything else in the fossil record, which is why they’re such a mystery,” said Dr Emily Mitchell, a postdoctoral researcher in Cambridge’s Department of Earth Sciences, and the paper’s lead author. “But we’ve developed a whole new way of looking at them, which has helped us understand them a lot better – most interestingly, how they reproduced.”

    Mitchell and her colleagues used high-resolution GPS, spatial statistics and modelling to examine fossils of Fractofusus, in order to determine how they reproduced. The fossils are from south-eastern Newfoundland in Canada, which is one of the world’s richest sources of fossils from the Ediacaran period. Since rangeomorphs were immobile, it is possible to find entire ecosystems preserved exactly where they lived, making them extremely suitable for study via spatial techniques.

    The ‘generational’ clustering patterns the researchers observed fit closely to a model known as a nested double Thomas cluster model, of the type seen in modern plants. These patterns suggest rapid, asexual reproduction through the use of stolons or runners. At the same time, the random distribution of larger ‘grandparent’ Fractofusus specimens suggests that they were the result of waterborne propagules, which could have been either sexual or asexual in nature.
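
    The cluster models referred to here can be simulated directly, which is how spatial statisticians typically compare a fitted model against mapped fossil positions. The sketch below implements a single-level Thomas cluster process in Python; the parameter values are hypothetical, and the nested ‘double’ model used in the study would additionally treat each offspring point as the centre of a second generation of clusters:

```python
import math
import random

def poisson_draw(lam, rng):
    """Draw from a Poisson distribution (Knuth's method; fine for modest lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def thomas_process(parent_density, mean_offspring, sigma, size, rng):
    """Simulate a Thomas cluster process on a size x size window:
    parent points fall as a homogeneous Poisson process, and each parent
    scatters a Poisson number of offspring around itself with an
    isotropic Gaussian of standard deviation sigma."""
    n_parents = poisson_draw(parent_density * size * size, rng)
    parents = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_parents)]
    offspring = []
    for px, py in parents:
        for _ in range(poisson_draw(mean_offspring, rng)):
            offspring.append((rng.gauss(px, sigma), rng.gauss(py, sigma)))
    return parents, offspring

rng = random.Random(1)
parents, offspring = thomas_process(parent_density=0.005, mean_offspring=5,
                                    sigma=1.0, size=100.0, rng=rng)
print(len(parents), "parents,", len(offspring), "offspring")
```

    Comparing summary statistics of many such simulations (for example, the pair-correlation function) with those of the fossil surface is what allows one clustering model to be accepted and others rejected.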

    “Reproduction in this way made rangeomorphs highly successful, since they could both colonise new areas and rapidly spread once they got there,” said Mitchell. “The capacity of these organisms to switch between two distinct modes of reproduction shows just how sophisticated their underlying biology was, which is remarkable at a point in time when most other forms of life were incredibly simple.”

    The use of this type of spatial analysis to reconstruct Ediacaran organism biology is only in its infancy, and the researchers intend to extend their approach to further understand how these strange organisms interacted with each other and their environment.

    The research was funded by the Natural Environment Research Council.

    Reference:
    Mitchell, E. et al., Reconstructing the reproductive mode of an Ediacaran macro-organism, Nature (2015), DOI: 10.1038/nature14646

    Inset image: A group of Fractofusus specimens from the ‘E’ surface, Mistaken Point Ecological Reserve, Newfoundland, Canada. Credit: AG Liu

    A new study of 565 million-year-old fossils has identified how some of the first complex organisms on Earth – possibly some of the first animals to exist – reproduced, revealing the origins of our modern marine environment.

    Rangeomorphs don’t look like anything else in the fossil record, which is why they’re such a mystery
    Emily Mitchell
    Artist's reconstruction of the Fractofusus community on the H14 surface at Bonavista Peninsula

  • 08/04/15--01:56: Play’s the thing
  • Brick by brick, six-year-old Alice is building a magical kingdom. Imagining fairy-tale turrets and fire-breathing dragons, wicked sorcerers and gallant heroes, she’s creating an enchanting world. Although she isn’t aware of it, this fantasy will have important repercussions in her adult life: it is helping her take her first steps towards her capacity for abstract thought and creativity.

    Minutes later, Alice has abandoned the kingdom in favour of wrestling with her brother – or, according to educational psychologists, developing her capacity for strong emotional attachments. When she bosses him around as ‘his teacher’, she’s practising how to regulate her emotions through pretence. When they settle down with a board game, she’s learning about rules and turn-taking.

    “Play in all its rich variety is one of the highest achievements of the human species,” says Dr David Whitebread from Cambridge’s Faculty of Education. “It underpins how we develop as intellectual, problem-solving, emotional adults and is crucial to our success as a highly adaptable species.”

    Recognising the importance of play is not new: over two millennia ago, Plato extolled its virtues as a means of developing skills for adult life, and ideas about play-based learning have been developing since the 19th century.

    But we live in changing times, and Whitebread is mindful of a worldwide decline in play. “Over half the world’s population live in cities. Play is curtailed by perceptions of risk to do with traffic, crime, abduction and germs, and by the emphasis on ‘earlier is better’ in academic learning and competitive testing in schools.

    “The opportunities for free play, which I experienced almost every day of my childhood, are becoming increasingly scarce. Today, play is often a scheduled and supervised activity.”

    International bodies like the United Nations and the European Union have begun to develop policies concerned with children’s right to play, and to consider implications for leisure facilities and educational programmes. But what they often lack is the evidence to base policies on, as Whitebread explains: “Those of us who are involved in early childhood education know that children learn best through play and that this has long-lasting consequences for achievement and well being. But the kind of hard quantifiable evidence that is understood by policy makers is difficult to obtain. Researching play is inherently tricky.”

    “The type of play we are interested in is child-initiated, spontaneous and unpredictable – but, as soon as you ask a five-year-old ‘to play’, then you as the researcher have intervened,” explains Dr Sara Baker. “And we want to know what the impact of play is years, even decades, later. It’s a real challenge.”

    Dr Jenny Gibson agrees: “Although some of the steps in the puzzle of how and why play is important have been looked at, there is very little high-quality evidence that takes you from the amount and type of play a child experiences through to its impact on the rest of its life.”

    Now, thanks to the new Centre for Research on Play in Education, Development and Learning (PEDaL), Whitebread, Baker, Gibson and a team of researchers hope to provide evidence on the role played by play in how a child develops.

    “A strong possibility is that play supports the early development of children’s self-control,” explains Baker.

    “These are our abilities to develop awareness of our own thinking processes – they influence how effectively we go about undertaking challenging activities.”

    In a study carried out by Baker with toddlers and young pre-schoolers, she found that children with greater self-control solved problems quicker when exploring an unfamiliar set-up requiring scientific reasoning, regardless of their IQ. “This sort of evidence makes us think that giving children the chance to play will make them more successful and creative problem-solvers in the long run.”

    If playful experiences do facilitate this aspect of development, say the researchers, it could be extremely significant for educational practices because the ability to self-regulate has been shown to be a key predictor of academic performance.

    Gibson adds: “Playful behaviour is also an important indicator of healthy social and emotional development. In my previous research, I investigated how observing children at play can give us important clues about their well being and can even be useful in the diagnosis of neurodevelopmental disorders like autism.”

    Whitebread’s recent research has involved developing a playful approach to supporting children’s writing. “Many primary school children find writing difficult, but we showed in a previous study that a playful stimulus was far more effective than an instructional one.” Children wrote longer and better structured stories when they first played with dolls representing characters in the story. In the latest study, children first built their story with LEGO, with similar results. “Many teachers commented that they had always previously had children saying they didn’t know what to write about. With the LEGO building, however, not a single child said this through the whole year of the project.”

    The strand of research he leads in the Centre will focus on the results of large-scale longitudinal studies, such as the University of London’s Millennium Cohort Study, which is charting the social, economic and health conditions of individual children. Whitebread hopes to determine how much a child plays, the quality of that play, and what its long-term outcomes are.

    Even when this evidence is known, it is often difficult to develop practices that best support children’s play. The two research strands led by Gibson and Baker will aid this: Gibson will be developing an understanding of the cognitive processes involved in play and measures of playfulness, and Baker will be constructing and evaluating play-based educational interventions.

    Whitebread, who directs PEDaL, trained as a primary school teacher in the early 1970s, when, as he describes, “the teaching of young children was largely a quiet backwater, untroubled by any serious intellectual debate or controversy.” Now, the landscape is very different, with hotly debated topics such as school starting age and the introduction of baseline assessment to those starting school in September 2015.

    “Somehow the importance of play has been lost in recent decades. It’s regarded as something trivial, or even as something negative that contrasts with ‘work’. Let’s not lose sight of its benefits, and the fundamental contributions it makes to human achievements in the arts, sciences and technology. Let’s make sure children have a rich diet of play experiences.”

    Children’s play is under threat from increased urbanisation, perceptions of risk and educational pressures. The first research centre of its kind aims to understand the role played by play in how a child develops.

    Play in all its rich variety is one of the highest achievements of the human species
    David Whitebread


    The Alan Turing Institute has marked its first few days of operations with the announcement of its new director, the confirmation of £10 million of research funding from Lloyd’s Register Foundation, a research partnership with GCHQ, a collaboration with Cray Inc and EPSRC, and its first research activities.

    The Institute will promote the development and use of advanced mathematics, computer science, algorithms and big data for human benefit. The University of Cambridge is one of the Institute’s founding partners, along with the universities of Edinburgh, Oxford, UCL, Warwick and the Engineering and Physical Sciences Research Council (EPSRC). As of 22 July, the Institute, which will be based at the British Library in London, is now fully constituted and has begun operations.

    Jo Johnson, Minister for Universities and Science, said: “The Alan Turing Institute has set off on a speedy course to secure new lasting partnerships and bring together expertise from across the UK that will help secure our place as a world leader in areas like Big Data, computer science and advanced mathematics.”

    The Institute has also announced that it:

    • has appointed Professor Andrew Blake, who will join the Institute in October, as its first Director;
    • has accepted a formally approved offer of £10 million of research funding from the board of the Lloyd’s Register Foundation;
    • will work with GCHQ on open access and commercial data-analysis methods;
    • will collaborate with Cray Inc. and EPSRC to exploit a next-generation analytics capability on ARCHER, the UK’s largest supercomputer for scientific research;
    • is issuing its first call for expressions of interest from research fellows;
    • will commence research work this autumn with a series of data summits for commerce, industry and the physical and social sciences, and scoping workshops for data and social scientists to inform and shape the Institute’s research agenda.

    Andrew Blake is currently a Microsoft Distinguished Scientist and Laboratory Director of Microsoft Research UK. He is an Honorary Professor in Information Engineering at Cambridge, a Fellow of Clare Hall and a leading researcher in computer vision. He studied Mathematics and Electrical Sciences at Trinity College, and after a year as a Kennedy Scholar at MIT and time in the electronics industry, he completed a PhD in Artificial Intelligence at the University of Edinburgh in 1983.

    “I am very excited to be chosen for this unique opportunity to lead The Alan Turing Institute,” said Blake. “The vision of bringing together the mathematical and computer scientists from the country’s top universities to develop the new discipline of data science, through an independent institute with strategic links to commerce and industry, is very compelling. The institute has a societally important mission and ambitious research goals. We will go all out to achieve them.” 

    “The enthusiasm and commitment of the founding partners have enabled the Institute to make rapid progress,” said Howard Covington, chair of The Alan Turing Institute. “We will now turn to building the Institute’s research activities. We are delighted to welcome Andrew Blake as our new director and to begin strategic relationships with the Lloyd’s Register Foundation and GCHQ. Our cooperation with Cray Inc. is one of several relationships with major infrastructure and service providers that will be agreed over the coming months. We are also in discussions with a number of industrial and commercial firms who we expect to become strategic partners in due course and are highly encouraged by the breadth of interest in working with the Institute.”

    Professor Philip Nelson, Chief Executive of the Engineering and Physical Sciences Research Council (EPSRC) added: “I am delighted to see The Alan Turing Institute up and running. The teams from EPSRC and the founding universities have shown outstanding collaboration in bringing together five of our world-class academic institutions. We look forward to the Institute becoming an internationally leading player in data science.”

    “Getting the most out of big data requires new methods to handle large quantities of information and the clever use of algorithms to distil meaningful knowledge out of such volumes,” said Professor John Aston of Cambridge’s Department of Pure Mathematics and Mathematical Statistics, who is the university's representative on the Alan Turing Institute Board of Directors. “Research in this area could revolutionise our ability to compare, cross-reference and analyse data in ways that have previously been beyond the bounds of human or computer analysis.”

    The Alan Turing Institute is a joint venture between the universities of Cambridge, Edinburgh, Oxford, Warwick, UCL and EPSRC. It will attract the best data scientists and mathematicians from the UK and across the globe to push the boundaries of how we use big data in a fast-moving, competitive world.

    The Institute is being funded over five years with £42 million from the UK government. The university partners are contributing £5 million each, totalling £25 million. In addition, the Institute will seek to partner with other business and government bodies. The creation of the Institute has been coordinated by the EPSRC which invests in research and postgraduate training across the UK.

    National institute for the development and use of advanced mathematics, computer science, algorithms and ‘Big Data’ has announced its first director, and will start research activities in the autumn. 

    The Alan Turing Institute has set off on a speedy course to secure new lasting partnerships and bring together expertise from across the UK that will help secure our place as a world leader in areas like Big Data, computer science and advanced mathematics.
    Jo Johnson
    Alan Turing - born 100 years ago, 23 June 1912


    The community of Mathare 3A, built along a small river valley in Nairobi, is located in one of Kenya’s oldest and largest slums. It lacks most basic services, such as sanitation and electricity. There are few permanent structures, with most people living in temporary shacks made of wood and corrugated iron.

    Now a team of Cambridge researchers and students has been working on a project under the UN-Habitat-coordinated Global Network for Sustainable Housing (GNSH) to build a community centre in the heart of this impoverished area, and they are doing so using a model that makes the community’s involvement central to the process – participatory design.

    This is the first time that UN-Habitat has worked with a university on a project like this, and it is hoped that it will provide a scalable model for future projects with other communities and institutions.

    Project Manager Dr Maximilian Bock, from the Department of Architecture, explains: “The aim of participatory design is not to change the rich culture that already exists in Mathare, but rather to understand it deeply enough to design a space that is useful to and reflective of the community.”

    The first residents started arriving in the Mathare Valley in the 1920s, and by 2012 the population was estimated at 188,000 – with around 1,500 living in the informal settlement of Mathare 3A. The Kintaco community hall currently consists of a temporary structure with a capacity for less than 100 people. At present, it is mainly used by the men.

    Ana Gatóo and Elizabeth Wagemann, also from the Department of Architecture, have now produced construction drawings that will enable the residents of Mathare 3A to build a new, more useful community centre for themselves. The structure consists of replicable units so that, with some training, the residents will be able to learn quickly how to build the hall under the guidance of an onsite engineer.

    One of the crucial steps in the redesign of the hall saw Gatóo travel to Mathare in January 2015. As Gatóo explains, “engaging with each sector of the community was essential to ensuring that the preliminary designs reflected the input of all those who would use the hall in different capacities and at different times of day.

    “The women who participated in the focus group commented that this was the first time they had been specifically asked for their input in the design process of a community construction project.”

    With only a limited period in Mathare to find out what the community members wanted from their new facility, the team adapted the often time-consuming participatory design model into a very visual process. “Using wall charts, pictures, models and coloured stickers,” lead designer Elizabeth Wagemann explains, “we were able to find out what residents thought of other community centres, the potential risks to the hall, how they hoped to use the facility, and what skills they could contribute to constructing and managing it.”

    “There are instances, for example in the neighbouring settlement of Kibera, where community construction projects aren’t used by the residents. The participatory design process is essential for fusing the community’s ambitions for the space with the material and organisational resources necessary to realise the project. Involving the community from the beginning is important in ensuring that, once it is built, they will manage, maintain, and above all make use of it,” says Bock.

    Supported by the UN-Habitat programme, the project draws on the expertise of the Department of Architecture’s Natural Materials and Structures group where the Cambridge team is based. Led by Michael Ramage, the group focuses on adapting natural materials and traditional methods to contemporary architecture. The design team formed by Maximilian Bock, Ana Gatóo and Elizabeth Wagemann was also supported by Research Associate Thomas Reynolds, Masters students Bob Muhia, Katherine Prater, Anna Rowell and Thomas Aquilina, and undergraduate Chloe Tayali, who have each been volunteering around six hours a week on the project.

    Bock explains how they have learned that local acceptance of the building materials is of great importance. “From an environmental perspective, wood is a good sustainable material, but among the local community in Mathare, wooden structures are seen as a fire hazard. In contrast, concrete buildings with multiple floors are seen as aspirational,” he says. “One of the challenges for us was to balance the need for environmental sustainability with the need for local acceptance.”

    In their completed design – a 15 x 30 metre building made of gabions filled with local or recycled stone, with a floor and roof structure of bamboo – they have managed to meet both what the community had imagined and what the complex network of other stakeholders wants.

    One female resident of Mathare 3A commented: “I really like the design as it includes everything our community needs under one roof.”

    The team has also struck up a unique partnership with the Kenyan Forestry Services to provide the sustainable materials for around $20,000 instead of the team’s original estimates of $100,000. “The design could serve as a model for other community centres using locally sourced sustainable building materials,” says Samson Mogire from the Kenya Forestry Research Institute.

    As an experimental project, the team feel it has so far been a great success. The residents of Mathare are already very engaged and their feedback sessions have been lively with questions about the hall and the construction process. And now as the project moves from phase to phase, it also moves further into the ownership of the community.

    Despite its size, Bock states, Mathare 3A has been identified as an area that has previously been overlooked when it comes to development initiatives. It is hoped that projects such as this one will draw attention and further successful development projects to the settlement.

    Back in Cambridge, the team has learnt a lot to apply to participatory design projects in the future, and acquired experience designing with materials that will be important for future research. In two years’ time they will carry out an analysis of how the materials are performing and how the hall is being used to see what else can be learned from this process of building from the ground up.

    The project is also part of the EcoHouse Initiative, which drives sustainable urban development in the developing world. EcoHouse was funded by the AngloAmerican Group Foundation. The research has been enabled by the Higher Education Innovation Fund.

    Inset images: video courtesy of Roadmap to Mathare; a participatory design session in Mathare 3A (Ana Gatóo); the design for the new community centre (Maximilian Bock, Ana Gatóo, Elizabeth Wagemann, Department of Architecture); Mathare 3A (Ana Gatóo).

    In a landmark project with UN-Habitat, a team of Cambridge researchers has designed a community centre in one of Kenya’s biggest and oldest slums, and its future users are now raising funds to build it.

    The women commented that this was the first time they had been specifically asked for their input in the design process of a community construction project
    Ana Gatóo
    Mathare, Nairobi

  • 08/05/15--04:12: J is for Jay
  • Jays are corvids – members of the crow family. The jays we see in Britain are Eurasian jays. With their pinkish plumage, and characteristic flash of blue, they will be familiar to many people as woodland birds that are increasingly seen in gardens, even in cities.

    Professor Nicky Clayton (Department of Psychology) has carried out pioneering research into the thinking power of corvids. Her observations have revealed these crows to be extremely clever. In Aesop’s Fables, the wise old crow drops pebbles into a pitcher of water to raise the level and allow her to drink. Clayton’s work has revealed that real-life crows can, if they need to, use pebbles in just this way.

    Corvids, including jays, cache (hide) food so that they can retrieve it later. They know who’s watching them and they also show the ability to plan ahead. Perhaps even more remarkably, corvids share their food. Male corvids even show the ability to understand what foods females prefer and will bring their mates tasty titbits.

    We don’t think of corvids as song birds but current research is just beginning to reveal that they are skilled mimics, able to reproduce familiar sounds. As the accompanying film shows, a jay called Romero enjoys mimicking Clayton when she talks to him in one of the Cambridge University aviaries where she and colleagues are transforming our understanding of bird cognition.

    These are just a few of the reasons that Clayton describes jays and other members of the crow family as ‘feathered apes’ – a term that challenges the ways we think about intelligence in the animal kingdom.

    Clayton has been fascinated by birds ever since, as a young girl, she watched them in her garden. Her research into bird cognition has always run in parallel with her passion for dance. “It was the movements of birds that first drew me to them,” she says. “I wanted to know what they were doing, how they move and how they think.”

    Next in the Cambridge Animal Alphabet: K is for a bird that has biologists, physicists and materials scientists working together to unravel the secrets behind its spectacular colour effects.

    Inset images: Eurasian jays (Ljerka Ostojic).

    The Cambridge Animal Alphabet series celebrates Cambridge's connections with animals through literature, art, science and society. Here, J is for Jay – a surprisingly clever corvid with the ability to mimic human voices and much more.

    Eurasian jay


    Cambridge has nurtured some of the world’s most influential mathematicians – Sir Isaac Newton, James Clerk Maxwell, G.H. Hardy, John Edensor Littlewood and Srinivasa Ramanujan among them. 

    None of these great minds had the benefit of working in the Centre for Mathematical Sciences (CMS), seven futuristic pavilions built between 2000 and 2003, but thirty Year 10 students from Bedfordshire are clearly embracing the opportunity.

    “I’m going to turn you all into a supercomputer,” proclaims James Munro, an applied maths PhD student and today’s workshop leader. His audience exchanges nervous glances but James, an effervescent young tutor, is already bounding between tables, handing out dice and describing the scenario.

    “You are in a duel with a cowboy. He misses half of the time and you miss five times out of six. Fortunately, you get to shoot first….”

    James invites everyone to start rolling their dice to simulate the probability of surviving the duel. For several minutes, a satisfying chorus of clattering dice and mathematical discussion fills the room. 

    James keeps score and throws out questions about the results, before introducing progressively more complex scenarios. These include a Mexican Standoff in which one cowboy never misses, another strikes half the time and the weakest misses five times out of six. Before they know it, James’ students are tackling some thorny equations.
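
    The dice experiment translates directly into a few lines of code. Below is a minimal Python sketch of the first scenario, using the hit rates quoted above (1/6 for you, 1/2 for the cowboy); it is illustrative only, not part of the workshop materials:

```python
import random

def duel(p_you=1/6, p_him=1/2, rng=random):
    """Simulate one duel: you shoot first and turns alternate until a shot lands.
    Returns True if your shot lands first (you survive)."""
    while True:
        if rng.random() < p_you:   # your shot hits
            return True
        if rng.random() < p_him:   # his shot hits
            return False

random.seed(42)  # reproducible rolls
trials = 100_000
wins = sum(duel() for _ in range(trials))
print(f"Simulated survival probability: {wins / trials:.3f}")

# Exact answer for comparison: P = p_you / (p_you + p_him - p_you * p_him)
exact = (1/6) / (1/6 + 1/2 - (1/6) * (1/2))
print(f"Exact survival probability:     {exact:.3f}")  # 2/7 ≈ 0.286
```

    With enough rolls the simulated value converges on the exact answer of 2/7, obtained by summing the geometric series over successive rounds of the duel.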

    James’ own specialism, fluid dynamics, complements his teaching style. His workshop feels decidedly more energetic than an ordinary maths lesson. 

    Describing his current work, James enthuses: “I'm looking at the extremely fast flow that you get when two bubbles pop together to form one bubble. It's too fast to see in experiments and too fast for computers to model, but nothing is too fast for maths!”

    And this is the whole point of the HE Getaway – to give young people inspiring university experiences and introduce them to scholars who rhapsodize about their subjects.

    “I love doing these events,” says James. “Maths is often seen as abstract and irrelevant, but playing games as we have today can, I hope, make students care about tough concepts like conditional probability and game theory. This group rose to the challenge and I was delighted to hear so many of them say they were interested in taking maths further.”

    A tour of Emmanuel College and a physics taster session at the Cavendish Laboratory completed an action-packed day. The event was part of an HE partnership programme, in this instance between the Universities of Cambridge and Bedfordshire, which aims to raise aspirations among Year 9–11 students and give them an inspiring taste of higher education.

    As part of the partnership, a group of students from Cambridgeshire and Peterborough visited the University of Bedfordshire’s campus, where their programme included a workshop exploring computer game design. Both groups also took part in a team-building, problem-solving adventure day which included climbing trials and a rafting race. 

    Project Coordinator Matt Diston said: “We’ve developed a fantastic collaborative relationship with the University of Bedfordshire over the last four years, which means we can provide inspiring opportunities for schools in both areas. Our programme is deliberately energetic and varied. We find it’s a really effective way to give Year 9–11s a stronger sense of their abilities and interests, build their confidence and at the same time demystify university so they can make positive decisions about their futures.”

    A group of intrepid fifteen-year-olds recently visited the University’s Centre for Mathematical Sciences as part of a high-octane HE getaway.

    It's too fast to see in experiments and too fast for computers to model, but nothing is too fast for maths!
    James Munro, Applied Maths PhD student
    PhD student James Munro with HE Getaway students


    They are tiny magic bullets that are quietly shaping the lives of millions of patients around the world. Produced in the lab and invisible to the naked eye, these molecules are known to relatively few people. Yet monoclonal antibodies are contained in six of the world’s ten bestselling drugs, helping to treat everything from cancer to heart disease to asthma.

    Known as Mabs for short, these molecules are derived from the millions of antibodies the immune system continually makes to fight foreign invaders such as bacteria and viruses. The technique for producing them was first published 40 years ago. It was developed by César Milstein, an Argentinian émigré, and Georges Köhler, a German post-doctoral researcher. They were based at the UK Medical Research Council’s Laboratory of Molecular Biology in Cambridge.

    Harnessing the power of the immune system

    Milstein and Köhler wanted to investigate how the immune system can produce so many different types of antibodies, each capable of specifically targeting one of a near-infinite number of foreign substances that invade the body. This had puzzled scientists ever since the late 19th century, but an answer had proved elusive. Isolating and purifying single antibodies with known targets, out of the billions made by the body, was a challenge.

    The two scientists finally solved this problem by immunising a mouse against a particular foreign substance and then fusing antibody-producing cells taken from its spleen with cells from a myeloma, a cancer that develops in the bone marrow. Their method created a hybrid cell that secreted Mabs. Such cells could be grown indefinitely, in the abdominal cavity of mice or in tissue culture, producing endless quantities of identical antibodies specific to a chosen target. Mabs can be tailored to combat a wide range of conditions.

    When Milstein and Köhler first publicised their technique, relatively few people understood its significance. The editors of Nature missed its importance, asking the two scientists to cut short their article outlining the new technique; so did staff at the British National Research Development Corporation, who declined to patent the work after Milstein submitted it for consideration. Within a short period, however, the technique was being adopted by scientists around the world, and less than ten years later Milstein and Köhler were Nobel laureates.

    A transformation in therapeutic medicine

    In the years that have passed since 1975, Mab drugs have radically reshaped medicine and spawned a whole new industry. It is predicted that 70 Mab products will have reached the worldwide market by 2020, with combined sales of nearly $125bn (£81bn).

    An artist’s rendering of anti-cancer antibodies. ENERGY.GOV

    Key to the success of Mab drugs are the dramatic changes they have brought to the treatment of cancer, helping in many cases to shift it away from being a terminal disease. Mabs can very specifically target cancer cells while avoiding healthy cells, and can also be used to harness the body’s own immune system to fight cancer. Overall, Mab drugs cause fewer debilitating side-effects than more conventional chemotherapy or radiotherapy. Mabs have also radically altered the treatment of inflammatory and autoimmune disorders like rheumatoid arthritis and multiple sclerosis, moving away from merely relieving symptoms to targeting and disrupting their cause.

    Aside from cancer and autoimmune disorders, Mabs are being used to treat over 50 other major diseases. Applications include treatment for heart disease, allergic conditions such as asthma, and prevention of organ rejection after transplants. Mabs are also under investigation for the treatment of central nervous system disorders such as Alzheimer’s disease, metabolic diseases like diabetes, and the prevention of migraines. More recently they were explored as a means to combat Ebola, the viral disease that ravaged West Africa in 2014.

    Fast and accurate diagnosis

    Mabs have enabled faster and more accurate clinical diagnostic testing, opening up the means to detect numerous diseases that were previously impossible to identify until their advanced stages. They have paved the way in personalised medicine, where patients are matched with the most suitable drug. Mabs are intrinsic components in over-the-counter pregnancy tests, are key to spotting a heart attack, and help to screen blood for infectious diseases like hepatitis B and AIDS. They are also used on a routine basis in hospitals to type blood and tissue, a process vital to ensuring safe blood transfusion and organ transplants.

    Monoclonal antibodies can be used to rapidly diagnose disease and determine blood type. U.S. Navy/Jeremy L. Grisham

    Mabs are also invaluable to many other aspects of everyday life. For example they are vital to agriculture, helping to identify viruses in animal livestock or plants, and to the food industry in the prevention of the spread of salmonella. In addition they are instrumental in the efforts to curb environmental pollution.

    Quietly triumphant

    Yet Mabs remain hidden from public view. This is partly because the history of the technology has often been overshadowed by the groundbreaking and controversial American development of genetic engineering in 1973, which revolutionised the manufacturing and production of natural products such as insulin, and inspired the foundation of Genentech, one of the world’s first biotechnology companies.

    Looking back, the oversight is not surprising. Mabs did not transform medicine overnight or with any major fanfare, and the scientists who made the discovery did not seek fame. Instead, Mabs quietly slipped unobserved into everyday healthcare practice.

    An Argentinian and a German came together in a British Laboratory and changed the face of medicine forever; their story deserves to be told.

    Lara Marks is at the University of Cambridge.

    This article was originally published on The Conversation. Read the original article.

    Forty years ago, two researchers at the Medical Research Council’s Laboratory of Molecular Biology in Cambridge developed a new technology that was to win them the Nobel Prize – and is now found in six out of ten of the world’s bestselling drugs. Dr Lara Marks from the Department of History and Philosophy of Science discusses the importance of ‘monoclonal antibodies’.

    Monoclonal antibodies


    Researchers Alex Thornton, now at the University of Exeter, and Gabrielle Davidson carried out the study with the wild jackdaw population in Madingley village on the outskirts of Cambridge. They found that the jackdaws were able to distinguish between two masks worn by the same researcher, and only responded defensively to the one they had previously seen accessing their nest box.

    Over three consecutive days Davidson approached the nest boxes wearing one of the masks and took chicks out to weigh them. She also simply walked past the nest boxes wearing the other mask. Following this she spent four days sitting near the nest boxes wearing each of the masks to see how the jackdaws would respond.

    The researchers found that the jackdaws were quicker to return to their nest when they saw the mask that they had previously seen approaching and removing chicks to be weighed, than when they saw the mask that had simply walked by.

    They also tended to be quicker to go inside the nest box when Davidson, wearing the mask, was looking directly at them rather than looking down at the ground.

    “The fact that they learn to recognise individual facial features or hair patterns so quickly, and to a lesser extent which direction people are looking in, provides great evidence of the flexible cognitive abilities of these birds,” says Davidson. “It also suggests that being able to recognise individual predators and the levels of threat they pose may be more important for guarding chicks than responding to the direction of the predator’s gaze.”

    “Using the masks was important to make sure that the birds were not responding to my face, which they may have already seen approaching their nest boxes and weighing chicks in the past,” she adds.

    Previous studies have found that crows, magpies and mockingbirds are similarly able to recognise individual people. However, most studies have involved birds in busier urban areas where they are likely to come into more frequent contact with humans.

    Jackdaws are the only corvids in the UK that use nest boxes, so they provide a rare opportunity for researchers to study how birds respond to humans in the wild. Researchers at Cambridge have been studying the Madingley jackdaws since 2010.

    “It would be fascinating to directly compare how these birds respond to humans in urban and rural areas to see whether the amount of human contact they experience has an impact on how they respond to people,” says Davidson.

    “It would also be interesting to investigate whether jackdaws are similarly able to recognise individuals of other predator species – although this would be a lot harder to test.”

    The study was enabled by funding from the Zoology Balfour Fund, the Cambridge Philosophical Society, the British Ecological Society, and a BBSRC David Phillips Research Fellowship.

    Inset images: Mask (Elsa Loissel).

    Reference:

    Davidson, GL et al. “Wild jackdaws, Corvus monedula, recognize individual humans and may respond to gaze direction with defensive behaviour.” Animal Behaviour 108 (October 2015): 17–24.

    When you’re prey, being able to spot and assess the threat posed by potential predators is of life-or-death importance. In a paper published today in Animal Behaviour, researchers from the University of Cambridge’s Department of Psychology show that wild jackdaws recognise individual human faces, and may be able to tell whether or not predators are looking directly at them.

    The fact that they learn to recognise individual faces so quickly provides great evidence of the flexible cognitive abilities of these birds
    Gabrielle Davidson
    Jackdaws on nest box


    Researchers led by the University of Cambridge have built a mother robot that can independently build its own children and test which one does best, then use the results to inform the design of the next generation, so that preferential traits are passed down from one generation to the next.

    Without any human intervention or computer simulation beyond the initial command to build a robot capable of movement, the mother created children constructed of between one and five plastic cubes with a small motor inside.

    In each of five separate experiments, the mother designed, built and tested generations of ten children, using the information gathered from one generation to inform the design of the next. The results, reported in the open access journal PLOS ONE, found that preferential traits were passed down through generations, so that the ‘fittest’ individuals in the last generation performed a set task twice as quickly as the fittest individuals in the first generation.

    “Natural selection is basically reproduction, assessment, reproduction, assessment and so on,” said lead researcher Dr Fumiya Iida of Cambridge’s Department of Engineering, who worked in collaboration with researchers at ETH Zurich. “That’s essentially what this robot is doing – we can actually watch the improvement and diversification of the species.”

    For each robot child, there is a unique ‘genome’ made up of a combination of between one and five different genes, which contains all of the information about the child’s shape, construction and motor commands. As in nature, evolution in robots takes place through ‘mutation’, where components of one gene are modified or single genes are added or deleted, and ‘crossover’, where a new genome is formed by merging genes from two individuals.

    In order for the mother to determine which children were the fittest, each child was tested on how far it travelled from its starting position in a given amount of time. The most successful individuals in each generation remained unchanged in the next generation in order to preserve their abilities, while mutation and crossover were introduced in the less successful children.
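The loop described above — score each child on distance travelled, carry the fittest into the next generation unchanged, and generate the rest by mutation and crossover — can be sketched as a toy genetic algorithm. This is an illustrative reconstruction, not the authors' code: the gene contents, the population parameters and the `fitness` stand-in (the real measure was distance travelled in a fixed time) are all assumptions.

```python
import random

MAX_GENES = 5  # genomes contain between one and five genes

def random_gene():
    # Hypothetical gene: a cube orientation plus two motor-command parameters.
    return (random.randint(0, 3), random.random(), random.random())

def random_genome():
    return [random_gene() for _ in range(random.randint(1, MAX_GENES))]

def mutate(genome):
    # Modify one gene, or add/delete a gene, as described in the article.
    child = list(genome)
    op = random.choice(["modify", "add", "delete"])
    if op == "modify":
        child[random.randrange(len(child))] = random_gene()
    elif op == "add" and len(child) < MAX_GENES:
        child.append(random_gene())
    elif op == "delete" and len(child) > 1:
        child.pop(random.randrange(len(child)))
    return child

def crossover(a, b):
    # Merge genes from two parents, capped at the maximum genome length.
    cut = random.randint(1, len(a))
    return (a[:cut] + b[random.randint(0, len(b)):])[:MAX_GENES]

def evolve(fitness, generations=5, pop_size=10, n_elite=3):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[:n_elite]  # fittest kept unchanged across generations
        offspring = []
        while len(elite) + len(offspring) < pop_size:
            if random.random() < 0.5:
                offspring.append(mutate(random.choice(population)))
            else:
                offspring.append(crossover(*random.sample(population, 2)))
        population = elite + offspring
    return max(population, key=fitness)
```

Because the elite survive unchanged, the best score in the population can never decrease between generations, which is what lets improvements like the reported doubling of speed accumulate.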

    The researchers found that design variations emerged and performance improved over time: the fastest individuals in the last generation moved at an average speed that was more than twice the average speed of the fastest individuals in the first generation. This increase in performance was not only due to the fine-tuning of design parameters, but also because the mother was able to invent new shapes and gait patterns for the children over time, including some designs that a human designer would not have been able to build.

    “One of the big questions in biology is how intelligence came about – we’re using robotics to explore this mystery,” said Iida. “We think of robots as performing repetitive tasks, and they’re typically designed for mass production instead of mass customisation, but we want to see robots that are capable of innovation and creativity.”

    In nature, organisms are able to adapt their physical characteristics to their environment over time. These adaptations allow biological organisms to survive in a wide variety of different environments – allowing animals to make the move from living in the water to living on land, for instance. 

    But machines are not adaptable in the same way. They are essentially stuck in one shape for their entire ‘lives’, and it’s uncertain whether changing their shape would make them more adaptable to changing environments.

    Evolutionary robotics is a growing field which allows for the creation of autonomous robots without human intervention. Most work in this field is done using computer simulation. Although computer simulations allow researchers to test thousands or even millions of possible solutions, this often results in a ‘reality gap’ – a mismatch between simulated and real-world behaviour.

    Simulation can generate and test huge numbers of candidate designs quickly, but the researchers found that having the robot build and evaluate its children in the real world, without any simulation, produced more successful designs. The disadvantage is time: each child took the robot about 10 minutes to design, build and test. According to Iida, in future they might use a computer simulation to pre-select the most promising candidates, then build and test only those in the real world.

    Iida’s research looks at how robotics can be improved by taking inspiration from nature, whether that’s learning about intelligence, or finding ways to improve robotic locomotion. A robot requires between ten and 100 times more energy than an animal to do the same thing. Iida’s lab is filled with a wide array of hopping robots, which may take their inspiration from grasshoppers, humans or even dinosaurs. One of his group’s developments, the ‘Chairless Chair’, is a wearable device that allows users to lock their knee joints and ‘sit’ anywhere, without the need for a chair.

    “It’s still a long way to go before we’ll have robots that look, act and think like us,” said Iida. “But what we do have are a lot of enabling technologies that will help us import some aspects of biology to the engineering world.” 

    Reference:
    Brodbeck, L. et al. “Morphological Evolution of Physical Robots through Model-Free Phenotype Development.” PLOS ONE (2015). DOI: 10.1371/journal.pone.0128444

    Researchers have observed the process of evolution by natural selection at work in robots, by constructing a ‘mother’ robot that can design, build and test its own ‘children’, and then use the results to improve the performance of the next generation, without relying on computer simulation or human intervention. 

    We want to see robots that are capable of innovation and creativity
    Fumiya Iida
    Mother and child
