Selected Works

Opinion: Laws are not enough for environmental miracle, China Daily, 4 January 2007
The perils of China’s rising environmental problems and how they should be tackled

Stem cells: An appointment with chance, Economist, 30 November 2006
An accident with some chopsticks has led to an experimental medical treatment based on stem cells

Infant pain: Does it hurt? News Feature, Nature, 9 November 2006
Working out whether premature babies feel pain has important implications for child development

Epigenetics: Learning without learning, Economist, 21 September 2006
The events of childhood may have an impact on the brain, even if no conventional memory is formed

Shedding light on the world, BBC News website, 8 September 2006
The 2006 Millennium Technology Prize has been awarded to Professor Shuji Nakamura, the inventor who is said to have kick-started the "blue laser revolution"

Phantom limb and chronic pain: A hall of mirrors, Economist, 20 July 2006
The search for a new way to treat chronic pain

Epigenome: Unfinished symphony, News Feature, Nature, 11 May 2006
An international endeavour to decipher the “grammar” of heredity

Ornithology: Flight of navigators, News Feature, Nature, 6 October 2005
How birds navigate long distances across the complex terrain of the Arctic

Printer forensics: Band aid, Economist, 28 October 2004
How to beat digital forgers

Fear and loathing in the unconscious, Irish Times, 4 March 2004
Too strong a sense of ‘self’ may not be good for you

Emotion and memory: Thanks for no memory, Economist, 13 November 2003
Some evidence about how and why memories are suppressed


Opinion: Laws are not enough for environmental miracle

China Daily, 4 January 2007
http://www.chinadaily.com.cn/opinion/2007-01/04/content_774650.htm

Controlling the environmental impact of China's belated industrial revolution is like the Red Queen's race in Lewis Carroll's Through the Looking Glass (the sequel to Alice's Adventures in Wonderland)—you need to run as fast as you can just to stay where you are.

China's unprecedented economic growth in the past two decades has caused substantial pollution and environmental damage. As the country marches on to new economic heights, its insatiable appetite for energy and resources has provoked widespread concern around the world. To meet the challenge of sustainable development, "running" as fast as possible is not enough; drastic measures are needed to combat rampant environmental problems.

Fortunately, China's central government has realized the dire consequences that environmental damage could have in a wide range of areas such as air quality, water supply, weather patterns, health, agriculture and biodiversity, all of which will eventually take a heavy toll on the country's economy and social stability. Indeed, President Hu Jintao has made sustainable development one of the central pillars of government policy.

Moreover, the Ministry of Science and Technology is conducting an assessment of the effects of global climate change in China. The report, which will be released in the first half of this year, aims to initiate debate on how China can balance its ambitious goals for economic growth with strategies to cut down greenhouse-gas emissions.

These are welcome initiatives, of course. But China must ensure that they are effectively translated into practice, politically and financially, to tackle pressing environmental issues. China already has more than 100 environmental policies, laws and regulations. In a country of such vast scale and immense complexity, implementation of those rules—or indeed of rules of any kind—at the local level has never been easy.

Legislation alone has little effect unless governments are willing to enforce the laws. Unless clear incentives and penalties are an integral part of environmental policies, those policies are unlikely to have much effect. Although environmental protection has been a basic national principle since 1983, economic development often takes priority, and is still the main criterion for judging the performance of government officials.

This situation needs to be changed. Market tools should be introduced to provide economic incentives for practices that are environmentally friendly. Selection and promotion of government officials should take into consideration their environmental as well as economic credentials.

To avoid conflicts of interest, agencies responsible for developing natural resources should not be involved in regulating them.

And the State Environmental Protection Administration and green non-governmental organizations (NGOs) should participate in decisions on new development projects based on assessments of their potential environmental impact.

They should also be given more power to enforce policy implementation and to expose or even close down heavy polluters. In addition, the government should establish a systematic approach to assess how effectively the environmental policies are implemented nationwide.

Without such rigorous measures, good intentions would remain just that—good intentions.

China should also increase its investment in enforcing environmental policies, improving energy efficiency and developing new energy sources. China is among the biggest emitters of the greenhouse gases responsible for climate change. Because China lacks oil, more than three-quarters of its electricity is generated by burning coal, the worst offender in terms of greenhouse-gas emissions. With power demand poised to double by 2020, an increase in coal consumption is unavoidable.

Although alternative energy supplies hold great promise, they are unlikely to solve emission problems in the short term. As the Chinese proverb wisely points out, yuanshui jiebuliao jinke (far-flung waters cannot slake an urgent thirst). Like it or not, coal will remain China's predominant energy source for the foreseeable future.

A key challenge, therefore, is how to make coal consumption cleaner and more efficient. Technologies for meeting this challenge are already available, but they are expensive and have received little attention in China owing to a lack of market or regulatory incentives. Although further research and development are necessary before they can be widely adopted, the main constraints are not technological but political.

But let's not blame China alone, nor expect it to come up with a solution on its own. Relative to its huge population, China is far less of an environmental villain than the United States or Europe. Nor is it solely responsible for its own profound environmental damage. Indeed, the world has exacerbated the situation through the trade and investment that fuel China's rapid economic growth.

The China Water Pollution Map, recently unveiled online by China's Institute of Public and Environmental Affairs, lists 2,700 foreign companies as heavy polluters, including Pepsi in Jilin, Panasonic and Associated British Food and Beverages in Shanghai, and the UK's Purolite resin plant in Zhejiang.

While China should prohibit the operation of companies—domestic and foreign—that do not meet its environmental standards, foreign firms must set an example by making environmental protection part of their code of corporate social responsibility.

Export trade constitutes a large part of China's economy and is a major cause of the country's increasing pollution—because products go abroad whereas pollutants stay behind. Therefore, importing countries have the obligation to help China meet the challenge of sustainable development.

They could fund more initiatives that cut greenhouse-gas emissions in China, such as those under the Kyoto Protocol's Clean Development Mechanism. The importing countries should also be more proactive in transferring environmentally benign technologies to the country, such as those for cleaner manufacturing, water conservation, waste treatment and energy efficiency.

In addition, developed nations should help China's environmental planners and managers as well as the country's thousands of green NGOs—most of which are small, isolated and poorly funded.

Financial support would certainly be welcome. But more importantly, they should also help train Chinese officials and campaigners to be more adept at increasing the public's environmental awareness, contributing to government policies, and monitoring their implementation.

Industrialized countries will also contribute to the steady increase in coal use projected by the International Energy Agency. This makes investment in the research and development of clean-coal technologies all the more pertinent. Market and regulatory incentives should be put in place at the global level to encourage the use of such technologies around the world.

Development and sustainability are not mutually exclusive terms. But reconciling them remains a great challenge for humanity. China has the moral right to develop. However, the fact that it has no emission-reduction target does not mean that the country—as a moral agent and a rising world power—has no responsibility for protecting the environment and saving the planet.

After all, nature is not something to be conquered by humans. Instead, it is a system that sustains us and we must learn to live within it. In the past two decades, China has stunned the world by creating an economic miracle. It is time for the country to demonstrate that it can also create an environmental miracle and shape the world's path to sustainable development.

That would be the true mark of a powerful nation.

© 2007 China Daily



Stem cells: An appointment with chance

Economist, 30 November 2006
http://www.economist.com/science/displaystory.cfm?story_id=8348729

An accident with some chopsticks has led to an experimental medical treatment based on stem cells

LIKE other fields of endeavour, science has fashions—and one of its most fashionable areas at the moment is the study of stem cells. This is a subject that provokes high passions, particularly when the cells in question are drawn from human embryos. It also encourages the lowest form of scientific behaviour, fabricating data. A tragicomic stem-cell story, however, is probably a first. But a piece of research reported in this week's New England Journal of Medicine by Zhu Jianhong of Fudan University and his colleagues began that way. Its first subject was a woman admitted into Huashan Hospital in Shanghai with a chopstick in her brain. It ended triumphantly, though, with the trial of a treatment that may heal the sort of brain injuries that the woman in question suffered.

Stem cells are the cells responsible for making bodies, and then repairing the natural wear and tear to which they are subject while they are alive. The body-forming cells are the embryonic stem cells that are causing so much political trouble in America because obtaining them involves destroying early-stage embryos known as blastocysts. Some people think that destroying blastocysts is murder.

The repairing sort, though, is uncontroversial, and is turning up in more and more places. Even tissues once believed not to change much after childhood, and thus not to need the renewing ministrations of stem cells, are yielding them. Heart-muscle tissue, for example, has recently been shown to have them.

Another place where they were not, at first, expected to exist is the brain. But they do. And that discovery meant that the unfortunate lady who had had a chopstick thrust through one of her eyes into part of her brain called the inferior prefrontal subcortex (IPS) presented an opportunity. When the utensil was removed, Dr Zhu decided to try culturing the tissue that came out with it, to see whether there were any stem cells there.

Waste not, want not

To his delight, the extracted tissue thrived and grew, and many of the cells in the resulting culture did indeed contain proteins known to be characteristic of neural stem cells. But Dr Zhu wanted to be sure that that was truly what he had.

The defining feature of a stem cell is self-renewal. When such a cell divides, at least one of its daughters is also a stem cell (the other may set off on the route to specialisation that allows stem cells to generate new tissue). The way to test whether a particular cell is a stem cell, therefore, is to grow it individually. A single stem cell will divide continuously and form a spherical colony consisting of its progeny. Other cells will not. Dr Zhu found that about 4% of the cells from his chopstick-injured patient were able to form such colonies, which confirmed his conjecture.

Thus inspired, he started collecting samples from other patients with traumatic open-head injuries (though none with quite such an unusual cause as the first). He has managed to derive neural stem cells from 16 of these patients, out of a total of 22, and believes that success depends on which region of the brain is affected. Cells from the IPS are the best source, so it seems he was lucky in his original patient.

The point of the exercise, though, was to see whether neural stem cells could be obtained reliably, with a view to using them as a treatment. For a suitable dose of stem cells might not only help a damaged piece of tissue to repair itself; it would also, if the cells in question had come from the patient who was being treated, escape attack by his immune system. This idea of self-treatment is one of the reasons adult stem-cell science is so fashionable.

First, Dr Zhu tried it out on mice (the mice in question had had their immune systems turned off, so that they would not reject the transplanted cells). He injected stem cells he had cultured from his patients into mouse brains and found that they successfully differentiated into the various cell types found in the nervous system. Just as importantly, the resulting nerve cells were able to conduct electrical impulses and could form the specialised junctions called synapses, by means of which nerve cells talk to each other.

Having shown that the stem cells worked in healthy mouse brains, Dr Zhu tried them out on injured mouse brains. Another common property of stem cells is to accumulate at sites of injury, where their services are obviously needed. In order to track the movements of the cells, his team attached tiny magnetic particles to them before they transplanted them, and also injected them with a dye. They found that cells implanted into healthy brains stayed put, whereas those implanted into damaged brains moved towards the injured area.

The final animal trial was a safety test using monkeys. It was designed to look for cancer, and for signs that the cells had wandered from the brain to other organs such as the heart and the liver, where they might have caused trouble. No such signs were seen.

So the team moved on to people. They transplanted neural stem cells derived from eight patients with open-head injuries back into the patients who had provided the initial tissue and allowed the cells to migrate to the injury sites. (In one case, they used magnetic particles to follow the process.) Then they asked a separate group of specialists to look both at their experimental patients and at a group of people with similar brain injuries but no transplant. The second research group did not know who had and who had not been treated, so as to make the trial “blind”. Using standard behavioural tests, they concluded that the treated patients had lower disability scores.

As Dr Zhu stresses, this is a mere pilot study, and it is too early to draw strong conclusions. But if subsequent work confirms his finding, what started as an unfortunate piece of serendipity may lead to a valuable new technique for repairing injured brains.

© 2006 Economist Newspaper Limited


Infant pain: Does it hurt?


Nature, 9 November 2006
http://www.nature.com/nature/journal/v444/n7116/full/444143a.html

Working out whether premature babies feel pain has important implications for child development, says Jane Qiu.

It was a precarious beginning. Teresa was born 14 weeks earlier than she should have been. Having spent 26 weeks in the womb, she now lies motionless in an incubator in the neonatal intensive-care unit, relying on mechanical ventilation for breathing and intravenous tubes for nutrition. Her tiny, fragile body is covered with sensors to monitor blood pressure, heart rate, breathing and temperature.

Teresa's destiny is far from certain. She is suffering from the common complications associated with premature birth, such as breathing difficulties, anaemia and infections. She has to go through many routine medical procedures every day, and also faces a number of major operations in the future.

Teresa is a fictional but representative composite of hundreds of thousands of premature babies born each year. Thanks to medical and technological progress, babies born after 23 weeks' gestation in developed countries are now routinely kept alive. But many of them have life-threatening complications. It is estimated that each baby in a neonatal intensive-care unit is subjected to an average of 14 procedures a day — a number that can go up to 50 in extreme cases. Although clinicians and nurses intuitively feel that most of these procedures may be painful, there is no easy way to discover whether the babies are feeling anything.

It's a vexed question. Medics rely on self-reporting for pain assessment, but babies can't use words to explain exactly what they are feeling. For a long time, many health professionals assumed that newborns, and those born prematurely in particular, were too young to perceive pain in the same way that older children and adults do — or if they did, were not able to remember it. They were wary of giving painkillers to infants 'just in case', as the effects and dosing of these drugs had been tested only in adults. Some neonatal clinics started to give painkillers to premature babies during certain medical procedures in the late 1980s but, as yet, there is no consensus on best practice.

"Most intensive-care units have no guidelines for managing pain in preterm babies, so it's all very patchy and up to the individual clinicians and nurses on duty," says Judith Meek, a paediatric consultant at the Elizabeth Garrett Anderson and Obstetric Hospital in London.

The question of whether preterm babies feel pain goes beyond the obvious ethical and humane issues. In recent years, neuroscientists have unearthed evidence suggesting that the nervous systems of premature babies who experience repeated procedures that are potentially painful develop abnormally. Such children may be either excessively susceptible or desensitized to pain as they get older. The problem is likely to increase in the future, as the number of premature births has been rising in westernized countries. This is partly because of the increased number of multiple births resulting from more widespread use of fertility treatments.

There are further implications for neuroscientists too. They hope that, as well as answering the question of whether premature babies feel pain, their results will also help them understand more about how an infant's nervous system develops. This could form the groundwork for the development of much-needed drugs specifically designed for children, whose physiology is often different from that of adults.

Pain barrier

At the moment, practitioners use 'pain-coding scales' to assess the amount of pain babies might feel. Each scale consists of a set of behavioural criteria, such as crying, withdrawal and facial expressions, as well as physiological indicators, such as heart rate, blood pressure and breathing patterns. "All those measures have their strengths," says Maria Fitzgerald, a neuroscientist at University College London. "But they may not be true reflections of pain experience, as they can also change under other stressful conditions, such as hunger and cold."

Part of the problem is that the experience of pain is far more than just a straightforward physiological response to stimuli. Being a psychological state, pain is always subjective, and is coloured by a number of cognitive and emotional factors. It can result from nociception — the physiological detection of noxious stimuli by pain receptors and nerves — but can also occur independently of this, as a sufferer of phantom-limb pain will attest.

Animal studies suggest that nociception is likely to occur in premature infants. But to work out whether premature babies experience pain, neuroscientists need to tackle two key questions. The first is whether the signals generated by the stimuli actually reach the cerebral cortex, the part of the brain that is crucial for the perception of bodily sensation. The second is: to what extent are premature infants conscious, that is, sufficiently self-aware to experience and be emotionally distressed by pain? Have they yet developed minds?

The debate over whether pain signals reach the cortex in premature babies was fuelled by work first published in 2002 by Tim Oberlander, a paediatrician at the University of British Columbia in Vancouver, Canada. He showed that premature babies with severe brain injuries that disable much of the cortex have the same facial expressions and behaviours in response to potentially painful stimuli as normal babies of the same age. This suggested that the behavioural and physiological responses of premature babies to such stimuli are merely reflexes put together at the level of the spinal cord and brain stem. If signals of pain do not go beyond the brain stem, it is unlikely that preterm infants can feel pain.

Recently, however, a number of scientists have attempted to see inside the brains of premature babies to find out if this is the case. Two teams of researchers, one led by Fitzgerald, the other by Kanwaljeet Anand at the University of Arkansas for Medical Sciences in Little Rock, have independently shown that the cortex of premature babies is activated in response to potentially painful stimuli. Both groups used near-infrared spectroscopy (NIRS), a non-invasive imaging technique that can measure subtle changes in the concentration of the blood's oxygen-carrying molecule haemoglobin.

Neuronal activity in the cortex is coupled with changes in the blood flow to different regions of the brain, and this is reflected by the ratio of oxygenated haemoglobin to its deoxygenated counterpart. Because of this, NIRS can be used to monitor neuronal activity in the cortex. It is not as sharp or accurate as other imaging systems, such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). But these systems would involve moving a very sick and fragile infant into a noisy and stressful environment. NIRS can be performed by simply placing sensors on a baby in its cot.

Using this method, Fitzgerald's team measured cortical activity in response to a heel lance, a widely used procedure for collecting small blood samples, in 18 babies of between 25 and 45 weeks of gestational age, and postnatal ages ranging from 5 to 134 days. Unexpectedly, even in the youngest babies, there was a small but very clear evoked activity in the somatosensory cortex, the part of the cortex that processes bodily sensation.

A brief, non-invasive mechanical stimulation of the heel did not produce detectable cortical activation, even when babies withdrew their feet in reflex action. The researchers found that the older the baby, the bigger the response in the cortex.

Conscious thoughts

In a parallel study, Anand and his team analysed 40 preterm babies, born at 28 to 36 weeks of gestation, at 25 to 42 hours after birth. They used vein puncture, another way of collecting blood samples, to generate potentially painful stimuli. Following either tactile (skin disinfection) or painful (vein puncture) stimuli, there was an increase in blood flow to the somatosensory cortex, but not to other areas of the cortex. The increase induced by vein puncture was much larger than that induced by skin disinfection, supporting the findings made by Fitzgerald's team. But unlike Fitzgerald, Anand found that the more premature the baby at birth, the bigger the response to painful stimuli.

This difference between the two studies may be explained by the fact that the teams used different stimuli and babies of different ages. Even so, they share the same central finding: the cortex is activated following acute pain in premature babies. The finding has been met with great enthusiasm by other pain researchers.

"To make this kind of breakthrough in human neonates is extremely difficult," says Ruth Grunau, a paediatric psychologist also at the University of British Columbia. "It presents a major step forward in the field." But proof of cortical activation in babies is not exactly the same as proof of consciousness.

So how sure are the researchers that those preterm infants do have the same subjective experience that adults recognize as pain? There is simply no proof of consciousness, even in wide-awake adults. And recent studies have raised the question of whether comatose adults, who are also unable to speak, might have some level of consciousness, although this remains highly controversial. So proving what a baby feels is currently impossible. "How can you take any biochemical or biophysical measures and equate that to a subjective, conscious experience?" asks Anand. "It is a conundrum."

Anand thinks that the best way forward, for now, is to infer that premature babies can feel pain. "If a baby manifests the same behavioural and physiological responses to pain as seen in adults, together with activation of the cortex, then we can be fairly confident that he is feeling pain, even though that's no proof of consciousness," Grunau agrees. "We would seem to be holding an extraordinary standard if we didn't infer pain from all those measures."

That being the case, the issue of alleviating pain and suffering becomes paramount, especially now that scientists know that repeated painful experiences can alter the development of the nervous system. Some studies show that children born prematurely become less sensitive to pain, whereas others demonstrate that children, especially those who were born prematurely and who have had multiple operations early in life, need a higher dose of analgesics when they need further surgery. In other words, early repetitive pain experiences can either increase or dampen sensitivity to pain later on in life. "There are a lot of confounds here," Fitzgerald acknowledges. This is not surprising, given the complexity of the phenomenon and the difficulties with human neonate research.

Subjective responses

Studies with newborn rats also show that, depending on the timing, nature and duration of the painful stimuli, the animals become either under- or over-sensitive to pain later on. Both are potentially problematic: children who are less sensitive to pain may be more prone to injury, whereas those who are hypersensitive may withdraw from everyday activities.

Grunau is also concerned about changes in such children's stress responses. She and her colleagues have conducted detailed studies to correlate the number of potentially painful procedures premature babies undergo in intensive-care units with their levels of stress hormones at various stages of early postnatal development. They found that, at 8 and 18 months of age, babies born prematurely have much higher levels of stress hormones than their full-term counterparts, although how long these effects persist remains unknown.

A large body of animal and human studies has shown that prolonged exposure to stress hormones can result in changes in the brain, especially in areas important for learning and memory. "Preterm babies are vulnerable because the structure of their brains is still being formed," says Meek. "This is really worrying."

Detailed analysis of a large number of children between the ages of 5 and 14 indicates that children who were born prematurely are about 2.6 times as likely to have attention-deficit disorder as their full-term equivalents. Such children also tend to have various learning difficulties, increased levels of anxiety and poorer cognitive outcome. It is difficult to pinpoint the precise causes of these differences — after all, prematurity brings a host of problems that could all contribute in one way or another. But, says Grunau, pain is certainly on this list of potential factors.

So researchers are trying to develop ways to measure pain more accurately in premature babies. Fitzgerald is now testing how well cortical activity in response to potentially painful stimuli correlates with subjective assessment of pain by medical practitioners. She is also using imaging to monitor the efficacy of painkillers in the babies. To refine the findings with NIRS, her team is now using electroencephalography, which will allow the group to map how input from nerves is transmitted and processed in various regions of the preterm baby's cortex. Anand and his colleagues are carrying out similar kinds of studies including the effects of prolonged pain, such as post-operative pain and discomfort from being on a ventilator.

Anand's team is also using fMRI and other imaging techniques to broach the highly controversial and politically sensitive question of whether the fetus can feel pain. Researchers agree that studies on premature infants won't address this question. The process of being born alters an infant's brain physiology so radically that it cannot readily be compared to that of a fetus still in the womb. Even with the help of sophisticated imaging technology, tackling this question will be far from easy.

Studies of pain in premature babies could, however, fuel an initiative to develop painkillers designed especially for babies and children. Drug regulatory bodies, government funding agencies and charities have now realized this is an important issue.

In the United States, the Best Pharmaceuticals for Children Act of 2002 authorizes government agencies, in particular the National Institutes of Health and the Food and Drug Administration, to help develop drugs specifically for children. Task forces of scientists, clinicians and ethicists specialize in various conditions affecting newborns, including pain as well as neurological, cardiovascular and respiratory diseases. The initiatives aim to set priorities for scientific research and to establish ethical guidelines for clinical trials using babies.

One such task force, the Neonatal Pain-Control Group, "is one of the top priorities among all the task forces," says Anand, its chairman. "We expect to see some major clinical advances in neonatal pain management within the next few years."

© 2006 Nature Publishing Group


Epigenetics: Learning without learning

Economist, 21 September 2006
http://www.economist.com/science/displaystory.cfm?story_id=7941685

The events of childhood may have an impact on the brain, even if no conventional memory is formed

FREUD was famously preoccupied with the influence of early childhood experiences on development. His theory of psychoanalysis, which provided a new approach to the analysis and treatment of abnormal adult behaviour, has attracted both ardent followers and fierce critics. According to this theory, the unconscious mind carries imprints of the past that mercilessly haunt the present. Unearthing those imprints is the key to understanding what is going on and then treating it.

In Freudian theory, the imprints are memories, albeit unconscious ones. In other words, they are encoded in the way that the nerve cells which make up the brain are connected to one another. This theory of unconscious early memory is controversial. On the other hand, it seems clear that early experience is important to later behaviour. So what is going on?

There is a growing school of thought that Freud was right, but for the wrong reasons. According to the members of this school, early experience does profoundly mould the brain. However, it is not memory that it moulds—at least, not memory as conventionally understood. What it actually moulds is the way genes work.

A gene is a piece of DNA that encodes a protein molecule. In other words, it is a set of instructions. But instructions are no use unless they can be executed, and executed in an appropriate way. So cells contain a system to read genes, regulate their reading and convert the results into proteins, which then carry out various functions in the body. Part of this system is called epigenetic imprinting. This is a way of modifying how easily a gene can be read.

Moshe Szyf, of McGill University in Montreal, studies the effect of maternal care on epigenetic imprinting. As he explained at this week's meeting on Epigenetics and Neural Developmental Disorders, held in Beltsville, Maryland, imprinting might be a general mechanism whereby experiences are translated into behaviour. If that turns out to be so, it will affect the understanding and treatment of mental illness.

Imprints of care

The first inkling of this came when Michael Meaney, one of Dr Szyf's long-term collaborators, noticed that rat pups whose mothers spent a lot of time licking and grooming them grew up to be less fearful and better-adjusted adults than the offspring of neglectful mothers. Crucially, these well-adjusted rats then gave their own babies the same type of care—in effect, transmitting the behaviour from mother to daughter by inducing similar epigenetic changes.

When Dr Szyf looked at the brains of the two sorts of rats, he found differences in their hippocampuses. Among other jobs, the hippocampus is involved in responding to stress. Dr Szyf discovered that better-adjusted rats had, in their hippocampuses, more active versions of the gene that encodes a molecule called glucocorticoid-receptor protein. Glucocorticoid is a hormone produced in response to stress and its job is to make the animal behave appropriately. But too much glucocorticoid is a bad thing, so there is also a way to switch off its production. When glucocorticoid binds to its receptor in the hippocampus, that activates the expression of genes which dampen further synthesis of the hormone. This feedback system is weaker in rats that have had little maternal care. As a result, they are more anxious and fearful, and show a heightened response to stress.

The researchers went on to study what is responsible for the difference in expression of the glucocorticoid-receptor gene. They found that two types of imprinting are involved. One adds molecules called methyl groups to the DNA of the gene. This suppresses gene expression. The other adds acetyl groups, which are slightly larger, to the proteins around which the DNA is coiled. This has the opposite effect, making gene expression easier. Rats that had experienced little maternal care showed high levels of methylation and low levels of acetylation of the glucocorticoid-receptor gene and its neighbouring proteins. The opposite was true for those that had had a more attentive upbringing.

This explains why levels of glucocorticoid-receptor protein are different in the two groups of rats. But Dr Szyf still wanted to know what triggers epigenetic tagging in response to maternal care. He suspected a protein called NGFI-A, which is produced in response to stimulation such as licking and grooming. The more stimulation a rat pup receives, the more NGFI-A it produces.

This suspicion was confirmed when he found it is also the case that the more maternal care a pup receives, the more NGFI-A it has bound to its glucocorticoid-receptor genes. Then, in a series of experiments on cell cultures, he showed that when bound to this gene NGFI-A attracts two enzymes involved in epigenetic tagging. The enzymes in question are histone acetyltransferase (which adds acetyl groups to proteins) and methylated DNA-binding protein-2 (which removes methyl groups from DNA).

According to Dr Szyf, epigenetic modifications in response to maternal care occur during the first week of life after birth—the so-called critical period. The effects are stable, and persist into adulthood. Because this type of programming involves adding and removing chemical groups to and from the DNA and its nearby protein molecules, the researchers wondered whether reversing those reactions during adulthood would affect an animal's behaviour. To test this, they used two chemicals. One, called TSA, inhibits the enzyme that removes acetyl groups. The other, called L-methionine, is a donor of methyl groups.

When injected into the brain cavity near the hippocampus, TSA increased both the amount of acetylation and the level of expression of the glucocorticoid receptor in adult rats that had had little maternal care early in life. As a result, those rats became less anxious and fearful. By contrast, L-methionine increased the level of methylation and thus reduced the expression of the gene in animals with loving mothers, and led to fear, anxiety and a heightened response to stress.

Marked for life

Dr Szyf and Dr Meaney have made a strong case that different epigenetic profiles resulting from early experience correlate with behavioural differences in adults—in rats at least. They are now looking at people. Their first human study is into whether those who commit suicide have different imprints in their hippocampuses from those who die in accidents. They are also studying blood samples from people with depression or with violent tendencies, to look for epigenetic markers that may exist for either of these two behaviours. If successful, that might lead to new methods of diagnosing psychiatric conditions.

Meanwhile, Dr Szyf suspects that response to stress is just one behaviour that might be regulated this way. Whether epigenetics is important for other, more complex, behaviour remains to be seen. If it is, however, the implication could be huge. For decades, attempts to draw a direct line between genes and mental illnesses have disappointed their authors. But environmental explanations have failed, too. Psychiatrists now realise that there is something else in between. That something may be epigenetic imprinting.

© 2006 Economist Newspaper Limited


Shedding light on the world

BBC News website, 8 September 2006
http://news.bbc.co.uk/1/hi/technology/5328586.stm

The 2006 Millennium Technology Prize has been awarded to Professor Shuji Nakamura, the inventor who is said to have kick-started the "blue laser revolution".

Professor Nakamura stunned the world more than 10 years ago with his inventions of light-emitting semiconductors: blue, green and white light-emitting diodes (LEDs) and the blue laser diode.

Blue light has since opened up many opportunities. For example, blue LEDs are used in full-colour flat-screen displays, while blue lasers will change the face of information technology, some say.

The 1m euro (£680,000) prize, presented by the Finnish president, Tarja Halonen, recognised these developments.

As the world's largest technology award, the Millennium Technology Prize recognises outstanding achievements aimed at promoting quality of life and sustainable development.

"Professor Nakamura's technological innovations in the field of semiconductor materials and devices are groundbreaking," said Jaakko Ihamuotila, chairman of the Millennium Prize Foundation.

Blue beams

Increasingly, computers and communications are relying on light to send, store and process information. Devices that work with light are much faster and can store more data.

Lasers are key components of many of these devices. CD and DVD players, and storage systems, all use these intense, focused beams of light.

But not all lasers are equal.

The shorter the wavelength of the laser's light, the smaller the width of the focused beam. And as the beam is responsible for reading and writing data on a disc, the smaller it is, the more densely the data can be packed.

Professor Nakamura's breakthrough was in developing a blue laser source.

This has a shorter wavelength than its infrared and red equivalents (the common light sources used in today's standard DVD and CD players, for example), and so it represents a big step forward in storage capacity.

Going from infrared to blue quadruples the amount of data that can be stored in a given area.
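
That quadrupling follows from simple geometry: the diffraction-limited spot size shrinks in proportion to the wavelength, so the achievable data density grows with the square of the wavelength ratio. Here is a minimal back-of-the-envelope sketch, using commonly cited wavelengths for infrared and blue-violet laser diodes (the specific numbers are illustrative assumptions, not figures from this article):

```python
# Back-of-the-envelope check: the diffraction-limited spot diameter scales
# with wavelength, so areal data density scales roughly as 1/wavelength^2.
# The wavelengths below are commonly cited values, not from the article.
infrared_nm = 780.0  # CD-class infrared laser diode (assumed)
blue_nm = 405.0      # blue-violet laser diode (assumed)

density_gain = (infrared_nm / blue_nm) ** 2
print(f"Relative data density, blue vs infrared: about {density_gain:.1f}x")
# prints: Relative data density, blue vs infrared: about 3.7x
```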

Currently, companies such as Sony, Toshiba and Samsung are exploiting Professor Nakamura's invention in the next generation of DVD players, which promise better sound and high-definition pictures.

Light advance

Professor Nakamura's inventions are also starting to have an impact on another industry.

His white LEDs combine blue, green and red LEDs, and could one day revolutionise the lighting industry.

Some already compare the developments made possible by Professor Nakamura with those of Thomas Edison, the pioneer of the incandescent lightbulb.

A light using the LEDs, known as a solid-state light, consumes just four watts of electricity to produce as much light as a 60-watt incandescent lightbulb.

"It is estimated that it is possible to alleviate the need for 133 nuclear power stations in the US by the year 2025 if white solid-state lighting is implemented," said Professor Nakamura.

Bulbs using Nakamura's semiconductor materials are now widely used in traffic lights around the world. They are expected to last over 10 years, whereas conventional bulbs last just 6 months.

World illumination

The power savings could be huge. Currently, keeping the UK's traffic lights running with conventional bulbs requires the equivalent energy of two medium-sized power stations.

The new light sources are also well suited to running off solar power, which makes them ideal for use in remote areas of developing countries.

Professor Nakamura hopes that as a result of his work there will be light in parts of the world where today there is not even electricity.

"The University of California has a motto: let there be light," said Professor Nakamura. "It could also serve as a motto for my own research."

He plans to donate part of the prize to organisations that help to implement solid-state lighting in developing countries, such as the Light Up The World Foundation or Engineers Without Borders.

But the benefits of his LEDs are not just restricted to high-tech gadgets and bringing light to the world.

The blue LED's ultraviolet properties could also provide a cheap and efficient way to clean water or counter pollution.

"Professor Nakamura's work is making important contributions toward improving the quality of life and the health of our planet," said Henry T Yang, the chancellor of the University of California at Santa Barbara.

© 2006 British Broadcasting Corporation


Phantom limb and chronic pain: A hall of mirrors

Economist, 20 July 2006
http://www.economist.com/science/displaystory.cfm?story_id=E1_STRRSSS

The search for a new way to treat chronic pain

PEOPLE who have had parts of their bodies amputated often experience a vivid and bizarre sensation that their missing limbs are still there and can even be moved around. Most also suffer an excruciating pain in these sensory ghosts, which are known as phantom limbs. It is difficult enough to alleviate pain that has known causes, such as that triggered by inflammation and cancer. Treating pain in a limb that does not exist is harder still.

There is, though, a technique that works. It uses mirrors to create an illusion that the missing limb is still there. What nobody has known until now is how this helps. But Herta Flor of the University of Heidelberg in Germany thinks she might have cracked it. If she is right, as she explained to an audience at the recent Forum of European Neuroscience in Vienna, she may have opened a door to the treatment of other forms of persistent pain.

All in the mind

Dr Flor and her colleagues studied people who suffer from a form of phantom limb known as a telescoping phantom. After the amputation of a hand at the wrist, some patients feel the missing hand extending from the amputation stump (an extended phantom), whereas others feel it coming out of the elbow (a halfway retracted phantom) or the shoulder (a completely retracted phantom). The researchers wanted to know what causes these illusions, so they asked their patients to imagine that they were opening and closing their missing fists, and measured their brain activity during this process using magnetic-resonance imaging (MRI).

Patients with extended phantoms showed activity in the hand area of a part of the brain called the motor cortex, which is responsible for controlling movement. This response is identical to that shown by unamputated volunteers asked to do the same thing.

Those with halfway retracted and completely retracted phantoms had different responses. They showed activity in the elbow and shoulder areas of the motor cortex respectively. In other words, a greater degree of retraction involves a greater rewiring of the brain. And the greater the retraction, the worse the pain.

That correlation led Dr Flor to wonder if the alleviation of pain caused by the mirror treatment might be the result of a reversal of the neurological changes associated with telescoping. In the mirror treatment, a patient with an amputated hand places his normal and phantom hands on each side of the mirror. The position of his normal hand is adjusted so that its reflection in the mirror is superimposed on the position he feels his phantom hand to be in. He is then asked to move his hands (both normal and phantom) simultaneously, and to watch them moving. The illusion created by the mirror gives him the impression that both of his hands are, indeed, moving. After a few weeks of such daily treatment sessions, each lasting 15 minutes, about half of patients report that their phantom limbs and the pain associated with them have disappeared.

To understand what is happening, Dr Flor and her colleagues resorted once again to MRI, to record the brain activity of amputees while they were performing the mirror task. As predicted, patients without phantom-limb pain showed activity in the hand area of the motor cortex. They also showed activity in the hand area of the sensory cortex, the part of the brain that receives signals from the hand. The mirror was giving visual feedback that the sensory cortex was interpreting as a real hand.

Those with pain lacked this visual feedback system, and showed no activity in the sensory or motor cortex. Dr Flor conjectures that the mirror therapy might be able to restore the feedback mechanism in patients with retracted phantoms by providing enough nerve input for the brain to reverse some of the changes that follow amputation. The team is now conducting follow-up experiments to see whether this is what is happening.

All of these results suggest to Dr Flor that phantom-limb pain is best understood as a form of unconscious learning, similar to motor reflexes and perception skills. According to this interpretation, mirror therapy works by replacing noxious memories with innocuous ones.

With this hypothesis in mind, she reckoned that another way to treat phantom-limb pain might be to prevent pain memories from forming in the first place. To test this idea, she asked a number of patients undergoing amputation to take a drug called memantine. This blocks the activity of protein molecules called NMDA receptors, which are important in many types of learning and memory.

Thirteen patients took memantine for four weeks after their amputations (and sometimes just before, as well). The researchers then studied those patients' brain activity and bodily sensations over the course of a year, comparing what they found with the results from a group of amputees treated with a placebo.

To their delight, they discovered that memantine not only reduced changes in the brain, it also decreased both the incidence and the intensity of phantom pain. Pushing a bit further, Dr Flor wondered whether her pain-memory theory applied to other types of chronic pain, as well. She turned to a phenomenon called fibromyalgia, which is characterised by widespread pain across the body. Using MRI, her team detected a network of abnormal memory traces in the brains of people with this condition. Many parts of their brains are hyperactive, not only in areas responsible for bodily sensation and movement but also in those involved in the sensation and processing of pain. The researchers also found that memantine was able to dampen brain activity in some affected areas and reduce pain accordingly.

Encouraged by this result, they are now attempting to remove pain memories using drugs that activate the brain's system of cannabinoid receptors. Previous studies have shown that these receptors are involved in extinguishing unpleasant memories such as those related to fear. Dr Flor's hope is that pain—being, in her view, a form of unpleasant memory—will be similarly susceptible.

She is also trying approaches that do not involve drugs. The team has, for example, used cognitive behavioural therapy to train patients with fibromyalgia not to react to pain but rather to focus on enjoyable activities. Dr Flor believes that, by removing the reinforcement loop and bringing in so-called pain-incompatible memories, traces of pain in the brain can be reduced or even extinguished.

If Dr Flor's memory hypothesis is correct, revolutionary approaches for treating chronic pain may be on the horizon. This will be wonderful news for millions of people suffering from this condition. After all, pain is all in the mind. Indeed, pain itself is an illusion, a phantom in the brain. The brain, therefore, may be a good place to start chasing away this ghostly and debilitating sensation.

© 2006 Economist Newspaper Limited


Epigenome: Unfinished symphony

Nature, 11 May 2006
http://www.nature.com/news/2006/060508/full/441143a.html

To correctly 'play' the DNA score in our genome, cells must read another notation that overlays it — the epigenetic code. A global effort to decode it is now in the making, reports Jane Qiu.

Manel Esteller's phone did not stop ringing for weeks. It was summer 2005, and he and his team at the Spanish National Cancer Centre in Madrid had just published a study comparing the activity of DNA in identical twins. The anxious callers were invariably twins whose sibling had developed a serious disease such as cancer or diabetes. Could the study help predict whether they too would succumb, they asked. Did the identical DNA sequence they shared with their afflicted twin mean they had the same genetic predisposition to illness?

Surprisingly, the answer to the second question is 'not necessarily'. Researchers have known for years that, despite their common genes, identical twins can have very different physical constitutions and develop different diseases. The traditional explanation for this is that our environment somehow interacts with our genes to produce our physical attributes, or phenotype, but no one knew exactly how.

The study by Esteller and his team showed that the missing link between nature and nurture could lie in a phenomenon known as epigenetics: a cryptic chemical and physical code written over our genome's DNA sequence. The term 'epigenetics' was coined in the 1940s by the British embryologist and geneticist Conrad Waddington, to describe "the interactions of genes with their environment, which bring the phenotype into being". The term now refers to the extra layers of instructions that influence gene activity without altering the DNA sequence.

By studying 80 pairs of identical twins, ranging in age from 3 to 74, Esteller's team found that epigenetic differences were hardly detectable in the youngest twins, but increased markedly with age. These changes had a striking effect on gene activity: the number of genes that differed in activity between 50-year-old twins was more than three times that in pairs aged 3. "So we are more than our genes," says Esteller. "Not only is the DNA sequence important but also how gene activity is regulated in response to environment. This might explain why many identical twins have different susceptibility to disease."

Spot the difference

As well as offering answers to identical twins, deciphering this epigenetic code promises to dramatically alter our understanding of disease in the wider population. Many cancers might be triggered by epigenetic faults, for example. It should also fill some big gaps in our grasp of how the environment affects a creature's constitution — epigenetic changes explain how simply altering the diet of a pregnant mouse, for example, can completely change the coat colour of her pups, or even alter their response to stress.

"It will illuminate some of the most profound questions in biology," says Stephan Beck, an immunologist at the Wellcome Trust Sanger Institute, Cambridge, UK, who worked on the Human Genome Project. How a given cell executes its unique genomic programme in time and space could shed fresh light not only on development and disease but also on what makes us human, he says.

The complete epigenetic code of our genome, its 'epigenome', has increasingly been the focus of research over the past decade, and scientists are now embarking on an ambitious attempt to crack it. The International Human Epigenome Project, or IHEP, first suggested by Beck and colleagues in 1999, is the logical next step after the Human Genome Project, which published the draft sequence of the human genome's 3 billion DNA letters in 2001. But the IHEP faces daunting challenges. The sequence of the human genome is the same in all our cells, whereas the epigenome differs from tissue to tissue, and changes in response to the cell's environment. Can researchers really hope to pin down this vast, complex and ever-changing code in a meaningful way?

Clever packaging

If the DNA sequence of the genome is like the musical score of a symphony, then the epigenome is like the key signatures, phrasing and dynamics that show how the notes of the melody should be played. Epigenetic control of gene expression occurs in two main ways: either the DNA itself is chemically altered, or the proteins that package DNA into chromatin (the main component of chromosomes), are modified. These proteins, called histones, determine whether the chromatin is tightly packed, in which case gene expression is shut down (or silenced), or relaxed, in which case gene expression is active.

The first kind of alteration takes the form of methyl groups added to the DNA — frequently to the base cytosine when it is immediately followed by guanine — by a process known as DNA methylation. The methyl group can be sensed by proteins that turn gene expression on or off through regulating chromatin structure. The second, more complex kind of alteration involves changes to the histones around which chromosomal DNA is wrapped. Each histone has a protruding 'tail' to which more than 20 chemical tags can attach, like charms on a bracelet. Some of these tags, or certain combinations of them, dubbed the histone code, give rise to relaxed chromatin; others have the opposite effect.

Epigenetic codes are much more subject to environmental influences than the DNA sequence. "This could explain how lifestyle and toxic chemicals affect susceptibility to diseases," says Vardhman Rakyan, a researcher at the Sanger Institute. "Up to 70% of the contribution to a particular disease can be non-genetic." Indeed, one key finding of Esteller and his team's study was that epigenetic profiles of twins who had been raised apart or had noticeably distinct lifestyles differed more than those who had lived together for a while or shared similar environments and experiences. Rakyan himself is studying a cohort of identical twins, where one twin has type 1 diabetes and the other does not, with the aim of finding epigenetic changes associated with the disease.

Although different labs around the world, such as Rakyan's, are already working on their own individual studies, several researchers argue that it is time for a coordinated effort. A series of international workshops, and expert and government reports have emerged in recent months that address the value and scope of an international human epigenome project. The ultimate goal of such a project would be to identify all the chemical modifications of DNA and histone proteins for all chromosomes in all types of normal human tissue.

As with the Human Genome Project, an international consortium would set priorities, coordinate research efforts, centralize materials and resources, create the necessary technologies and monitor research progress.

Piece by piece

"Epigenomics is at a stage where genomics was 30 years ago, when everyone was working on their part of the puzzle," remarks cancer biol–ogist Peter Jones at the University of Southern California, Los Angeles. Jones was formerly president of the American Association for Cancer Research (AACR), which is based in Philadelphia. "We need to see the bigger picture. It takes concerted efforts on an international scale. And this is how the IHEP would make a difference."

Although a number of funding bodies — such as the Wellcome Trust, the AACR, the US National Cancer Institute (NCI), and the US National Human Genome Research Institute (NHGRI) — have shown interest by taking part in the discussion, funding agencies have yet to commit to financing and leading the project.

All aboard

Since the completion of the Human Genome Project, there have been many multi-centre schemes, each of which costs millions or even billions of dollars. Some of these initiatives, such as the US National Institutes of Health Human Cancer Genome Atlas (which aims to identify and catalogue genetic mutations in human cancers), have prompted arguments over scale and cost-effectiveness. A key question for funding bodies is whether the IHEP would be yet another multi-million-dollar project. Proponents say no. "The goal of the IHEP is not to create another big enterprise, but to make things as cost effective as possible, to interface with wonderful projects that are under way and to fund important pilot projects," says Andrew Feinberg, director of the Centre for Epigenetics of Common Human Disease at the Johns Hopkins University in Baltimore, Maryland.

A number of smaller-scale multi-centre epigenome projects are already under way or under discussion in Europe, the United States, India and Japan. Most prominent is that set up by the European Human Epigenome Project (HEP) Consortium in 2000. Following the publication of a pilot project in 2004, the European HEP Consortium will soon make its data on the epigenetics of the whole of chromosomes 6, 20 and 22 publicly available.

Although few people doubt the importance of an international human epigenome project, how to go about it remains a subject of debate. A key challenge is defining what the epigenome entails and what cell types to study. Some researchers argue that the project should first tackle blood cells, because they are easy to collect and work with, and are our main 'window' into the epigenome of both healthy and diseased individuals. Once a high-resolution blood epigenome is determined, it will serve as a reference with which other epigenomes, including those of diseased or ageing tissues, could be compared.

But the diversity of epigenomes in different cell types means that it may not make sense to restrict pilot projects to one single tissue, or to a particular time in a tissue's development. After intense discussion in three recent international workshops, researchers in the epigenetic community now agree that initially eight to ten tissues, including the blood, should be studied simultaneously. Ultimately, the epigenome of all tissues, including embryonic stem cells, will be mapped out.

Another question is whether to study cells grown in the lab or biopsies of tissues taken from people. Biopsies contain different cell types, which would muddy the picture, but lab-grown cells might contain abnormal epigenetic tags. At the moment, some biologists are leaning towards lab-grown cells as being the lesser of two evils, but exactly how different the epigenomes of cell lines are from those of normal tissues remains to be seen. The inclusion of cell lines in some pilot studies in the proposed IHEP should be able to resolve this issue.

Final frontier

Perhaps the greatest challenges facing the IHEP are technological: mass-production-style tools must be developed to decode the epigenome, and the morass of data will have to be stored and analysed. At the moment, the main method used to determine DNA methylation sites is reliable, but extremely expensive, and the technology used to study histone marks is prone to problems with accuracy and reproducibility.

Scientists hope to tackle these problems by linking the IHEP to projects on the epigenomes of lab workhorses, such as the yeast, fruitfly and mouse, for which techniques are more advanced. Computational scientists are also developing the sophisticated bioinformatics tools needed to store and analyse multi-dimensional epigenome data.

Given these technological challenges, it is only natural to question whether the research community is ready for such an enormous undertaking. Drawing on the experience of the early days of planning for the HGP, researchers working on epigenetics are unanimous in thinking they can do it. They have drawn up a plan of how to manage the international assets available for the IHEP and say that, like the HGP, the IHEP will catalyse its own development. "One can never be 100% ready. We have 60% of the technology to go for the real thing," says Thomas Jenuwein, a molecular biologist at the Research Institute of Molecular Pathology at the Vienna Biocentre, Austria. "The rest will happen once the momentum is built up. We should have that vision to go in big."

BOX: Tagged for disease

Thanks to its ability to instruct cells how to 'play' the genetic instructions spelled out in their DNA, the epigenetic code is proving to be central to processes such as development and ageing, and to conditions such as cancer, mental illness and infertility. Cancer researchers, in particular, are studying it to develop diagnostic and prognostic tools and drugs.

Epigenetics may contribute as much to cancer development as mutations in the DNA itself. For example, Wnk2, a gene whose activity is thought to suppress tumour development, is more often shut down by epigenetic changes in certain brain tumours than it is lost by genetic deletions.

Epigenetic marks can also help predict clinical outcomes. Researchers have identified a pattern of epigenetic modifications within the oestrogen receptor gene that correlates with a cancer patient's chances of survival in response to treatment with the drug tamoxifen.

Epigenetic changes are easier to reverse than genetic mutations, by adding or removing the chemical tags involved. Drugs that do this are now at the forefront of cancer treatments. For example, Vidaza, a drug made by Pharmion Corporation of Boulder, Colorado, has been approved by the US Food and Drug Administration for the treatment of myelodysplastic syndromes, also known as 'preleukaemia'.

In theory, this drug, which blocks DNA methylation, should be active in all cells. In practice, it seems to be active only in cancerous cells, stimulating the expression of many genes, including those that suppress cancer development. A number of other epigenetic drugs are at various stages of clinical trials.

© 2006 Nature Publishing Group


Ornithology: Flight of navigators

Nature, 6 October 2005
http://www.nature.com/nature/journal/v437/n7060/full/437804a.html

The Arctic is a unique testing ground for studying how birds navigate long distances. Jane Qiu catches up with an expedition to unravel the signals that help birds on their migrations.

The captain of an ice-breaker has few problems navigating during the polar summer. The 24-hour sunlight provides constant illumination of the surroundings. When storms whip up, the captain can turn to a suite of global-positioning devices, which pinpoint the ship's location to a matter of metres. Yet all this electronic sophistication is put to shame by the migratory birds wheeling overhead, which navigate thousands of kilometres using nothing more than what is in their heads.

This summer, the ice-breaker Oden tracked migrating birds as they left the Arctic through the Bering Strait, the narrow waterway between Alaska and Siberia. More than 50 polar scientists had gathered as part of a wide-ranging project called Beringia 2005. Among them were Thomas Alerstam of Lund University in Sweden and his research team. They were there to shed light on one of ornithology's greatest mysteries: how do birds navigate during their annual migration?

Thousands of birds gather from all over the world to breed and rear their young in the summer Arctic, where the landscape is temporarily rich with thousands of plant and insect species. Birds such as the Arctic tern (Sterna paradisaea) fly 18,000 kilometres across featureless oceans to get there, and navigational skills are crucial. Drifting off course could cause the birds to miss land and succumb to exhaustion. When summer wanes, they must once again find their way back, heading as far as South America, New Zealand or even the Antarctic.

The precise nature of their navigational gift has fascinated scientists for hundreds of years. In the early nineteenth century, studies in basic magnetism, along with expeditions to the North and South Poles, inspired scientists to propose that birds are guided by an inner magnetic sense, like a compass needle. Nearly two centuries later, this seemingly bizarre idea was supported by the discovery of magnetite, a form of iron oxide, in the brains of some bird species. The magnetite is thought to act as a tiny compass.

Clued up

But birds use a number of different navigational cues. As well as Earth's magnetic field, these include the landscape, and the positions of the Sun and the stars. In controlled laboratory experiments, scientists have demonstrated that a number of bird species can use these cues and others, separately or in combination. However, no one knows how the birds put them together to find their way. "It is a different matter in the wild," says Alerstam. "Which cues do they use then? Are some cues more important than others? We have no answer to these questions yet."

To navigate, birds need to know in which direction they are heading. A magnetic compass sense can tell them; so can the position of the Sun, as long as they know the time of day. But direction alone cannot keep a bird on course, says Sönke Johnsen, a biologist at Duke University in Durham, North Carolina. "A compass sense is often not enough to guide an animal to a specific destination or to steer reliably along a long and complex migratory route," he says. Birds must also be able to determine their position relative to a destination and continuously recalculate their heading.

Just three parameters of Earth's magnetic field may be enough to guide the birds on their journeys, researchers think. The first, the strength of the magnetic field, varies with latitude. The second, dip angle, is the angle between the magnetic field line at a given location and Earth's surface; this angle is 0° at the equator and 90° at the north and south magnetic poles. The third parameter, magnetic variation, marks the difference between directions towards the geographic and magnetic poles, and varies depending on location on Earth's surface.
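
As a rough illustration of the second parameter (my sketch, not from the article), the dip angle follows directly from the horizontal and vertical components of the local field; the example field strengths below are only indicative round numbers.

    # Illustrative sketch: dip angle (inclination) from the horizontal (H)
    # and vertical (Z) components of Earth's magnetic field, in nanotesla.
    import math

    def dip_angle_deg(horizontal_nt, vertical_nt):
        """Angle between the local field line and Earth's surface, in degrees."""
        return math.degrees(math.atan2(vertical_nt, horizontal_nt))

    print(dip_angle_deg(30000, 0))     # ~0 degrees: equator, field horizontal
    print(dip_angle_deg(3000, 55000))  # ~87 degrees: close to a magnetic pole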

Exactly how birds derive their direction and position from this information isn't entirely clear. The problem is further complicated at high latitudes, where Earth's magnetic field lines converge. "As the dip angle approaches 90°, the magnetic field lines go almost straight into the Earth, leaving very little of the horizontal information that is necessary for orientation," says Alerstam. As birds get closer to the north magnetic pole, which lies 1,400 km from its geographic counterpart, magnetic variation increases. And magnetic minerals in the Arctic Ocean or the tundra can give rise to anomalies in the magnetic field in the region. Daily variations in the magnetic field are also more erratic, because of the way that solar radiation affects the magnetic poles.

The 24-hour daylight of the polar summer means that the birds cannot even use the stars or daily rhythms of the Sun as a pointer. But the very complexity of the Arctic makes it an attractive location for testing birds' navigational strategies. "The only way to find out how birds cope in such complicated geomagnetic regions is to go out there and take a look," says Alerstam.

Bearing up

Two months into this summer's expedition, Alerstam wasn't certain that his team would find any answers. At 4:00 a.m. on 14 August, Oden was steering slowly through thick ice towards Wrangel Island off Siberia. The ship swayed heavily from side to side and the ice was so dense that it had to reverse slightly to gain enough momentum to advance another 50 metres. "We can only hope that the equipment will survive such harsh treatment," Alerstam wrote in his notes.

After a few hours, the ice thinned and Oden started to sail smoothly. In the ship's operation room, Alerstam was absorbed in watching hundreds of echoes appearing on the radar screens. Conditions were perfect after several days of rain, and Oden's radar equipment had picked up echoes of large flocks of migratory birds heading out of the Arctic. Most of them were travelling east and south at high altitudes, 2,000 to 3,000 metres up. One echo even came from an altitude of 4,800 metres — a record height for the trip.

Radar tracking allows scientists to follow individual birds or flocks for up to 15 kilometres. By the end of the trip, Alerstam and his team had recorded nearly 600 such tracks. Based on these echo recordings, they calculated flock densities as well as the speed, direction and altitude of the migration. The motion of helium-filled balloons released from Oden was also tracked, and this information on wind patterns allowed the team to calculate the headings and trajectories of the birds themselves.
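
The correction itself is simple vector arithmetic: the bird's own heading and airspeed are what remain of the radar track once the wind is subtracted. Here is a minimal Python sketch of the idea (my illustration; the team's actual analysis is more involved).

    # Illustrative sketch: recover a bird's heading and airspeed by
    # subtracting the wind vector from the radar ground track.
    import math

    def to_vector(speed, direction_deg):
        """Speed and compass direction (degrees clockwise from north) -> (east, north)."""
        rad = math.radians(direction_deg)
        return speed * math.sin(rad), speed * math.cos(rad)

    def heading_and_airspeed(track_speed, track_dir, wind_speed, wind_dir):
        te, tn = to_vector(track_speed, track_dir)
        we, wn = to_vector(wind_speed, wind_dir)
        ae, an = te - we, tn - wn  # air vector = ground vector - wind vector
        return math.degrees(math.atan2(ae, an)) % 360, math.hypot(ae, an)

    # A flock tracked at 18 m/s towards 135 degrees in an 8 m/s wind blowing
    # towards 90 degrees is actually heading about 160 degrees at 13.6 m/s:
    print(heading_and_airspeed(18, 135, 8, 90))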

The different cues that birds might use for navigation generate very different trajectories, particularly at high latitudes. So researchers can calculate specific paths for each cue and compare them with where the flocks actually go, revealing which cues are most important to the birds. Alerstam calculated several possible routes for the species they wanted to track.

He predicted that, if the birds navigate mainly using a magnetic compass, they would fly northeast from the expedition site towards high Arctic Canada.

Predicting paths for birds that are using a Sun compass is more complicated because the directional information depends on the bird's sense of time. At high latitudes, the distances between lines of longitude are so small that migratory birds may fly across three or four of them in a single day. Either the birds' internal clocks adjust to the local time as they go, or they remain constantly jet-lagged. Alerstam calculated that birds following the Sun and not adjusting to local time would curve to the southeast, passing through the Bering Strait and western and northern Alaska. Birds that fly using a Sun compass and correct it for the shifting time zones might be expected to fly a route somewhere between the Bering Strait and the magnetic route.
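
The penalty for not adjusting is easy to estimate: the Sun's azimuth sweeps through roughly 15 degrees per hour, so every hour of uncorrected jet lag skews a sun-compass bearing by about that much. A back-of-envelope sketch (my illustration, treating the 15-degrees-per-hour figure as a rough polar-summer approximation):

    # Back-of-envelope sketch: the Sun's azimuth moves roughly 15 degrees
    # per hour, so each hour of uncorrected jet lag skews a sun-compass
    # bearing by about that much.
    def sun_compass_error_deg(hours_of_jet_lag):
        return 15.0 * hours_of_jet_lag

    # A bird that has crossed four time zones without resetting its clock:
    print(sun_compass_error_deg(4))  # ~60 degrees off the true sun bearing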

In 1994, Alerstam went on an expedition to the Russian Arctic and discovered that most birds in western Siberia flew towards the east at the end of the breeding season, a puzzling observation as the birds' destination was south. It turned out that their migratory paths were most consistent with a Sun-compass route without time correction, and Alerstam inferred that the birds were using the Sun to navigate while experiencing constant jet lag. So on the Oden trip, says Alerstam, "we expected to see massive migrant flow — mostly on southeast courses — over the Bering Strait, western and northern Alaska, and northwesternmost Canada." That is exactly what his team found.

Coarse corrections

One curious feature of the strategy that uses the Sun without time correction is that it gives trajectories that approximate a 'great circle' route. A great circle is the shortest arc connecting two points on the globe. Following a great circle route is quicker, but requires the traveller to change compass course continuously; navigating with a constant magnetic compass course is easier, but results in much longer routes. Pilots and sailors regularly follow great circle routes with the help of complex geometric calculations. And it seems that birds have found a way to do this too.

"It makes sense as this saves energy, which is important given that they have to fly thousands of miles," says Alerstam.

How have birds learned this navigational trick? For most animals that travel in east–west directions, changes in the time of sunset or sunrise reset their internal clock — but this process takes a few days. If it also takes days in Arctic birds, then as they migrate long distances towards the east or west they will be constantly out of phase with the local time, and misread the Sun as a result. Not knowing that they are out of sync, they will end up flying along the energy-saving great circle routes. "Maybe it is a lucky coincidence," says Alerstam.

Another possibility is that the birds can correct for the changing time zones, but actively suppress this adjustment. According to the late Eberhard Gwinner of the Max Planck Institute for Ornithology in Andechs, Germany, migratory birds in the Arctic should be able to adjust their internal clock very swiftly. In the Arctic summer, some birds have low levels of melatonin, a hormone that is usually controlled by the daily cycle between light and dark. Lower melatonin levels suggest that the birds have a weaker clock in the polar summer, which may allow them to adapt quickly to changing environmental conditions, but could also be readily suppressed.

Of course birds must have some sense of time to find their direction, even if it is then slightly offset by jet lag. "There are other natural cues the birds rely on, such as the colour and polarization patterns of light," says Michaela Hau of Princeton University in New Jersey.

Once the birds find their way to lower latitudes, points out Alerstam, the conditions for navigation become much less extreme. He suggests that they may then use different strategies to find their way.

Are we nearly there yet?

One thing is clear from the past 30 years of research: birds use a delicate combination of cues to create a sophisticated backup system. If one cue is taken away from them, they will use the next down the line. And the cues that birds use and the way they navigate depend crucially on geographic location and weather conditions.

"At the end of the day, both fieldwork and laboratory experiments are necessary for understanding bird orientation. And they should always go hand in hand," says Martin Wikelski of Princeton University. "Expeditions such as Alerstam's represent one of those points where we are pushing things from a steady state to the next level," he says.

In another such push, Wikelski is working on an international initiative to study small-animal migration around the world. This project will use unmanned aerial vehicles or even a specialized radio receiver in low Earth orbit. Either approach would allow researchers to map the global migratory patterns of birds and insects carrying embedded miniature radio transmitters.

"To track individual birds around the globe will completely revolutionize avian research," says Alerstam. Such studies will not only shed light on bird navigation, but also have implications for preventing or containing animal-based epidemics such as avian influenza. The study of migrating birds may thus have a bearing on much wider questions than just how and why they fly.

© 2005 Nature Publishing Group


Printer forensics: Band aid

Economist, 28 October 2004
http://www.economist.com/science/displayStory.cfm?story_id=3329120

How to beat digital forgers

NO DOCUMENT is safe any more. Counterfeiting, once the domain of skilled crooks who used expensive engraving and printing equipment, has gone mainstream since the price of desktop-publishing systems has dropped. Virtually any kind of paper can be forged, including cheques, banknotes, stock and bond certificates, passports and security cards. For currency alone, millions of dollars in counterfeit banknotes make their way into circulation each year, and 40% of the counterfeits seized this year were digitally produced, compared with 1% a decade ago.

In ancient times, counterfeiting was a hanging offence. In Dante's “Inferno”, forgers were placed in one of the lowest circles of hell. Today, desktop counterfeiters have little reason to worry about prison, at any rate, because the systems they use are ubiquitous and there is no means of tracing forged documents to the machine that produced them. This, however, may soon change thanks to technology developed by George Chiu, Jan Allebach and Edward Delp, three anti-counterfeiting engineers based at Purdue University in Indiana. The results of their research will be unveiled formally on November 5th at the International Conference on Digital Printing Technologies in Salt Lake City.

Though the approaches of the three researchers differ slightly, all are based on detecting imperfections in the print quality of documents. Old-school forensic scientists were—at least so the movies would have you believe—able to trace documents to particular typewriters based on quirks of the individual keys. The researchers from Purdue employ a similar approach, exploiting the fact that the rotating drums and mirrors inside a printer are imperfect pieces of engineering which leave unique patterns of banding in their products.

Although these patterns are invisible to the naked eye, they can be detected and analysed by computer programs, and it is these that the three researchers have spent the past year devising. So far, they cannot trace individual printers, but they can tell pretty reliably which make and model of printer was used to create a document.

That, however, is only the beginning. While it remains to be seen whether it will be possible to trace a counterfeit document back to its guilty creator on the basis of manufacturing imperfections, Dr Chiu is now working out ways to make those imperfections deliberate. He wants to modify the printing process so that unique, invisible signatures can be incorporated into each machine produced. That would make any document traceable.

Ironically, it was after years of collaborating with printing companies to reduce banding and thus increase the quality of prints that he came up with the idea of introducing artificial banding that could encode identification information, such as a printer's serial number and the date of printing, into a document. Many factors can affect banding patterns. These include the intensity, timing and width of the pulses of laser light that control the printing process, and the efficiency of the motor controls that steer the laser beam, turn the drum and move the mirrors. All of these could be exploited to produce unique signatures, but Dr Chiu found that the one which works best without compromising print quality is to fiddle with the intensity of the laser. Using a computer model of the human visual system, he has designed a method of banding that is invisible to the eye while remaining all too visible to an expert with the right machine.
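
The scheme can be pictured as a kind of amplitude modulation of the laser. Here is a hypothetical Python sketch of the general idea (my illustration only; Dr Chiu's actual encoding is not described in this detail):

    # Hypothetical sketch: encode a printer's serial number as a bit stream
    # by nudging the laser intensity imperceptibly up or down per scan band.
    def banding_signature(serial_number, nominal=1.0, delta=0.005):
        bits = format(serial_number, "032b")  # serial number as 32 bits
        return [nominal + (delta if b == "1" else -delta) for b in bits]

    # A forensic reader would average many scan lines per band and compare
    # the measured intensity against the nominal level to recover each bit.
    print(banding_signature(123456789)[:8])

The 0.5% step is an arbitrary stand-in for whatever intensity change falls below the threshold predicted by a model of the human visual system.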

The current techniques used to secure documents are either digital (and therefore easy to fake with desktop publishing systems) or too costly for widespread applications (paper watermarks, fibres and special inks). Using the banding patterns of printers to secure documents would be both cheap to implement and hard, if not impossible, for those without specialist knowledge and hardware to evade.

Not surprisingly, the American Secret Service is monitoring the progress of this research very closely, and is providing guidelines to help the researchers to travel in what the service thinks is the right direction. Which is fine for catching criminals. But how the legitimate users of printers will react to Big Brother being able to track any document back to its source remains to be seen.

© 2004 Economist Newspaper Limited


Fear and loathing in the unconscious

Irish Times, 4 March 2004
http://www.ireland.com/newspaper/science/2004/0304/714385293FTSCI04CAPTFTSC4BRAIN.html

Too strong a sense of 'self' may not be good for you, but this insight may help us to understand and ease anxiety disorders, writes Jane Qiu.

What is the feeling of fear and what causes it? We see a vicious dog running at us and our brain instantly sounds the alarm, making our heart and lungs race. We feel afraid. For Aristotle and Plato, fear was above all a physical reaction. William James, the founding father of modern psychology, proposed in 1884 that we experience emotions such as fear as a result of our bodily sensations.

This theory of emotion has been taken as a truism. Now a study led by Dr Ray Dolan, published in this month's Nature Neuroscience, has provided some scientific evidence to support it.

Dr Dolan was born in Galway and is now the chairman of the department of neuropsychiatry and neuropsychology at the Institute of Cognitive Neuroscience, University College London.

He conjectured that, if the theory of emotion was correct, then the more aware we were of our bodily sensations, the stronger the emotions we would experience. To test this, he asked subjects to perform the so-called heartbeat detection task. In each trial, the subjects heard a sequence of notes, each triggered by their own pulse. In half of the trials, the notes were delivered immediately.

In the other half, each note was delivered with a short delay. The subjects were to judge the timing of their own heartbeats relative to the feedback notes. After the task, they were asked to report how they felt.

Dr Dolan found that people who judged their heartbeats accurately tended to report negative feelings such as anxiety and depression. Positive emotions, however, were not affected.

He also identified the part of the brain that mediates this link. While the subjects were doing the task, their brains were scanned by functional magnetic resonance imaging to spot the active regions. Dr Dolan found that a part of the brain called the right anterior insula was responsible for translating bodily sensations into conscious, emotional feelings.

"People who are more aware of their bodily responses have richer emotional lives," says Dr Hugo Critchley who conducted the experiments. "Higher levels of awareness are necessary for better control of these responses.

"But paying attention to your bodily responses may also predispose worries," suggests Dr Critchley. "In fact, people with anxiety disorders often focus on bodily responses and experience them more intensely.

"The right anterior insula is not the only region of the brain that can detect bodily functions. But it is the only one where this detection is available to conscious, emotional feelings."

In other words, the right anterior insula lies at the interface between the "heart" and the "mind", and could be the gateway to consciousness. But there is still a missing link. What makes our heart and lungs race in the first place when we see a vicious dog? Scientists believe that this has to do with our previous experience and knowledge of what is dangerous and threatening. Babies do not react with fear in the same situation. Fear is a learned process.

This learning process involves our body associating two stimuli, that is, the sight of a vicious dog and the racing of the heart and lungs. This is the classical conditioning response famously demonstrated by the Russian physiologist Ivan Pavlov at the turn of the 20th century. Pavlov flashed a light at a dog and then presented it with food. After a few repetitions, the sight of the light alone was enough to make the dog salivate.

Humans learn in the same way. This is a basic mechanism that the brain has evolved to detect causal relationships in the environment, allowing it to make correct predictions when hunting for food and avoiding danger. Once we have learned that vicious dogs are dangerous, our bodies react automatically when we see one. It is these bodily reactions that give us the conscious feeling of fear.

Consciousness remains the great unsolved puzzle of neuroscience, but a better understanding of the functional processes involved is beginning to chip away at it.

© 2004 Irish Times


Emotion and memory: Thanks for no memory

Economist, 13 November 2003
http://www.economist.com/displaystory.cfm?story_id=2208626

Some evidence about how and why memories are suppressed

ACCORDING to Freud's theory of repression, the mind hides memories of traumatic events in places where they cannot easily be retrieved, in order to prevent overwhelming anxiety. It is these “repressed memories” that the memory-recovering techniques beloved of some psychiatrists aim to unearth.

The existence of repressed memories is taken as a truism by psychiatry. Unfortunately, it has never been verified by rigorous scientific experiment. And that is not a matter of mere academic interest, since memories apparently recovered by psychiatric techniques such as hypnosis—particularly memories of childhood abuse—have sometimes been enough to put people in prison, even when there has not been any corroborating evidence. Moreover, even in cases where an individual has undoubtedly witnessed something traumatic, the reliability of his memories can be critical to convicting the true perpetrator. Witnesses frequently disagree, and this may reflect the way memory forms. Some actual data on the relationship between unpleasant experiences and memory would therefore be welcome.

In this week's Proceedings of the National Academy of Sciences, Bryan Strange of University College London and his colleagues provide some. Rather than abuse their experimental subjects, though, they merely showed them streams of words on a computer screen.

Totalless recall

Some of these words (murder, massacre and so on) had bad connotations. Others (meeting, gathering and conference, for example) were emotionally neutral. The subjects of the experiment, who did not know in advance what was required of them, were asked to look at the stream, which was presented one word at a time. Then, when they had been shown it, they were asked to recall the words in it. In the past, this technique has shown that emotionally charged words are more likely to be recalled than neutral ones. What Dr Strange wanted to look at was how well people remember neutral words adjacent to the emotionally charged ones in the stream. He discovered that words immediately preceding emotionally charged ones were less likely to be remembered than normal.

Intrigued, he pushed a little further. Previous work had established that emotion-associated enhancement of memory is caused, at least in part, by the action of stress hormones, in particular norepinephrine, on a part of the brain called the amygdala. He wondered if a similar mechanism was at work in the emotion-associated memory loss the team discovered.

The action of norepinephrine on the amygdala can be blocked by a drug called propranolol. When the researchers repeated their experiments on volunteers who had been dosed with this drug, they found, as expected, that those volunteers did not remember emotional words any better than neutral ones. In addition, however, they found that memory for neutral words which preceded emotional ones improved.

The team was also able to draw on evidence from a patient who suffers from Urbach-Wiethe disease, a rare genetic disorder that can cause damage to the amygdala. They used brain-imaging techniques to confirm that her amygdalas (people actually have two, one in each hemisphere of the brain) were, indeed, damaged. They also measured her cognitive functions—intelligence, attention and both short-term and long-term memory—and found that these were normal. But her memory was not affected by emotion; she remembered emotionally charged and neutral words equally well, regardless of the order they were presented in.

The memory gap

The kind of memory Dr Strange studied is called explicit memory. It concerns facts and experiences—knowledge that can be recalled by conscious effort and can be reported verbally. Researchers believe that explicit memory is formed in several steps. The first is translating newly learned information into so-called neural correlates. This does not involve permanent changes to the brain's structure. In the second stage, consolidation, structural changes such as the formation and destruction of connections between nerve cells take place. This process involves the expression of genes and the synthesis of new proteins, and Dr Strange suspects that emotion interferes with these biochemical events. As a result, no memory is formed.

Another line of evidence that supports this interpretation is work on post-traumatic stress disorder (PTSD) carried out by Roger Pitman, of Harvard University. Dr Pitman recently conducted a trial to see if propranolol could prevent the development of this disorder, which afflicts those who have been exposed to horrific events, such as battles or plane crashes, with emotionally disturbing flash-back memories. He reasoned that excessive amounts of stress hormones released at the time of a traumatic event might be responsible for overly strong memory formation. Because memory takes time to form, he conjectured that drugs which block the action of these hormones soon after the trauma might decrease the intensity of the memory. This turned out to be true: a course of propranolol started shortly after an acute traumatic event was able to reduce the symptoms of PTSD one month later.

On the face of it, there is something slightly contradictory about these results. It is odd that the amnesia observed by Dr Strange is for events just before an emotionally charged incident, when what is actually desirable is to wipe away any recollection of the incident itself. But a simple laboratory experiment using what are, after all, ultimately harmless words, is not the same as a case of child abuse or the horrors of war. And it seems clear that the amnesia, as well as the memory formation, is in some way a result of the stress hormones.

What is undoubtedly true is that memory, like everything else in biology, is an evolved, functional response. If individuals tend to be better off by not remembering certain things, natural selection will tend to construct their brains that way. Indeed, the existence of post-traumatic stress disorder suggests that individuals are better off without those memories. And in fact, most people do come out of trauma with their psyches intact, so it is possible that what has happened to PTSD sufferers is that the memory-prevention mechanism has gone wrong.

Freud might thus have been right about the reason for what he thought he had observed about trauma and memory. But it looks as though he was wrong about the mechanism. The evidence, though limited at the moment, suggests that memories are not repressed. Rather, they are never formed in the first place. Obviously, no psychiatric technique can recover something that was not there to start with. That is something of which the courts should be acutely aware when they assess the credibility of witnesses. It is also something psychiatrists may care to ponder when they are trying to dredge up “forgotten” childhood memories.

© 2003 Economist Newspaper Limited

