Published on
9 December 2008

ID: The Quest for Identity in the 21st Century

By: Baroness Greenfield

ABOUT

Professor Susan Greenfield is Director of the Royal Institution and Professor of Pharmacology at the University of Oxford. Her other books include The Human Brain: A Guided Tour (1997), The Private Life of the Brain (2000), and Tomorrow’s People: How 21st-Century Technology is Changing the Way We Think and Feel (2003). She has spun off four companies from her research. In 2006 she was installed as Chancellor of Heriot-Watt University and voted ‘Honorary Australian of the Year’. In 2007 she was made a Fellow of the Royal Society of Edinburgh.

At this seminar, held on Tuesday 9 December 2008, the speaker was interviewed about her latest book, of the same title, by Dr. Tom Pink, Reader in Philosophy at King's College London.

Dr. Tom Pink: Many thanks for writing this very interesting book, which addresses questions to do with human identity and fulfilment about which I am deeply concerned myself, but from a very different starting point. To be realistic, I suspect that unfortunately not everyone here will have read the book, so I thought to begin with you might want to tell us in a few words what it is about.

Baroness Greenfield: Apologies to those of you who know this book from cover to cover, and also to anyone here who might be a neuroscientist. I ought perhaps to start by saying why I, as a neuroscientist, wanted to write the book, because it is not the most obvious question that someone working on the brain might ask.

What you need to know, and this is perhaps the key to everything, is that the human brain, compared to those of other species, is highly malleable and very much responds to the environment. This cannot be said, for example, of goldfish. I am sure I will not offend anyone by saying that goldfish do not have great personalities and that they are not great individuals. They have a repertoire of behaviour such that one might be readily swapped for another in case of death. The same might be said of cats or dogs, but certainly not of people.

With greater sophistication and a larger brain it is possible to escape the narrow dictates of the genes, a narrow repertoire of stereotyped behaviour, in favour of a much more personalised brain. If you have a brain that learns rather than just following instinct then you will have individual experience and that in turn translates into your becoming an individual. We know at the physical level how this works. We know that if there is stimulation of the brain, if you interact with the environment, the enhanced activity of brain cells causes them to grow branches which present a greater target to other brain cells so that you can make more connections.

I am sorry to sound so banal at this stage, but I nonetheless think crucial this personalisation of the brain through the proliferation of singular connections that no-one else has ever had in the hundred thousand years that we have stalked this planet, nor ever will have again. These endlessly changing connections are driven in turn by individual experiences, so that even if you were a clone, that is to say an identical twin, you would have a unique pattern of brain cell connections. This is what neuroscientists call ‘plasticity’. It is a word from the Greek, implying a capacity to be moulded.

So by way of preamble, I hope I have adequately made the point that your brain is very, very sensitive to events. It can be changed by the environment.

A second point… If, as I argue, the environment is about to change in unprecedented ways in this 21st century, it follows that a brain malleable to the environment will change, too. I am interested in exploring the options, possibilities, threats and opportunities that such an unprecedented new environment presents for the brain, and also the way people think and think about themselves, and therefore also about consequences for identity.

A few words only about how the environment is changing… Biotechnology, manipulating as it does the genes, will have an impact not just on one’s longevity, but also on one’s appearance and reproductive status. It could be the case – and I am not endorsing this – that in the future the normal markers between one generation and another will be blurred. The normal ways we have of judging someone’s age – by appearance, health, reproductive status and indeed whether working or not – could all, I argue, be challenged by biotechnology and smeared, if not completely flattened out. Think of it: you are a child, you are an adult, and then you are dead, without the stages that mark out the narrative to which we have become accustomed.

Next consider nanotechnology, the science of the very, very small, perhaps only a billionth of a metre in size. That presents not just new types of materials, but materials changing in their properties, and explaining it could be rather like trying to explain plastics to someone living in the Middle Ages. There is the distinct possibility of implants in our bodies. It has been rather facetiously suggested, for example, that you might be cleaning your teeth in front of the bathroom mirror with the mirror or toothbrush speaking to you and saying, ‘If you carry on like this you could lose all your teeth in ten years’ time because you have gum disease’. There is a very sensitive interaction between your most intimate, previously covert, body processes and the outside world that might offer an early warning if anything was going wrong. That would mean that the firewall of our bodies, how we see ourselves as a physical entity distinct from the outside world, had come to be challenged by nanotechnology.

Biotechnology, I suggest then, might be blurring the generational divide, while nanotechnology could challenge the distinction of the body from the outside world. Then, finally, we are faced most pressingly at present with information technology, imposing as it does a lifestyle in two dimensions, and challenging the line between reality and fantasy. I suspect very few people in this room are engaged in ‘Second Life’. For those of you not familiar with it, ‘Second Life’ is, as its name suggests, an alternative existence wherein you can exist as some idealised avatar on a daily basis. You can, if you so choose, be a ginger-bearded dwarf, Vlad the Dragon-Slayer or anything you want to be. There is a whole economy in ‘Second Life’. The University of Oxford has an outpost in it, and the Thomas More Institute could do likewise if it wanted. Sweden has an embassy in ‘Second Life’. There is a millionaire dress designer whose clothes are just virtual clothes designed to dress the avatars in ‘Second Life’. It is quite an intriguing phenomenon. Autistic people, incidentally, are very comfortable in ‘Second Life’, and they prefer to be there where the emphasis is on action, not on feelings or thoughts.

Take also the standard computer games. One fear I have is that a three-fold increase in Ritalin prescriptions for Attention Deficit Disorder over the last 10 years may in some way be linked to a world which places a premium on a short attention-span. A lot of my book is concerned with the impact of information technology, that is to say, with screen-life and computer games, and with changes to the way people think, attitudes to risk, abstract concepts, or process over content.

I shall wind up for the present by saying that if we have an environment in which the distinctions between the body and the outside world, between the cyber-world and reality, between one generation and the next are blurred, then we might think and feel in very, very different ways, and that is something upon which I speculate in the book. I was concerned not about whether I was right or wrong, not about whether I should be hailed as a doom-laden dis-comforter, but rather that we are complacently sleep-walking into these technologies and then crying foul once it is too late. I have also compared the materialism of the Western world in the 20th century with strong ideologies, differing belief systems – religious and non-religious – and political ideologies as well, to locate alternative options for identity.

Dr. Tom Pink: Thank you. I suppose that, speaking as one who does not really do science, who just reads about it with a certain nervousness at times, one thing that interests me is how you as a brain scientist approach what is surely at some level the study of human nature. What assumptions about the human side of things do you bring to your study? How far do you think those assumptions delineate fairly fixed human parameters?

Baroness Greenfield: Let us start with the composite term, ‘human nature’, which I think trickier than it seems at first sight. We all think we know what human nature is, but if we try to define it we very quickly run into problems. The term ‘human’ somehow disenfranchises, by definition, the rest of the animal kingdom. ‘Nature’ implies something ubiquitous in time and space.

What is it that all human beings for the last hundred thousand years have in common that chimps do not have, irrespective of where they live or lived, of what they do or did, in whatever era they lived? A scientist, lacking a theological framework, is hard pressed to put a finger on it. After all, animals eat, copulate, sleep and so on, even if in different ways. Animals also think.

I have read a very interesting book of which you may have heard, and I reference it a lot in my own book. The author is Steven Mithen, an archaeologist, and the title The Prehistory of the Mind. The book explores, from an archaeologist’s point of view, what is special about human beings compared to, let us say, chimps. Mithen had the great insight, which for me rings true, that chimpanzees, even though they live in complex hierarchies and are very dextrous and intelligent, are never to be seen with symbols of tribal status around their necks. Neanderthals did this, and they also had cave art.

This points up the fact that we have the ability to think metaphorically, that we can see one thing in terms of something else. Although chimps, after arduous training in the lab, can eventually use sign language of some sort, that is not their normal proclivity as it is for a two-year old child. Language is an example, along with art and necklaces, of seeing one thing in terms of something else – ‘status’, if you like, or a symbol, something standing for something else. For me that is what distinguishes humans from the rest of the animal kingdom, even primates.

That makes science possible. It comes down to the connections. The chimp brain has, compared to that of the human, a relatively small front. We have a pre-frontal cortex exaggerated through evolution way beyond what the size of the brain as a whole would predict. This front portion of the brain is rather like a kitchen-sink for neuroscience, in that everyone attributes to it lots of fancy activities because it does not seem to do anything day-to-day or normal.

A man called Phineas Gage, working on an American railway gang in the 19th century, had a tamping iron driven through his head by an explosion. He was, so to speak, normal afterwards, apart from the fact that his character changed. He went back to work but he was no longer a team-player. Later he earned his living as a fair-ground freak showing off his wound, then took to the bottle and died of alcoholism.

The human pre-frontal cortex is quite sophisticated, and in my own view it makes possible very specific associations, the imposing of time and space frames of reference, and very clear episodic memories, which are beyond small children and animals.

It is all about thinking symbolically, which is possible, at the physical level, thanks to our ability to form connections in a way that animal brains cannot. Symbols in turn relate to identity, because, as the 20th-century psychologist Edward Bernays realised, you can persuade people to buy things they do not need if these things ‘say something about them’. In the early 1920s, when women did not smoke, advertisers featured pretty girls holding lit cigarettes with the banner, ‘The torch of freedom’. The implication had nothing to do with the quality of the smoking experience, but was rather that if a woman had a cigarette it showed she was free. Today the kitchen you have says something about you. Think of the slogans you encounter that exploit your desire to be an individual or that suggest ownership of an article will say something about you. Bernays, incidentally, was Sigmund Freud’s nephew, and it is not surprising perhaps that he shared his uncle’s ideas.

I want to extend that further, even if I can see an elephant trap opening up before me here. It has struck me that behaviours as well as objects might say something about persons. That in turn has led me on to thinking about our using the phrase ‘human nature’ as an excuse when we have done something we do not want to justify. We say, ‘Oh, it is just human nature’. Much of what we do that is ‘just human nature’ might, in fact, be an exaggeration of ordinary animal behaviour, but behaviour that has become symbolic: eating a lot of food because you are miserable, for example. It is not that you need the food, but it will say something about you if you are very thin or indeed, in certain cultures, if you are very fat. It will say something about you if you go in for excessive sleeping or excessive copulation. I am sure you can see where this is going.

I considered the seven deadly sins as examples of human nature, where the sins were exaggerations of normal human behaviour that had become symbolic and therefore special to humans. This would say something about the humans concerned, about their status or lack of it. I know I am sticking my neck out here, but those seven deadly sins are ubiquitous in humans, regardless of cultural differences, even if the cause of the jealousy or envy in a particular case might mean little to us here today.

It all ties in. Human nature is about symbols, and symbols cannot be reduced to Burberry umbrellas: they can also be certain types of behaviour that you might refer to as sins.

Dr. Tom Pink: You discuss three types: the Someone, the Anyone and the Nobody. They all seem to be humans, but the symbolic seems particularly important, if I have not misunderstood this, in the Someone. As you have just described things it sounds as though the symbolical is one way in which we express ourselves as individuals…

Baroness Greenfield: It is precisely that which the Nobody lacks.

Dr. Tom Pink: But not the Anyone, because there is a use of symbols not to assert yourself, but rather to assimilate yourself to a community of fellow symbol-users. Humans need not always be centred on what is specifically human; it may be very much part of the story of human nature that we are animals, so that the commonality we have with animals may be important. It can be very important when considering something to which you give a lot of attention, namely fulfilment. You use the word, I think, quite often in your book. When people have a story about human nature, it usually comes with a story about fulfilment. However, in some ways the story you are telling about humans as symbol-users is not really one about fulfilment. In fact at one stage you mention one who has individuality without fulfilment.

Baroness Greenfield: I have painted three scenarios, and I shall concentrate on just one of them, as Tom has asked. Let me just give an overview first. The Someone scenario is characteristic primarily of 20th-century and Western cultures, in which expression is made by means of symbols, whether behaviours or possessions: you do want to be a Someone, do you not? Who, after all, likes the idea of being a Nobody? Much has been written about this in a book called Affluenza by Oliver James, a really insightful psychiatric examination of why people should be unhappy.

The Nobody, as the name suggests, lives saying ‘Yuk!’ and ‘Wow!’, ricocheting off the moment. We are all Nobodies when we are just born, since small children are passive recipients of sensation. This can be recapitulated when we have a ‘sensational’ time rather than a ‘cognitive’ time: downhill skiing, dancing, fine wine, or in some cases psychoactive drugs which mess up the connections so that they no longer work properly, producing a regression to William James’s booming, buzzing confusion of early infancy.

And what fascinates me is that people actually pay money to abrogate the sense of self. I argue that the computer world might infantilise the brain and keep it in the stage of raw sensation and pure hedonism where things do not mean anything, where it is the experience that counts rather than the content.

The Anyone scenario is a collective narrative of ideology, religious or otherwise, where the individual is subsumed in the greater collective.

Let us return now to the Someone scenario. We could go on owning more and more symbols, but at the same time the World Health Organisation is predicting that the biggest disease of this century will be, not AIDS, but depression. With Oliver James we might ask why people with lots of material possessions have psychiatric illnesses and remain unfulfilled. We are faced by individuality without fulfilment. There is no point in having the kitchen that says everything about you, that says how individual you are, when you find on getting home that your neighbour has exactly the same kitchen. There follows a kind of arms race for individuality. If the goal is the expression of individuality through such symbols then failure is certain, because the same symbols will sooner or later become available to other people, too. The result is ever-increasing unhappiness.

Dr. Tom Pink: At the same time it does look as if some people might be fulfilled otherwise than by realising themselves individually. In the case of Anyone you point to that fulfilment without individuality. How does it work?

Baroness Greenfield: The narrative there is collective. Let us take the case of a political ideology, rather than a religious one. In Marxism we have the story-line of the worker striking up against an evil middle-class bourgeoisie. It is very thrilling and exciting to be part of that story, especially when you are on the right side, as one of the workers. I conjecture, as a non-Marxist, that it might also be very comforting: marching and singing together with repetition of slogans and the exclusion of anything contradictory. It might actually reinforce certain brain connections that seem to give pleasure.

Fulfilment is a very loaded word, but for me it is something that can be attained only in a narrative. The Nobody scenario, with its hedonistic and momentary existence, will never allow for either fulfilment or individuality. The Someone scenario does not bring fulfilment even if individuality is reached. In the Anyone scenario, people may be fulfilled but without individuality. There is a fourth option…

Dr. Tom Pink: If you are fulfilled in it, what is wrong with the Anyone scenario? Is it that it is not rational, or…

Baroness Greenfield: Oh, no! I do not put a great premium on rationality myself. The physicist Niels Bohr once said to a student, ‘You are not thinking, you are just being logical’. I am not one of those scientists who holds with computational models of the brain, in fact quite the opposite, so I do not have any particular respect for the merely rational.

There is nothing wrong with the Anyone scenario, except that I personally put great weight on individuality. I respect the fact that others may not do so. On the whole, if a collective group of people subsume their individualities under some greater narrative which is benign and helpful to society, that is fine by me – unless, for example, it is a terrorist organisation – but it would not be for me.

Even if it were an option that people should at least interrogate, our individuality is what makes us humans special, even compared to primates: we have a unique narrative, a unique repertoire, a unique collection of memories and agendas and theories. Even a clone or identical twin is unique. For me the apotheosis of being human is the expression of that individuality. I find it wonderful to do so, rather than to sublimate it on the ski slopes or the dance floor or in the dining room. When I was a college tutor in medicine at Oxford I encountered students going through crises, not of personal relationships, but ones based on the fact that they had been stars at their schools, and now at Oxford they were surrounded by such stars. I would tell them: ‘Do not worry about your tutorial partner. You have only one rival, and that is yourself of last week. Worry about stretching yourself as an individual rather than about how you measure up to the others’.

I would urge that, before dismissing individuality, we should treat it with respect and think about it, because it is a vital commodity in our possession.

Dr. Tom Pink: Do you think that what we are now beginning to understand about the brain might provide a better way of being individual, or is it actually a threat to the individual? Where does the balance lie?

Baroness Greenfield: That’s an interesting question, and one which makes me nervous, because I can see we are now heading towards the ‘free will’ issue. As you may know, neuroscientists are increasingly called into courts of justice. Perhaps the Institute should consider holding one day a debate between a neuroscientist and a lawyer over culpability. Is there a gene giving a propensity to a certain crime? The short answer is that there is no such thing. Genes may be necessary, but certainly not sufficient causes. Complex traits are not locked into the gene.

There are those who would argue for a genetic predisposition to criminality. There is, of course, what has become known as the ‘Twinkie defence’. (Twinkies are some kind of American confectionery.) A murderer claimed that he had been eating lots of Twinkies before the crime, and that therefore his blood sugar level was so high that he was no longer responsible for his actions. I’m not joking! I believe the person in question actually got off the Murder One charge and had to face only a charge of Murder Two. Others point to a brain scan and say, ‘This area is abnormal compared to what we see in other people’s scans’. The problem is whether that is cause or effect. Do you have an abnormal brain scan because of your behaviour, or has that structural abnormality caused the behaviour?

My own view is that such an argument will not hold. One might nonetheless properly ask where the line should be drawn between responsibility for actions and the deconstruction of the same in brain terms. Free will is a big question. There are correlations between various physical phenomena or features and certain deviations or aberrations in ways of behaving. It is a chicken and egg question. It really is hard to know what comes first when, like me, you believe everything has a physical basis. We do know that thinking can actually change brain scans, as I show in the book.

So it is a very interesting issue, and as a neuroscientist I would not like to come down saying you can deconstruct the brain into certain areas that would allow people off the hook in legal terms because then you might end up saying, ‘Well, Osama bin Laden had a bad day on 9/11, as he had been eating Twinkies’! No one would find that very convincing, so why should he be held culpable and someone else who had also been eating Twinkies ‘not guilty’?

It does come down, once again, to where you draw the line. All of us, eventually, might have a deconstructive analysis of our different genes and predispositions. Have you seen the film The Boys from Brazil, where they clone Hitler but then have to repeat all the environmental influences as well? As a neuroscientist, I think it a very tricky question.

Dr. Tom Pink: I strongly suspect that there is going to be much less emerging from neuroscience than people suppose. The notion that there can be lots of influences outside one’s freedom on the way one exercises it – lots of causal influences – is something we knew all along, for otherwise we should not bother at all to persuade people, i.e., to seek to act as a causal influence on their acting in some way we prefer rather than in another. There are also people like Libet saying they have discovered wonderful things, but a lot of that depends not upon empirical data but rather on interpretation. People are very often importing into neuroscience certain models of freedom that are not compulsory.

Baroness Greenfield: Freedom or free will?

Dr. Tom Pink: Free will… One thing about which you write very interestingly, which is closely bound up with the notion of individuality, is creativity, a way for people to be individual in future.

Baroness Greenfield: Yes, that is the fourth option, in case any of you were waiting to hear what that was… If one buys into such a ‘sound-bite’ analysis of the options, with none of the three options previously outlined offering fulfilment with individuality, creativity does offer, in my view, a way of being fulfilled and individual. By being ‘creative’ I do not mean only writing a symphony or a book: it might be found in the way you rearrange your kitchen. I shall not say anything here about accountancy being creative!

But it can be. In everyday life, you suddenly see a connection, or a different way of doing something, and say, ‘Aha!’, or ‘Wow! Eureka! That is it! I have got it!’. This is the wonderful thing about science, even if I agree with you that science is not going to answer the big question about free will. It may answer the question of consciousness, which is another story altogether, but as always it depends upon the question you are actually asking. Creativity can, I think, be thought about in neuroscientific terms, whether or not you agree with me that it is a desirable thing to have.

You get a lovely excited tingle in science when, as someone said, you can see what everyone else can see but think what no-one else has thought. For me, it is a heady feeling to be the first person to have a thought, however banal, even if it is just about how to rearrange the kitchen, let alone how we might cure Alzheimer’s Disease.

In my view creativity can be analysed in three stages, two of which are necessary and the third sufficient. The first is actually to be able to deconstruct or to challenge dogma. Take painting (and I am not a painter): if we wished to paint this glass, it would be necessary first of all to deconstruct it into colours, shapes, textures and three-dimensionality. It must be seen in sensational rather than cognitive terms. In science you have to challenge dogma, asking, ‘But why are we assuming that?’. It is necessary to challenge something – a convention or an association – and pull it apart. That is the first stage.

Next you have to bring together things that are unusual. That is what schizophrenics often do: they make up words. People on drugs will often do the same and claim to be very creative. Children sometimes do it: I recall a picture of a purple sheep made by the four-year-old child of one of my post-doctoral students. That again is not really sufficient, because the little picture of the purple sheep is not hanging in Tate Modern and no one is claiming it is a great work of art. My own view is that it is not sufficient to deconstruct something or to bring together unusual elements – whether ideas or words or painting – but that the result has to mean something. When you look at Damien Hirst’s sheep in formalin, why is that different from the little girl’s purple sheep? Surely it is because it makes you think about certain things and see the world in a new way. You may not agree with it, but it makes you have thoughts that you did not have before. In the same way science will suddenly trigger lots of new associations. In neuroscientific terms you break conventional associations and forge new ones, and that combination triggers in turn a whole raft of new associations, both in yourself and in the beholder. As a result you say, ‘Aha! Now I understand, now I see something in terms of something else.’ That for me is creativity.

Dr. Tom Pink: Do you think that understanding the neuroscience of creativity, if we can do such a thing, is going to enable people to be creative? Is it as simple as that?

Baroness Greenfield: That is an interesting question. I do not know how many people here are familiar with Jackson’s famous mind-game of ‘Colour-Blind Mary’. You know how philosophers like to present strange scenarios that would never exist in reality to prove a point? Suppose that this Mary has been raised in a completely monochrome environment, and yet she is a brilliant physiologist specialising in colour vision. She knows everything there is to know about colour vision. Now we ask whether Mary, raised as she was in a colour-blind environment, would, if suddenly exposed to a world of colour, know anything more than she knew already, given that she knew every single fact. I think your philosopher’s question comes down to this.

Just because you know everything about something, do you really understand it? That does, of course, beg the question, ‘What do we mean by “understand”?’. If it amounts to ‘experiencing something first-hand’, then probably No. But if it means ‘understanding cause and effect’, ‘understanding sequences of events’, then Yes. It does depend on what you really mean by ‘understanding’. This perhaps touches on religious issues. Can we oppose understanding something at a visceral level to being able to understand something only in a cerebral manner?

Speaking personally, in the matter of religion I can understand things in a cerebral way, but the visceral experience eludes me, as some of my friends here know. I am quite comfortable with understanding in those two guises. In neuroscience we need to crack the issue of consciousness. As things stand, however, neuroscience tells you about processes in the brain, and helps you interpret and correlate those processes with phenomenology, with what people are feeling. It does not tell you how the water has turned into wine, or how the everyday bump and grind of physical events in the brain translates into subjective experience, which is very different.

Dr. Tom Pink: I suppose there is a big question that always arises when talking to brain scientists in this area: What has one still to understand when one knows all the neurophysiology? What is added by common-sense psychology, as philosophers often call it? You seem to see the mind as the personalisation of the brain. Could you say a little more about what you mean by that?

Baroness Greenfield: Neuroscience is bedevilled by semantics, because people use words in ways that imply they are synonymous when they are not. In the past people often talked of mind versus brain. Then there was a backlash, and some scientists have simply shrugged it off and declared, ‘Well, the mind is the brain’. That is not so. Why have two words if you need only one? You do not say, for example, ‘I am out of my brain’.

The two words do not go hand in hand, and it strikes me that the mind might be the personalisation of which I spoke earlier: the personalised configuration, with endless updating, of brain-cell connections driven by unique experiences. When you are born you evaluate the world by your senses: how sweet, how fast, how bright. Gradually these will coalesce into a recurring pattern – your mother’s face, for example, which you might address as ‘Mum’. She will feature again and again in your life, and by that route you will move from the sensory to the cognitive. Even if something more novel, more bright, more sense-laden is presented to you, your mother will nonetheless remain more significant to you. Sadly, with dementia those connections are dismantled, so that you retreat back into a sensory world where a face means nothing anymore: just an abstract image and sensation.

When we talk of ‘losing the mind’, ‘blowing the mind’, or ‘letting yourself go’, we are thinking of conditions in which, temporarily or permanently, for one reason or another, we are not using the connections. I am quite comfortable with the mind as something above and beyond mere generic brain. For me it is the personalisation of the brain in the way described.

How does this relate to what is called the soul? I used to say, ‘Well, if you say the soul is immortal, the brain certainly is not, so the two should not be conflated’. The mind, meanwhile, is part of the personalisation of the brain, so one can make distinctions. If we are to focus on semantics here, I should say that consciousness is something else again. People often talk about consciousness in the mind. In my own view consciousness is what you hope you are going to lose tonight in sleep. It is what is lost under anaesthesia. You can lose it without necessarily losing your mind. It correlates with an evanescent phenomenon in the brain: not the local hard-wired connections, but something more temporary and transient that we are now starting to study in my laboratory.

Dr. Tom Pink: Your description is very striking in that it is, in some ways, almost the reverse of a fairly venerable picture I have encountered: the most distinctively mental is associated with generality and reason, while the particular is associated with the material, as consciousness also would have been. That is, I think, because you are, unsurprisingly, very influenced by some more recent philosophy than I often study.

Baroness Greenfield: We must be careful not to equate the general with the abstract. Are you alluding to abstract concepts?

Dr. Tom Pink: Well, general concepts that can be applied to whole classes of things. It is just a way of thinking a particular thing.

Baroness Greenfield: Certainly I think we should be careful to distinguish that from an abstract concept, like love.

Dr. Tom Pink: Oh, yes, absolutely. But you can have a general concept of something.

Baroness Greenfield: Yes, exactly.

Dr. Tom Pink: Anyway, thank you very much, and perhaps it is time to move on to this evening’s general discussion.

General Discussion

Dr. Antoine Suarez: I am a quantum physicist. When you say that even thought has a physical basis, which kind of physics are you assuming?

Baroness Greenfield: We shall come to the ‘physics’ presently. What do I mean by ‘physical’? Let us take an example. There were three groups of adult human volunteers, none of whom could play the piano. The first group – the control group – stared at a piano for five days, and their brain scans, perhaps unsurprisingly, were unchanged. The next group learned five-finger piano exercises and, remarkably, in an area of the brain related to the digits, an expansion in functional terms over just those five days was seen. But the most remarkable group was the third. These volunteers were asked to imagine they were playing the piano, and their scans were the same as those of the people physically doing so. This is why I claim that a thought is something that has purchase.

Now, the actual physics of it is not my area of expertise. It is not quantum physics. Roger Penrose, as I am sure you know, has argued that this might be the basis of consciousness. I have two problems with quantum physics applied to the brain. One is that the temperature of our hot brains would make it unworkable, because it has to be very, very cold for these tiny events to take place. Additionally, the quantum physical world is one that might apply equally to the heart, or to any tissue or cells in the body. That theory requires another constraining factor that disenfranchises the rest of the body and makes the brain special.

We are talking about the physical world, the macro-physical world, the Newtonian world, and we can see physical changes if someone is asked to think.

Dr Antoine Suarez: But are you assuming deterministic physics in the brain?

Baroness Greenfield: That is a very hard question to answer. Are you alluding to the ‘butterfly effect’? It is very hard to show a very clear relationship, because – one can, I am sure, apply chaos theory to the brain – there is such a concatenation of things, all influencing each other, that a very small effect might widen out into a global effect in brain terms. Take a simple gene. If it is activated it can cause thirty or forty thousand different types of proteins to be made. Each of these in turn will have impacts on things within the brain, including the switching on of other genes. The whole thing is inter-related and complex, so complex that, as with chaos theory, tracing is impossible.

Russell Wilcox: I have two questions to ask. One is built on abstraction. You spoke of the symbolic as distinguishing the human person – a symbol user – from the rest of the animal kingdom. That is founded on one tradition in anthropology. In a sense, the archetypal system of symbol use is language. Wittgenstein said that there is no such thing as a private language. That seems to probe more deeply. What is it, then, that underpins our capacity to use and manipulate symbols? I wonder further about your speaking of symbol use in terms of building up individuality. In a sense language allows us to realise our individuality through sociability, a socialisation process. In other words the idea of individualisation or personalisation can be entirely harmonious with, and dependent upon, living in a community. Could, therefore, a distinction be drawn between individualisation and personalisation, with the latter taken to mean the process of formation of self-identity through the mediation of the other?

Baroness Greenfield: I am not quite sure what the question is about, but let me respond with a few random thoughts. The first is on the issue of thought and language. I am sure many people here have a more scholarly understanding of that distinction than I, but in my book I attempt to analyse what we mean by ‘thinking’, and what distinguishes that activity – in brain terms, or cognitive or psychological terms – from other mental processes. When you are awake, you are not necessarily ‘thinking’ all the time, for otherwise you would scarcely need the word.

To cut the story short, I believe that what ‘thinking’ has, and what mere experience or just being awake has not, is some sequence – a beginning, a middle and an end, an ‘order’. ‘Thinking’ involves what people refer to as a structured string: you move from A to B, from C to D, in some way. You do not necessarily need language for that, because animals can do it. Language helps enormously, but it is not necessary. What is needed is a sequence.

Russell Wilcox: But what about the process of abstraction, which humans can undertake but not animals? That would seem to require some sort of linguistic process. There is a contemporary philosopher, Alasdair MacIntyre, who has written on human nature, categorising human beings as ‘dependent rational animals’.

Baroness Greenfield: There are several points there. If it is accepted that you can draw a distinction between language and thinking, abstract concepts would certainly be very hard without something to stand for that abstract concept, most typically a word. In this regard, I fear for the current screen generation… Take, for example, the Robert Frost poem about two paths [‘The Road Not Taken’]. How would you show that on a computer screen? Would you show two paths? It would be ludicrous. Many in the current generation are literal – and I hate what is literal, i.e., not metaphorical – and, if you have literal, physical things in your face, what you see is what you get, and you end up taking the world at face value. Might not those living in a world that is literal end up not being able to form, or conceive, certain abstract concepts…

Russell Wilcox: That, in fact, leads on very nicely to my second point. You spoke of how we are constituted as human beings, but then moved on to what seemed to me a highly voluntarist, indeterminate scenario in which we are able to choose what we value. I, for one, do not find that very convincing. On the one hand you say it is quite possible that what fulfils us may be to some extent objectively determined, but then you say, ‘Well, if it does not do it for you, that is fine; but it does it for me’. Can those two things be entirely separate? Is there, or is there not, an objective basis upon which to say what will fulfil us as human beings?

Baroness Greenfield: Let us consider this entanglement of ourselves with the environment. What I find fascinating about the human brain is that we are in a chicken and egg situation: ‘Does the environment shape the brain and determine, in conjunction with the genes, the sort of person you are, or is there some inner you that actually imposes itself on the outside world and shapes the environment?’. It is, of course, both at once. That is to say, as our technology changes that changes the brain, but by the same token, the brain is mandating certain changes in the outside world and actually requiring the environment to supply computer games, or whatever. It is hard to say what came first: ‘Our culture is making us what we are’, or, ‘We are demanding a certain type of culture’. There is so close a dialogue between brain and environment that you cannot really disentangle them, and that would include, I think, inter-relationships…

Russell Wilcox: That is not exactly what I was asking. The question is what would fulfil me as a particular natural type, as a human being. If you have such a standard, you can see what environmental factors might inhibit that development, and which realise it.

Baroness Greenfield: Well, at one level, one might say, ‘Well, look, this is great; people pay money to do certain things, to go skiing or to dance, or to have fine food and wine, so we shall create an environment with all of that available and everyone will be happy’. I find it interesting that although people to a greater or lesser extent like to do such things in one way or another, no one but no one wants the ‘Brave-New-World’ scenario of happy pills. Well, perhaps this generation does; I simply do not know. On the whole, however, we do not. You would all feel very sad if you met someone you thought unfulfilled, even if such a person said he or she spent all the day on the beach, at the bar, or on the dance floor.

Russell Wilcox: So it is not entirely up to them, then?

Baroness Greenfield: Ah, that is a separate issue. Happiness is not happiness, as it were. It would all be so easy if there were only such a hedonistic world to consider. A favourite literary work of mine is Euripides’ Bacchae, which draws a distinction: you need two forces in life, the bread force and the wine force. If you accept that mere hedonism is somehow, although one might like indulging in it, not quite what you want, what is missing? Why do we make ourselves jump through the mental hoops? Why are all of you here this evening rather than in the bar or on the beach? It is because we are seeking something. In my view it is an understanding, a notion of self – the constant birthright of human beings. That is my suggestion…

Russell Wilcox: I do find that convincing…

Baroness Greenfield: Oh, good. Thank you!

Russell Wilcox: …but on that basis, it would have to be the same for others as well…

Baroness Greenfield: It does not have to be some kind of gold standard…

Dr. Michael Platt: There are very many people who do not realise that they are unfulfilled. I am an anaesthetist actually, and your comment about anaesthesia struck me. Actually, anaesthesia is utterly different from normal sleep. It has a different EEG. Anaesthesia seems simply to stop the cells working, so that nothing seems to work and your mind actually stops working. Some people do claim, of course, that they are conscious and that they can see themselves on the table.

I am a pain specialist, too, and I find the complexity of pain one of our great challenges. I doubt actually if we shall ever really sort it out…

Baroness Greenfield: May I say, then, a little bit about pain?

I alluded just now to the idea that we are looking at things in the brain beyond the hard-wired connections in order to try to understand consciousness, and this features in my other book, The Private Life of the Brain. There I argue that consciousness comes in degrees, as indeed we know that anaesthesia comes in degrees. If you are looking for a correlate – and I do stress a correlate, something that matches up, that cannot be called simply cause or effect – we do know now that brain cells can form highly transient, global coalitions that last for less than a second, and then disband. These cannot be seen in conventional brain scans. They can be seen in animal brains with optical imaging which is toxic to the brain. My own suggestion is that these – which I call ‘assemblies’ – expand or contract on a sub-second timescale, and with that you will have degrees of consciousness. How one relates to another we do not know, but let us just say that they do. My proposal in the book is that if you list some of the things we know about pain, the larger the assembly, the greater the degree of pain perceived.

When we speak of pain we express it in terms of other things: ‘pricking’, ‘stabbing’, ‘burning’. We also know that pain thresholds can vary diurnally, which is not due to long-term changes but rather to surges in bio-rhythms, chemicals and so on. Pain is largely absent in dreams, which I describe as a small assembly state because it is not driven by external stimulus. Pain thresholds are higher in schizophrenics, that is to say they feel pain less; I argue they have small assemblies. There is a lower threshold in depression, where, as you know, there is a co-morbidity between feeling pain and being depressed, and I argue that this is due to large assemblies, because of the chemicals involved. Morphine works on special, naturally occurring chemical messenger systems which inhibit the size of assemblies. People say they feel in a dream-like state on morphine, which is actually quite interesting.

The old anaesthetics fascinate me because they were slow to come on, and the person undergoing them experiences periods of hyper-excitability and euphoria which characterise what I call the small assembly state. Anaesthetics might work at this level of brain organisation, not on a single brain area, not on isolated cells, but at this mid-level of dynamic coalitions.

Dr. Michael Platt: We always explain that in terms of ‘de-inhibition’, if you like, of the cells. It is a bit like getting drunk. As you go deeper, you start to inhibit your other cells.

Baroness Greenfield: What I am saying is not completely discrepant with that.

Dr. Michael Platt: What you are saying is fascinating. There was something else, something that fascinates me, about which I wanted to ask you. Take someone who has neuropathic pains, say in a hand. You cannot touch the part, because the patient will always say no. If I ask the person, with eyes closed, to tell me when I touch the hand, I can actually hold my finger half an inch away from the skin, and the pain of my touching them is actually felt. When there is ‘neurological deficit’ in some part of the body, the relevant part of the brain magnifies that area. It is almost as if it grows. Do you have any explanation for that fascinating experience?

Baroness Greenfield: Well, we know that the brain is utterly in need of what we call proprioceptive feedback. It needs lots of feedback, all the time. Perhaps this is not answering your question, but I hope it may do so… Are people here familiar with phantom-limb pain? Melzack and Wall long ago suggested that there is a matrix of cells in the brain, and that, if input was not working quite as it should be, if you were not getting the appropriate proprioceptive feedback that your limbs were there, it might actually cause activity in the cells amounting to what I would call a large assembly. This would in turn result in pain.

Also fascinating is the notion of ‘ether-frolics’, and the nitrous oxide people used to take at fairs. Then there is ketamine, a drug that in high doses is an anaesthetic, and in low doses a drug of abuse…

Dr. Michael Platt: It is actually very good for pain.

Baroness Greenfield: My own suggestion is that the small assembly gives a direct hedonistic pleasure, and with the old anaesthetics you actually went through a small assembly phase as the assemblies gradually, gradually shrank.

Dr. Michael Platt: Fascinating. So we like the smaller rather than the bigger assembly?

Baroness Greenfield: The small assembly is, so to speak, the brain of the child, of the infant, and of the animal. That is to say, it is about the hedonistic, the here and now, and it does not recruit the pre-frontal cortex, which is less exaggerated in such brains. That is why I argue that the screen is so pernicious. It encourages a world of the ‘here and now’, of process rather than content, one that does not involve, encourage, or activate the pre-frontal cortex.

Dr. Michael Platt: The sale of books goes up and up, and yet people are using screens more and more. There must be some difference between reading on the screen and reading in a book.

Baroness Greenfield: I do not know how sales of books equate with literacy standards which are certainly not following the same happy trends. I am very much worried by the rise in attention deficit disorder. When you solve an IQ problem or when you are playing a computer game, the emphasis is on the experience of the moment, the thrill of winning, the thrill of seeing a connection. When you play a computer game involving rescue of a princess, you do not care about the princess. When, however, you read a book, the whole point is that you care about the princess, about what she is feeling and doing. The other kind of world emphasises sensation without meaning, whereas a book emphasises meaning but with very little sensation. You are not really having an exciting time, for holding a book is not a brilliant sensation, but it is all about cognitive rather than sensory input.

Dr. Michael Platt: There is a current theory that the brain is activating a movement long before the person thinks about it. What do you think about that?

Baroness Greenfield: Hornykiewicz, whom you will know but the others may not, was a great pioneer of medication for Parkinson’s disease, and he once came up with a wonderful expression: ‘Thinking is movement confined to the brain’. That harks back to the notion of sequencing – of a beginning, a middle and an end. Movement is a sequence of muscle contractions. I do think thinking and movement are very closely related.

Dr. Michael Platt: Are you, then, subconsciously thinking about it before you are actually moving? Does this relate to how mathematicians can produce an answer to a problem sometimes without actually knowing how they have thought of it?

Baroness Greenfield: That again touches on the idea of Niels Bohr, that without necessarily being logical you can suddenly intuit a connection. In a very creative way you can suddenly see a connection. I know Roger Penrose, and I have spoken to him about it. He looks at things in a very physical, pictorial way, to see a connection.

Dr. Andrew Hegarty: Might I ask if Tom wants to come in on anything, having listened to these other interventions since the dialogue between the two of you finished?

Dr. Tom Pink: In this matter of movement before awareness are the Libet experiments relevant?

Baroness Greenfield: I am sorry I did not deal with that. For those not familiar with Benjamin Libet, he has earned a place in neuroscience history by undertaking an experiment – in a variation of which I have myself been the subject. It is very simple. There are EEG electrodes on your skull to record your brainwaves. All you have to do is to press a button whenever you fancy doing so. Analysis of the brainwaves shows that your brain has already changed before you press the button, before even you thought you wanted to press the button. This has been scrutinised and critiqued. The free will question arises. When I did the experiment for a TV programme, the whole crew ended up arguing about free will!

Dr. Tom Pink: It is, of course, a classic case of interest in an experiment depending very much upon what interpretation you give the results. Some of what Libet himself thought is rather complicated, but some of his followers have found here what they see as an empirical disproof of free will. It looks as though a person’s behaviour is fixed, or an action fixed, before any proper decision has been taken, or certainly before any freedom has been exercised. Actually, all that the empirical data show is that what they do (the choice of button or whatever) is fixed before they have a belief that it’s fixed in a certain way. That shows nothing about freedom, unless we insert a very disputable assumption, to which Libet himself clearly adhered, that the exercise of freedom is inherently conscious – i.e., that you cannot be exercising freedom in a free decision prior to being aware that you are doing it. That is very debatable. If, of course, you do not make that assumption the empirical data show nothing.

Baroness Greenfield: You ought also to distinguish between the unconscious and the subconscious. As the philosopher, John Searle, said: ‘When I go into a restaurant, I do not say, “What are my genes going to order?”. If I want a hamburger, it is I who orders the hamburger’. Yes, you can deconstruct it all in terms of genetic predispositions, glucose levels, memories, and so on, but that does not negate this other point. If you feel that it is you who has ordered the hamburger, that is what you have done. Just because we can see the physical correlate it does not invalidate the other.

Dr. Tom Pink: I fully agree. There is a problem about the whole area of free will – a sign in fact of a much wider problem – in that people often take up scientific models and assume that you can immediately map the results onto a power of the person, showing that the power in question is no more than this or that. It is usually not a good idea to think in those terms.

Baroness Greenfield: I actually think there are two problems with science there. One is the confusion of correlation with causality – with brain scans, for example, just because an area of the brain is active during a certain task, it does not mean that it has caused the action; it is merely correlated with it. The other is the natural tendency we all have to keep life simple and to try to explain things by cutting them down in a reductionist way. My father used to say, ‘You are nothing but 10 shillings’ worth of chemicals’. I would respond, ‘Well, I am 10 shillings’ worth of chemicals!’.

I am very annoyed by the ‘nothing-but-ism’ of some people: the notion that you can reduce things to a simpler level, to the gene, or the ‘transmitter for pleasure’ (as though it had pleasure trapped inside it!), or a brain area ‘for’ this or that (as though it were an autonomous mini-brain)… It is a tendency we all have. People in the ’Sixties used to liken the brain to a computer because things were easily explained in that way. It is, however, simply wrong to try to understand the brain by reducing it to its component parts.

Dr. Andrew Hegarty: Might I just bring the conversation back to the information technology of which you were somewhat critical? You spoke of it as two-dimensional and also of the preference for process over content. Could you elaborate a little on that?

Baroness Greenfield: Certainly. In speaking of two-dimensionality I meant it quite literally: the screen is two-dimensional. People sit looking at a screen rather than moving about in three dimensions. I meant nothing fancy or metaphorical. As for process rather than content: in order to mean something, that is, to have content, one thing must be seen in terms of something else. As we develop, so we acquire meaning in our lives.

You are born as a small baby into the booming, buzzing confusion, where you are bombarded by sensations, and nothing really means anything – it just is. Gradually, you will recognise your mother’s face, and she will mean something to you. She will feature again and again and again in your life in very many episodes, and the meaning will be enhanced. I like to give the example of my brother – and he hates my doing so! When he was three years old, I used to torture him. One of the tortures was making him learn Macbeth. So, like a little parrot, he had to learn his stuff, or otherwise I would hang him upside-down by his feet. Someone once observed that there must have been quite a storm in playgroup with a three-year-old reciting,

To-morrow, and to-morrow, and to-morrow,
Creeps in this petty pace from day to day.

His one extemporisation was, ‘It is a tale told by an idiot, just like Susan’. He thought that was a very funny insert. The point I am making is that, had you said to him, ‘What does “Out, out, brief candle!” mean?’, he was only three and could not grasp the extinction of the candle as a metaphor of life and death, despite of course knowing what a candle was. He could say those words in a literal way – he had learned the facts, if you like – but he could not put one thing in terms of another with a little three-year-old brain.

What concerns me about the screen is that it is keeping people in that state of ‘What you see is what you get’, or, ‘A candle is a candle’. That is a world where things are literal, where things happen, where one thing reacts against another or people move around in an autistic way, but all without becoming embedded in an ever wider context in which one thing can be expressed in terms of something else. Unsupervised young people approach the computer screen in that way: see something, react; see something, react…

Compare that with reading a book: the author – the authority – takes you by the hand and you go on a journey. It might not be a journey you like, or one on which you want to go, but it is a journey nonetheless. Intellectually you end up in a place different from the one where you started. Then you go on another journey, and another and another. Gradually you will come to evaluate each book you read, each journey you go on, in terms of the ones you have read already, because that is what human beings do. That is what I call the conceptual framework you build up to enable you to navigate the world. Everything that happens, you set against the checks and balances of the conceptual framework you have already made. That is what education is, or should be: equipping persons to navigate their society, their culture, and their world, as well as to make sense of it all in a way that a very small baby or an Alzheimer’s disease patient cannot: all because you have an infrastructure of connections and associations. The more extensive and deeper that is, the more you can understand the world as you get older.

Dr. Andrew Hegarty: So we can speak of ‘cyber-deprivation’?

Baroness Greenfield: Yes, and my fear is that if you live in a world where you are served up one atomised experience after another, you are not going to reach understanding. I once got very upset with the BBC for advertising something to be broadcast on revision: ‘Is your revision getting you down? Tune into this and it will be served in bite-sized chunks’. That is the very opposite of what is wanted! A fact has to be placed in a context…

Facts standing alone are totally irrelevant and boring. If I said, ‘Come to dinner, and either sit next to Bill who will tell you facts all evening, or next to Ben, who has some ideas’, only the autistic would wish to sit next to Bill. Autistic people love facts.

The whole point, for me, is that information is not knowledge. Computers deliver information, facts. It is necessary then to join up the facts into a wider concept, and that is not automatically there in software. It could be there, if people thought about it, but you cannot take it as an article of faith that it is going to happen. It is not a given that a child parked in front of a screen will navigate intelligently through search engines and attain a conceptual framework, rather than just facts and facts and facts without any necessary associations.

Dr. Tom Pink: You write a lot about computer games in the book, but I think another aspect of the internet is very important in a different way, namely the world of blogs. In some respects, it seems to me, blogs are delightfully old-fashioned: like early seventeenth-century pamphlets, only on speed…

Baroness Greenfield: Like Speakers’ Corner, in fact!

Dr. Tom Pink: …and they ricochet off each other, again just like seventeenth-century pamphlets. Here you have a world of massive interpretation, and some of it highly sophisticated. There are blogs that quote Horace…

Dr. Andrew Hegarty: In fact, even blogs written in Latin.

Dr. Tom Pink: That is, surely, utterly unlike the nasty world of computer games of which you write.

Baroness Greenfield: What worries me there is that all the time you spend blogging, you are by definition doing less of something else. Time is limited. That can be extended to social networking. Someone I know once boasted to me about having 900 friends. I think that challenges the very notion of what a friend is! I ask whether such ‘friends’ will lend you money, stay up all night for you, walk along the beach with you and talk over your problems. It is rather like what ‘texting’ is to writing a proper letter. My mother and aunt during the war used to write letters that my generation probably would not, or could not, produce. There are nuances in a letter, with the writer trying to find the right words, and put them in the right place, and express inner feelings. The abbreviated forms of texting cannot be compared with that at all. We are coarsening our relationships in a world of speed-dating and text-messaging, one in which rapidity and quantity trump quality and depth. If it does not sound too old-fashioned, I believe quality and depth are attained only with an investment of time – and if you have 900 ‘friends’ and you are blogging, you lack time.

Josephine Quintavalle: I am an ‘old fogey’ who wants to read books and always encourages everybody around to write. I wonder, however, if the problem with the computer might not be that it is still in its institutional youth and that we are not yet in a position to use it creatively. Today, The Guardian newspaper was distributing a free download of a painting: 5,000 copies of the Turner Prize-winning piece were being given away at midday. There is a lot more to come out of the computer that we have not yet even explored. We are, in a sense, still cavemen drawing on the wall.

Baroness Greenfield: I could not agree with you more, and although I know I do sound like a Luddite, I always insist that I am not. I argue passionately that web-designers and computer people should sit down with educationalists and parents and neuroscientists, and determine what we want children to learn. Should we put a premium on creativity? If so, how can we devise software to attain that objective? Are we to put a premium on the understanding of abstract concepts? How can we devise software to deliver that? How can we deliver to children a narrative, the sense of a greater conceptual framework, a sense of significance, of content, of process? The difficulties are not insuperable, but we do need to work together. The input is currently mainly visual, with an auditory element added. Imagine a world – and it could be with us quite soon – of embedded computing. That is to say, a world in which your sweater and your glasses have computers in them. I could then ask my watch for the date of the Battle of Hastings. We may encounter a shift from the visual to the auditory – a world in which we might be in permanent Google mode. Imagine it. A man called Gessler has written interestingly of Things That Think.

Josephine Quintavalle: But history has been abolished, has it not?

Baroness Greenfield: No, but it is themed now! Note, on the point of the transition to the auditory, that many people now talk on their mobile telephones a lot. What world are they in? They are not looking at anything, nor are they in the immediate world either. Are we all going to be, all the time, like those people one sees walking in the streets and talking to themselves with a glazed expression?

Alexander Boot: I have found your view of the world fascinating, particularly because it is entirely different from my own.

Baroness Greenfield: As I suspected! I thought I had had too soft a ride so far!

Alexander Boot: If I understand you rightly, you believe that the mystery of man is ultimately soluble by purely physical means.

Baroness Greenfield: No, not at all…

Alexander Boot: I am an ignoramus in your field which I know only from popular books, but when it comes to what man actually is, what the mind is and what consciousness is, I do not think we are any nearer to understanding than we were at the time of Augustine.

Baroness Greenfield: There are several issues here: the mystery of man, big questions like, ‘What is the meaning of life?’, or, ‘Why are we here?’, and so on. I do not think, however, that a very particular question like, ‘How does the brain generate consciousness?’, is in the same category as those others. Certainly, science cannot help me answer the question, ‘How do I prove to you I love my mother?’ – which I do, and which I cannot prove. Brain scans and operational acts in whatever number could not prove it. On the other hand, just because science has rather crude tools, and always has had, although they are getting better, and has not yet solved a particular problem, that does not mean it cannot do so eventually. There is a big difference between questions not tractable by science, ones which never will be, and those that are tractable but simply have not yet been answered.

Alexander Boot: Dr James Le Fanu has just written a book, due to appear in February, about the decade of the brain and the Genome Project. His argument is that they have contributed next to nothing to our understanding of what the human being is. He stops just short of saying that this state of affairs will persist for all eternity.

Baroness Greenfield: Well, you could have tried to tell Christopher Columbus not to sail beyond the harbour walls because he had as yet contributed nothing concrete. You could take such a line, but I should not find it very helpful. Why not have a go, and then see?

Alexander Boot: Because, to use your very own analogy, when we have a go at that, we are not having a go at something else, something perhaps more useful and more practically achievable.

Baroness Greenfield: It is a fact that we live in a world dominated by science and technology, so we must embrace that and try to understand it and its impact on our brains.

Alexander Boot: I would argue that it is even more suicidal to live in a world without knowing anything at all about metaphysics.

Baroness Greenfield: Well, they are not mutually exclusive. You can try to know as much as you can in as wide a sweep as possible.

Alexander Boot: I absolutely agree that they are not mutually exclusive at all, but some scientists make it sound as if they were.

Baroness Greenfield: Well, that is different. You should not conflate science with scientists!

Alexander Boot: Science is too important to be left to scientists!

Baroness Greenfield: You have highlighted the need to make several distinctions. One is that certain questions are not meant for science: ‘What is love?’; ‘Why are we here?’; ‘What is the meaning of things?’. Those issues cannot be tackled by science, but I would hold that a question like ‘How does the brain generate consciousness?’ is one of the most exciting things science can ask. I would certainly wish to carry on there, rather than give it up as insoluble. It is certainly true that the Taliban-esque views of some scientists are intellectually bankrupt. There are others, for example Francis Collins, who mapped the Human Genome and is a very committed Christian. You cannot simply equate scientists with atheists. It is certainly interesting that some very talented scientists are very committed believers.

You do need to frame questions very specifically if you want specific answers. I have tried to define ‘mind’ by looking at science, at what happens during the development of the brain, at how experience impacts on that. I think that there has been some progress made. You may disagree, but take care not to throw the baby out with the bathwater. Just because we cannot in one jump, from a standing start, solve very lofty and – frankly – ill-defined questions, why should we cease trying to make progress with very formidable tools and some great insights that we are getting in biology?

Melissa Pavey: Well, mine is not a precise question, perhaps not a question at all. I should like to turn back to one of your earlier comments. I am interested in biotechnology and in what you said about its possibly smudging generational frontiers. I believe I am right in thinking you hold that our brain reaches its full maturity well past the point when it has finished growing. That is without taking into account the fact that new synapses still form. Might one look at someone of a certain age, with perhaps two or three generations behind him or her, and not necessarily appreciate the great age in question – all the while perhaps suspecting something because of the wisdom displayed? I suppose I am really asking about identity.

Baroness Greenfield: Taking the whole nightmare scenario as I paint it, if the biotechnology, the nanotechnology, and the information technology were all at work in smearing distinctions, one might find there would be no more wisdom. If you live in a world where you spend all your time in two dimensions saying ‘Yuk!’ and ‘Wow!’ and rescuing a princess you do not care for, you will not have much insight or wisdom.