Category Archives: Philosophy

Double Vision

When we try to think critically and to open our imaginations at the same time, a kind of double vision results. We develop our awareness of potential alternatives, making our thinking more flexible, while still remaining aware of the limitations of our beliefs and not allowing our imaginativeness to slip into credulity. We develop meaning but also control belief. It seems to me that developing this double vision is one of the hardest parts of the practice of the Middle Way: but if we are to avoid absolutizing our beliefs we need to develop both meaning and belief. Those of an artistic disposition will find it easier to imagine, and those of a scientific disposition to limit their beliefs to those that can be justified by evidence: but to hold both together? That’s the challenge.

I’ve been reflecting more on the metaphor of double vision, since I heard it used recently in a talk by Jeremy Naydler in the context of the Jung Lectures in Bristol. Naydler used this metaphor in a talk called ‘The Inner Beloved’, which was about the way in which visionary men of the past have maintained images of beloved women that were actually projections of their own psyches (what Jung would call the anima). He spoke of Dante’s vision of Beatrice in the Divine Comedy and Boethius’s figure of Philosophia in The Consolation of Philosophy. These were not ‘real’ women, or had only the slightest relationship to real women, but rather became powerful archetypal symbols of the part of themselves that remained unintegrated. They were the focus of yearning, but also the path of sublimated wisdom – never possessed but always beckoning and challenging.

The capacity for double vision is central if one is to cultivate such a figure: for if a man were to project it onto a real woman (or vice versa) the results could be (and often are) disastrous. “Being put on a pedestal” probably creates conflict when the real person starts behaving differently from the idealisation – for example, needing time of her own away from a relationship. It is only by maintaining a critical sense of how the mixed-up, complex people and things in our experience are not perfect and do not actually embody our idealised projections that we can also give ourselves an imaginative space to engage with the archetype itself. Recognising that the archetype puts us in touch with meaningful potentials, showing us how we could be ourselves, and how we could relate to the world, can provide a source of rich inspiration that I see as lying at the heart of what religions and artistic traditions can positively offer us without absolute belief.

The Annunciation, a Christian artistic motif that I’ve previously written about on this site, for me offers an example of the archetypal in its own terms. For most of us, it is much easier to look for the archetypes in art, and separate this mentally from trying to develop balanced, justified beliefs about the real people we meet every day, rather than prematurely over-stretching our capacity to separate them by risking archetypal relationships with real people. That’s why lasting romantic relationships need to be based on realistic appraisal rather than on seeing the eternal feminine or masculine in your partner, and also why venerating living religious teachers like gods may be asking for trouble.

Personally, I do have some sense of that double vision in my life. My imaginative sense and relationship to the archetypes has developed from my relationship to two different religious traditions (Buddhism and Christianity) as well as from the arts and an appreciation of Jungian approaches. On the other hand, my love of philosophy and psychology provides a constant critical perspective, which also gives me a respect for evidence and a sense of the importance of the limitations we must apply to practical judgement. Sometimes I find myself veering a little too far in one direction or the other, slipping towards single vision rather than double vision, and then I need to correct my course. Too much concentration on cognitive matters can make my experience too dry and intellectual. Underlying emotions and bodily states can then come as an unpleasant surprise. On the other hand, too much imagination without critical awareness can reduce my practical resources in other ways, as my beliefs become less adequate to the circumstances.

Our educational system overwhelmingly supports only a single vision, with the separation of the STEM subjects on the one hand from arts and humanities on the other. But a single vision seems to me an impoverished one, even within the terms of that vision. Those with a single vision based on scientific training and values tend to have some understanding of critical thinking, but to think critically with more thoroughness it’s essential to be aware of your own assumptions and be willing to question them – which requires the ability to imagine alternatives. There are also those with a single vision who are willing to imagine, but tend to take the symbolic realm as in some sense a key to ‘knowledge’ of ‘reality’, and thus uncritically adopt beliefs that they can link with their imaginative values. For example, those who, like Jung, find astrology a fascinating study of meaning often seem to fail to draw a critical line when it comes to believing the predictions of astrology – for which there is no justification.

If it is not simply a product of limited education or experience, a single vision is likely to be associated with absolutisation; because absolutisation, being the state of holding a belief as the only alternative to its negation, excludes alternatives. We avoid allowing ourselves to enter the world of the other kind of vision, then, by regarding ours as the only source of truth, and by disparaging and dismissing the other as ‘woo’ (from the scientific side) or as soulless nerds (from the imaginative). Rather than accepting that we need to develop the other kind of vision, we often just construct a world where only our kind of vision is required. Then we share it with others on social media and produce another type of echo chamber – alongside those created by class, region, educational level, or political belief.

Developing a double vision, then, is an important part of cultivating the Middle Way, and thus also a vital way beyond actual or potential conflicts. A failure to recognise your projection onto someone, for example, creates one kind of conflict, but a failure to imagine may take all the energy out of it and lead to another type of division between you. We may not be able to develop double vision all at once, and it’s best not to over-stretch our capacity for it, but the counter-balancing path is open to you right now. Here are some follow-on suggestions on this site: if you’re a soulless nerd, go to my blogs about Jung’s Red Book. If you’re more of a credulous “woo” person, try my critical thinking blogs.

Pictures (both public domain): double vision from the US air force and Simone Martini’s ‘Annunciation’.

The MWS Podcast 124: Abeba Birhane on a person is a person through other persons

Our guest today is Abeba Birhane. Abeba is an Ethiopian cognitive science PhD student presently living in Dublin. She blogs regularly on topics including philosophy, psychology, feminism, and anthropology. She recently drew quite a bit of attention on the internet with an article for Aeon entitled ‘Descartes was wrong: a person is a person through other persons’, and this will be the topic of our discussion today.

MWS Podcast 124: Abeba Birhane as audio only:
Download audio: MWS_Podcast_124_Abeba_Birhane

The Third Phase revisited

Back in 2013, I wrote a post on this site called ‘The Third Phase’. This suggests that, although nothing in history is inevitable, there do seem to be some signs that our civilization as a whole may be entering a new phase of engagement with conditions. You could see that as a new kind of science, philosophy, psychology, or practice. Here is the crucial part of that post that explains the three phases:

In the medieval era, complexity was ignored because of the over-simplifications of the ‘enchanted world’ and its unresolved archetypes. We mistook projections of our psychological functions for ‘real’ supernatural beings. A supernatural world provided a causal explanation for the world around us that prevented us from needing to engage with its complexity. The medieval era was gradually succeeded by the era of mechanistic science, in which linear causal mechanisms took the place of supernatural ones. Although we began to get to grips with the processes in ourselves and the universe, this was at the price of over-estimating our understanding of them, because we were using a naturalistic framework according to which, in principle, all events could be fully explained.

We are now gradually moving beyond this into a third phase of intellectual development. In this third phase, we not only develop models to represent the universe, but we also recognise and adapt to the limitations of these models. We take into account not only what we know, but what we don’t know. The signs of this third phase have been appearing in many different areas of intellectual endeavor.

Look at the original post for a list of what those areas are. They include complexity theory, embodied meaning, brain lateralization, and cognitive bias theory. These are all relatively new developments, involving psychology and neuroscience, that come together to offer the basis of a new perspective. But that perspective is not entirely dependent on them, and is actually far older, since it is another way of talking about the Middle Way. The Third Phase may arguably have first been stimulated by people living as long ago as the Buddha and Pyrrho.

[Diagram: The Third Phase]
What strikes me, looking back at this idea more than three years after the original blog, is how simple and obvious it is. The idea of being aware of our limitations is not at all a new one. It’s just the product of the slightly bigger perspective that I’ve tried to illustrate in the diagram above. You can merely be absorbed in the ‘reality’ you think you’ve found, probably reinforced by a group who keep telling you that’s what’s real, or you can start to recognise the way that this ‘reality’ is dependent on (though not necessarily wholly created by) your own projective processes. You look at that hated politician and see the Shadow. You look at scientific theories based on evidence about the earth and see ‘facts’. Or you can look at yourself seeing either of them, and also recognise that those beliefs are subject to your limitations. In both cases that doesn’t necessarily undermine the meaning and justification of what makes the politician hated or the theories highly credible. It just means that you no longer assume that that’s the whole story.

The third phase is not simply a matter of the formalistic shrugging-off of our limitations. It’s not enough just to say “Of course we’re human” if the next moment we go back to business as usual. The third phase involves actually changing our approach to things so as to maintain that awareness of limitation in all the judgements we make. I think that means reviewing our whole idea of justification. In the third phase, we are only justified in our claims if those claims have taken our limitations into account. That’s the same whether those claims are scientific, moral, political or religious. So it’s really not enough just to claim that such-and-such is true (or false) just because of the evidence. People have a great many highly partial ways of interpreting ‘evidence’, and confirmation bias is perhaps the most basic of the limitations we have to live with.

So far people have only really dealt with this problem in formal scientific ways, but science is like religion in being largely a group-based pursuit in which certain socially-prescribed goals and assumptions tend to take precedence, even if very sophisticated methods are used in pursuit of those goals. Such scientific procedures as peer review and double-blind testing are not the only ways to address confirmation bias, and they are applicable only to a narrow selection of our beliefs. The third phase, if it is happening, is happening in science, but it is also very much about letting go of the naturalistic interpretation of science: the idea that science tells us about ‘facts’ that are merely positively justified as such by ‘evidence’. In the third phase, science doesn’t discover ‘facts’, but it does offer justifications for some beliefs rather than others, and these are acknowledged as having considerable power and credibility. In the third phase, that is enough; we don’t demand an impossible ‘proof’. Better justified beliefs are enough to support effective and timely action (for example, in response to climate change).

The third phase involves a shift in the most widely assumed philosophy of science, but it is not confined to science. It is also a shift in attitude to values and archetypes. Some of us are still caught up in the first, supernaturalist, phase as far as these are concerned, and others in the second or naturalistic phase. Ethics and religious archetypes are either assumed to be ‘real’ or ‘unreal’, absolute or relative, rather than judged in terms of their justification and the limitations of our understanding. I do have values that can be justified in my context according to my experience of what should be valued. At the same time, the improvement of those values also involves recognizing that they are dependent on a limited perspective that can be improved upon, just as my factual beliefs can be.

Perhaps what I didn’t stress sufficiently in my first post on the topic is that the third phase is not a matter of clearly-defined scientific breakthroughs. Rather, it is individuals who can start to exercise the awareness offered by the third phase, with varying degrees of consistency. As Thomas Kuhn wrote of scientific breakthroughs or paradigm shifts, they actually depend on a gradual process of individuals losing confidence in an old paradigm and shifting to a new one. But there can also be a tipping point. When it starts to become expected for individuals to recognise the limitations of their justification, as part of that justification itself, social pressure can begin to be recruited to help prompt individual reflection.

We can hope for some future time when the third phase is fully embedded. When religious absolutists stop assuming that the way to make children more moral is to drill them in dogmas. When secularists get out of the habit of dismissing whole areas of human experience in their haste to find a secular counterpart to religious ‘truth’. When promoting understanding of the workings of our brains is no longer considered suspiciously reductive. When the public is so well educated in biases and fallacies that they complain to journalists who let politicians get away with them. When evolutionists respond to creationists not by appealing to superior ‘facts’, but solely by pointing out deficiencies in the justification of creationist belief, in ways that apply just as much in the realm of ‘religion’ as in that of ‘science’. Yes, we are still a long way off the entrenchment of the third phase. We can only try to get it a little more under way in our lifetimes.

Inside the third person

My own habit when I write even the more academic of my books is to freely use the first person: “I want to argue…”. Of course I’m still trying to put forward a case that has wider significance than just for me, but the use of the first person seems a vital aspect of honesty in argument – to show that it’s me arguing from my perspective, and I’m not pretending to be God. The ‘I’ is a provisionality marker. So it sometimes comes as a shock when I realise just how much insistence on the use of the third person there is in many corners of schools, colleges and universities – particularly in the sciences, both natural and social, and for some reason also in history. Sometimes that just means lots of impersonal constructions like “it is argued that…” or “this evidence shows that…”, but when helping someone with the proof-reading of their dissertation recently I found that they referred to themselves throughout as “the researcher”. This degree of third person pretence seems very jarring to me, and the reasons I reject it have a lot to do with the Middle Way view of objectivity I want to promote.

The reasons that many teachers and academics drill their students to write in the third person all have to do with “objectivity”. The idea is that when you write in the third person, you leave yourself out of it. You’re no longer dealing with the “subjective” experiences of your own life, but with general facts that can be supported with evidence. Now, as an experienced teacher, I’d agree with the intention behind this – students do need to learn how to justify their beliefs with reference to evidence or other reasons, and learning to do this is one of the benefits of education. But I’m also convinced that this is the wrong way of going about it. Whether or not you use the third person doesn’t make the slightest difference to whether or not you use evidence to support your claims and argue your case critically – but it does reinforce the apparently almost universal human obsession with the idea that you have ‘the facts’, or ‘the truth’ – an implicitly absolute status for your claims. If you really believe that you have ‘the facts’, then the evidence is just a convenient way of getting others to accept the same ‘facts’ that you believe in, not a source of any possible change of view. The ontological obsession hasn’t just emerged from nowhere, but is fuelled by centuries of post-enlightenment linguistic tradition.

Far better, I would argue, to use the first person to own what we say, in the sense of admitting that it’s us, these fallible flesh-and-blood creatures, who are saying it. Then the objective is objective because we have argued it as objectively as we can, not because we are implicitly pretending to view it from a God’s eye view. If we really recognise that objectivity is a matter of degree and depends on us and our judgements, then it is not enough to merely protest that we don’t really mean it when we use ‘factual’ language that habitually bears an absolute interpretation. If we are to bear in mind the limitations of our perspective in practice, we need to constantly remind ourselves of those limitations. The use of the first person offers such a reminder.

Objectivity depends not on ruling ourselves out of our thinking so as to arrive at pure ‘facts’, but rather on acknowledging our role in reaching our beliefs. Recognition of evidence of the conditions around us needs to be combined with a balancing recognition of the limitations with which we are able to interpret such evidence. Neither idealism nor realism, neither mind nor body, neither naturalism nor supernaturalism: but a recognition that none of these categories are ‘factual’ – rather they are absolutizing limitations on our thinking. If we are to take the Middle Way as the basis of objectivity, we need to stop falsely trying to rule ourselves out of the language with which we justify our beliefs.

I’ve spent enough time in schools and universities to know that academic habits are not easily reformed, and that we will probably be stuck with these third person insistences and their cultural effects for some time to come. No teacher will want to disadvantage their students in an exam by teaching them to use the first person if they know that the students will lose marks if they do so. But please let’s not use or spread this unhelpful custom needlessly, and let’s take every opportunity to challenge it. To use the first person to refer to our beliefs is to connect them to our bodies and their meanings and perspectives – which is one of the prime things we need to be doing to challenge the deluded absolutised and disembodied interpretations of the world that are still far too common.


A distillation in four points

I’m always looking for new ways to get across the key points of Middle Way Philosophy in a compact list that can be readily referred to. The Five Principles on which our summer retreat this year will be based (scepticism, provisionality, incrementality, agnosticism, integration) are one way of doing this, but these five principles focus on qualities to cultivate or use in judgement, rather than on the distinctive world-view they emerge from. So here’s a new attempt to distil that world view into four very brief slogans:

  1. Meaning is body-memory
  2. Belief is assumption
  3. Justification needs provisionality
  4. Truth is archetypal

The explanation of each that follows will no doubt be rather compressed. However, the main idea of this blog is to encourage you to see these points as interdependent (each building on the previous ones), and to at least glimpse how they challenge much conventional thinking and offer new ways forward for humans stuck in that thinking. For more details on this whole way of thinking, please see first the introductory videos, then the Middle Way Philosophy books.

1. Meaning is body-memory

The embodied view of meaning tells us that meaning is an accretion of memory. But by ‘memory’ here I don’t mean anything like the data-storage analogy that people too often use to try to understand memory. Rather, whenever we encounter a new experience, we create new synaptic links connected to our whole body’s active engagement in that experience. That experience may involve associating words or symbols with it, and when we are prompted by similar words, symbols, or other associated experiences in future, we mildly re-run the synaptic connections associated with it. We thus lay down layer after unconscious layer of memories that then provide the basis of meaning-association in future, and even quite complex or abstract language draws on this embodied experience to be meaningful, via the medium of metaphorical constructions. Think about the most abstract language – a scientific paper, say, or a company board meeting. The meaning of all this language, however abstract, still depends on your body. When you have no body memories to connect with it, you cease to understand what is being said.

2. Belief is assumption

The dominant tradition in philosophy and science, which then influences the way people usually talk about their beliefs, is to think of them as explicit, but explicit beliefs are the tip of a very large unconscious iceberg. Most of our beliefs are a matter of what we assume, rather than what we have explicitly said. If you said you were hungry and then started looking at the sandwiches in a café, it would not be unfair to conclude that you believed that a sandwich might address your hunger, even though you didn’t explicitly say such a thing. Yet, strangely enough, most of the established thinking about how to live our lives just offers explicit reasons for believing one thing rather than another, rather than trying to work with what we actually assume. It is not reasoning (which always proceeds from assumptions) that will help us make our beliefs more adequate to the situation, but rather greater awareness of the assumptions with which we start to reason.

But we can only believe what we first find meaningful in our bodies, so the second point depends on the first.

3. Justification needs provisionality

How do we tell how well a belief is justified? That’s a question at the core of all the judgements we make in everyday life, in ethics, in science, in politics or elsewhere. The traditional answers all involve explicit reasons: for example, that a certain action is wrong because it says so in the Bible, or a certain scientific theory is correct because it can be supported by evidence. But we are constantly subject to confirmation bias, all of us living in our own little echo-chambers in which we seek out what we want to hear. The old ways of justifying our beliefs are not enough by themselves. We need to take into account the mental state in which the judgement is made too, to incorporate psychology as a basic condition in our reasons for adopting one belief rather than another. If we can hold a belief provisionally, so that we can consider possible alternatives, we are better justified than if we do not.

The mental state in which a belief is held is inextricable from the set of assumptions that support that belief. We can hold a belief provisionally if we find alternatives sufficiently meaningful (using our imagination). In the traditional ways of thinking dominant in philosophy and science, this way of justifying our beliefs cannot be taken seriously, because meaning is assumed to depend on belief and belief to depend on justification. In that way of thinking, reasoning comes first rather than the mental states in which the reasoning takes place, but this mistakes the tip for the whole iceberg. The third point thus depends on the first two.

4. Truth is archetypal

People are typically obsessed with ‘truth’, ‘the facts’, God, nature, ontology, ultimate explanations. Surely these things are important? Well, only in the sense that they are meaningful to us, not in the sense that we need to build up justifications of our beliefs by depending on them. If we think of ‘Paris is the capital of France’ as true and ‘Paris is the capital of Mongolia’ as false, that is usually a kind of shortcut for the thought that the first is much better justified than the other, and that we assume it in practice. But, according to the third point above (justification needs provisionality), to be justified in believing that ‘Paris is the capital of France’ I need to believe it provisionally, that is to be able to consider alternatives. Whether I actually do this or not, claiming that it is true or false adds nothing to that justification apart from cutting off the provisionality, making it the final story and closing off any further thought or discussion on the subject. Claiming that it is true or false thus actually seems to undermine one’s justification.

Nevertheless, we can respect the motive of those who seek to establish the truth (which they will do best by considering the justification of a belief against alternatives – by doubting the truth of their claims rather than asserting it). Truth can thus still be a kind of symbolic inspiration or archetype (see this blog post for examples), and not claiming to possess archetypal truth can be a mark of fully respecting it. Just as we need to avoid projecting an archetype onto someone else by thinking that they are God, or the perfect woman, or whatever (even though we may also appreciate ideal artistic depictions of God or of the feminine), we need to recognise truth as a symbol that we find meaningful in relation to our body-memory, without projecting it onto a particular set of words that we take to be ‘true’. Instead, whenever there is a discussion about whether we should hold one belief rather than another (in science, politics, ethics etc.) we can focus on justification.

We could not make sense of truth being archetypal if we did not separate meaning (point 1) from belief (point 2), recognising that meaning precedes belief rather than the other way round, and that we can find truth meaningful without believing that we have it. It’s also precisely because of the need to maintain provisionality about our beliefs (point 3) that we cannot justify claims of truth.

This view of truth can potentially transform our view of science, ethics and religion: whether we are talking about scientific facts, the good, or God, we can respect the motivations of those who value these things without accepting that any of them are actually possessed in a particular verbal formula.

The four points and the Middle Way

The Middle Way means a practice of seeking justification for our beliefs in provisionality rather than in consistency or evidence alone. To stay in this provisional zone, we avoid the absolutes of claiming truth on the one hand or falsity on the other. To do this in practice requires our mental states to be provisional, which is just as much a matter of our emotions and body as of our reasoning. It’s not a question of aiming to be in some wonderfully enlightened mental state, but simply of judging better every time by being less confined by our personal echo-chambers than we might otherwise be.

In connection with the founding story of the Buddha from which the term ‘Middle Way’ derives, we need to focus not on the final state that the Buddha supposedly achieved by using the Middle Way, but on how his judgements at each stage reflected provisionality and enabled him to move beyond the rigid assumptions of those around him. First he needed to leave the palace with its rigid ‘truths’, then also move beyond the religious world of spiritual teachers and ascetics with their ‘truths’ (which also declared the world ‘false’). If we unpack what is required for the Buddha to go through this process at each stage, it involves maintaining a sense of the meaning of alternatives (point 1), developing a greater awareness of the limited assumptions of those around him (not just their explicit views – point 2), and recognising their lack of justification (point 3). If the Buddha had at any point discovered the ‘truth’, this would have halted his progress by ending the story, but instead the story continues – indefinitely.