
In Defence of, the Much Maligned, Twitter

“For any women who are compelled, against their wishes, to wear a Hijab, I would fully support such a notion [to arrange a #TakeOffYourHijab day in solidarity with the Iran protests]. Similarly, I would not like to see any woman compelled, against her wishes, to remove her Hijab either”.

I tweeted this on 31st December last year, in response to the suggestion by – counter-extremist, author, broadcaster and Founding Chairman of Quilliam – Maajid Nawaz that what he calls the ‘regressive left’ would not support a Take Off Your Hijab Day, even though they have been vocal in their support of World Hijab Day. It’s an uncontroversial response in what I think was an interesting and important debate (one that had been inspired by the Iranian protests which were ongoing at the time).  However, it’s not the content of this debate that I want to discuss here but what happened next, and how it caused me to reflect on my overall experience of Twitter (and online communication in general).

I’ve had loads of debates and disagreements on Twitter.  These have covered a whole range of subjects and have involved people from a wide range of political backgrounds.  I’ve debated with left-wing Jeremy Corbyn supporters about media bias and Donald Trump supporters about gun control, but the issue of Muslim women wearing head coverings, and the comments that I made about it, seemed to inspire a level of hostility that I hadn’t encountered on Twitter before.  Now, I should say right away that, although it felt abusive at times, what I experienced was still extremely mild compared to what others – notably women – can, and do, experience on a disturbingly regular basis.  Nevertheless, it was still quite shocking and, while I’m not one who takes offence very easily, it became pretty overwhelming.  This was partly because of the frequency with which the criticisms came, but it was more to do with the nature of the onslaught.  My points were largely being ignored in favour of increasingly personal attacks.  Eventually, feeling deflated and tired following twenty-four hours of Twitter exchanges, I muted the conversation (meaning I could only view my own previous posts, but would not see anything else) and reported some of the most abusive participants, thereby bringing my role in the discussion to an end.

This spiral into uncivilised discourse all seems rather predictable.  It’s a common trope to point out the negative and harmful effects of Twitter, and other forms of social media; it is often discussed by the public and widely reported upon by the media.  I don’t want to play down this aspect of social media; it is real, it can have extremely severe consequences and there has not yet, in my opinion, been anywhere near enough done to address it – by either the companies involved, various governments, or society in general.  Online bullying, shaming, threats of rape, and the spread of destructive ideologies are just a few examples of a problem for which endless discussion has led to little in the way of meaningful action.  Nonetheless, my experience, as described above, affected me in a way that was as intense and vivid as it was surprising.  My initial weariness passed quite quickly, and what I was left with was the realisation that the overwhelming majority of my experiences on Twitter have been positive.  Sometimes deeply so.

Sure, as I said before, I’ve had lots of debates and arguments that have often felt intense and fractious, but even these have been positive in one way or another.  Even in some of the most impassioned debates, people have been civil and have tended to focus on the points being made, rather than resorting to personal insults.  Inevitably, such encounters have ended with an agreement to disagree and a mutual well-wishing from each party.  To my mind, the point of such arguments is not to change anyone’s mind – the chance of being successful on a platform like Twitter is minuscule – but to allow parties of differing political persuasions and opinions to understand why someone might think differently to them.

While I don’t doubt that there is a problem with some people swaddling themselves in the safety of their carefully constructed echo-chamber, this hasn’t been my experience.  Brexiters and Trump supporters regularly respond to, and challenge, things that I have written – and I’m always pleased when they do.  For my part, I try very hard to stick to a few simple rules, which include never passing comment on personal features and traits, and never ridiculing people for spelling and grammatical errors.  My biggest weakness, I have to admit, is a tendency for sarcasm.  I am frequently sarcastic on Twitter – much more than I am in ‘real life’ – but I do think it serves a useful purpose.  I try not to be sarcastic about the things detailed above; instead, I usually use sarcasm to highlight what I think is a logical error in someone’s argument, or just to try and be humorous about something frivolous.  What the former often achieves is the provocation of a response, in a way that blandly pointing out a perceived mistake rarely does, meaning that the issues can then be discussed in greater detail.

Of course a large part of Twitter activity doesn’t involve abuse, or politically infused arguments; most of it consists of superficial attempts to provide stimulation of the neurological pleasure receptors:

Post something that you hope is interesting or funny.

Receive a ‘like’.

Experience an instant, but short-lived, feeling of satisfaction (or not, if your post doesn’t get any response at all).

Despite this apparently shallow cycle, Twitter (and the wider world of internet communication) can be, and frequently is, the source of meaningful personal encounters and opportunities that might not otherwise be possible.  In the spring of last year, I was struggling to find the motivation I needed to finish an assignment.  I was reading the news, making repeated trips to the cupboard for snacks, listening to music, staring into space and, naturally, checking Twitter.  As part of this particularly long bout of procrastination I constructed and posted some frivolous tweets – hoping, of course, for another short-lived hit of dopamine.  One such tweet was a comment on my current efforts at procrastination, alongside a wish to obtain just a small portion of the apparently endless levels of energy (as anyone who follows her work will know) of – Art Historian, Oxford University lecturer, author, TV presenter and enthusiastic Tweeter – Dr. Janina Ramirez.  These kinds of tweets rarely get any response at all, so I was surprised when Janina replied.  Although it was a small gesture, I was struck by the kindness of it.  I wasn’t commenting on something she was trying to sell, and she didn’t need to respond; I was quite happy throwing tweets into the, usually unresponsive, abyss.  Instead, I received my sought-after hit and also enjoyed a fresh wave of motivation, with which I was able to complete the assignment (for which I received my highest mark of the previous few years).

Anyway, to take a sharpened cleaver to a rather long story, this simple sharing of Tweets led to my attending the wonderful Gloucester History Festival, where I was able to chat with Janina, who is the President of the festival, along with some of her supporters and friends – a few of whom I had briefly communicated with on Twitter before.  One of the main things that struck, and surprised, me about this experience was how easy it was to speak to those people I had met previously on Twitter.  Although I don’t avoid crowds (I quite enjoy them), I don’t usually feel very comfortable meeting a lot of new people and can be perfectly happy staying in the background, either on my own or with a small group of friends.  On this occasion, however, the joy of meeting people who I’d only known online, and getting on well with them, was quite emotional and even a little overwhelming.  Since then I regularly converse with the people I met there, and now consider them (Janina included) to be friends.  That such meaningful relationships are possible from relatively flippant tweets is a wonder, and stands firmly in opposition to the characterisation of social media as a vacuous and futile cesspit.

My very involvement with the Middle Way Society, and the friendships I’ve made within it, were also made possible through online communication.  If I hadn’t become involved in a debate about the definition of religion on an online forum several years ago then I wouldn’t be writing this, and nor would I have subsequently had so many wonderful opportunities.  My experience of meeting founding members Robert and Barry (at what I thought was a Secular Buddhist UK retreat, but was in fact a kind of committee meeting for a soon-to-be-no-more organisation), was similar to the one I later had in Gloucester.  We’d all been involved in several discussions on the aforementioned forum and, on eventually meeting in person, we seemed to know each other better than I had expected.  Robert, Barry and Peter (who I met at the same time online, but later in person) quickly became dear friends.  Following the ensuing foundation of the Middle Way Society, I was asked if I’d like to join, which – after some hesitation (I’ve always been wary of becoming part of a ‘group’) – I agreed to do, as well as agreeing to become a member of the committee.  This has meant I’ve been able to push myself and achieve things that I never thought I would – like writing blogs, for instance.

There are many problems with Twitter (and social media in general), but there are also many positives. We are still learning how to use this relatively new form of interaction; we are still immature and naïve.  Even so, I feel confident that we’ll learn how to make use of social media with more maturity and care than we currently do.  Part of the problem is that this new frontier of communication gives the illusion that we are not dealing with embodied human beings, but lines of text generated from an abstract source.  This has the effect of reducing our sense of social responsibility and shielding users from the effects that they have on others.  Social conventions and restrictions can, when implemented wisely, serve as cohesive and stabilising forces.  These have yet to develop fully, or effectively, in cyberspace and it remains difficult to predict what form they will eventually take, but I nonetheless believe that things will get better.  By highlighting and encouraging that which is beneficial, as well as highlighting and challenging that which is harmful, we can begin to negotiate a Middle Way between the extremes of an imagined online utopia on the one hand and an online world that is categorised as a threat to society itself on the other.


You can hear our 2016 podcast about public shaming on social media with journalist and author Jon Ronson here.


If you are, or know someone who is, experiencing online abuse then these links provide advice on what you can do:

http://www.stoponlineabuse.org.uk/

https://www.amnesty.org.uk/press-releases/more-quarter-uk-women-experiencing-online-abuse-and-harassment-receive-threats

https://www.vice.com/en_uk/article/bjp8ma/expert-advice-on-how-to-deal-with-online-harassment


All pictures courtesy of Wikimedia Commons and licensed for reuse.

From Medieval to Modern Medicine: A Journey of (Not So Straightforward) Progress.

During a recent(ish) podcast, in which Peter Goble and I discussed issues surrounding the experience and management of pain, I suggested that the distinctions between mind and body – which have existed in modern medicine – are beginning to be broken down; that scientific medicine was embracing holistic ideas and practices with more than mere lip service. This, in part, has been in response to the apparent rise of ‘holistic’ or ‘complementary’ therapies (I say ‘apparent rise’ because I think that therapies offering alternatives to the mainstream have been popular in one form or another for a very long time). Despite harbouring some doubts about such therapies, usually regarding their underlying theories or their general efficacy, I have long thought that the tendency to treat the ‘whole person’ rather than focus solely on specific diseases is a good one. If we take lung cancer, as one obvious example, then it’s right to say that it’s a disease that can be identified in one area of the body and treated locally. If it’s found early enough it can even be surgically removed and the patient can be ‘cured’.  All of this can be achieved without much thought for the individual involved, but it shouldn’t be. There are many reasons why a holistic approach should accompany (or rather form part of) the medical approach. A person’s lifestyle or environment can be manipulated to aid recovery, or even help reduce the risk of getting lung cancer in the first place, and the person’s emotional needs should also be considered. Getting lung cancer is not just a physical event; there will likely be considerable emotional effects too – which, like physical symptoms, will be different for each individual.

Such things are increasingly being contemplated and acted upon by the medical community, which is interesting when one considers that the humoral model had been doing this for centuries, before being rejected, roughly 200 years ago, by the scientific model.  From the 10th to the 19th century CE the established medical orthodoxy was based, almost entirely, on the ancient ideas of thinkers such as Hippocrates, Aristotle and Aelius Galen.  This system was based on the belief that the human body consisted of four fluids (or humours): Yellow Bile, Pure Blood, Black Bile and Phlegm.  Each humour had unique properties and was related to factors such as Aristotle’s four elements and the four seasons of the year.  So, the properties of Yellow Bile were considered to be ‘hot and dry’, meaning that it was related to the ‘element’ fire and the summer season.  Phlegm, on the other hand, was ‘wet and cold’ and thus associated with ‘water’ and ‘winter’.  Each person had an optimum ratio of these four humours, which was specific to them; personality, emotion and physical condition were all determined by this ratio (or complexion).  One’s health was the product of one’s complexion; if the ratio of humours became deranged then ill health would follow.  Factors such as environment, food or the position of celestial bodies could all alter the amount of each humour.

Diseases were not thought to be specific entities in and of themselves.  Every incidence of disease was specific to the person who was suffering, and thus treatments were tailor-made to address the specific conditions that were responsible.  If a set of symptoms was thought to be caused by a surplus of Yellow Bile, then any treatment would have the opposite properties of cold and wet.  Such treatments could include a prescription to change the properties of one’s environment or diet, as well as medical concoctions and surgery.  By considering psychology, physicality, lifestyle and environment as deeply interrelated factors, thereby focusing on the whole patient as an individual, humoral medicine was truly holistic.  It was impressively versatile too; from Christianity, through the emergence of human dissection, to the Enlightenment, challenges to the ancient system came from many sources.  Often, such challenges would be integrated into the existing theory.  God became the primary cause of disease, causing it and allowing it to spread as a punishment for sin, and new treatments based on chemical experimentation were added to the long list of remedies and concoctions.  What did not change, and what was not readily challenged within the mainstream, were the core ideas of the classical scholars.  It was widely believed that the work of those such as Galen could not be bettered, only expanded upon (although this too was a matter of debate).  Even when human dissection showed that anatomy differed from what Galen proposed (Galen only dissected animals) it was frequently assumed that the anatomist, not Galen, had made a mistake.  Some scholars would even alter their descriptions to fit the Galenic sources.  Mainstream medicine spent over 1000 years based on a largely unchallenged appeal to authority.  Those who dared practise outside of its dogmatic sphere could find themselves the unfortunate victims of persecution.

A combination of factors (theoretical, technological, political & social), occurring up to and throughout the late 18th to the mid 19th century, eventually led to the decline of this long-lived classical theory.  The emergence of increasingly scientific medical theories led to a general shift in focus from the patient – as an individual to be treated as a whole – to a specific part of the body or an external, disease-causing, entity.  As such the patient, in many cases, came to be viewed as an incidental part of the disease process.  That’s not to say there was a clearly defined shift from the ways of the old to those of the new; there rarely is.  Nor was there a move from a wholly holistic practice to one where such considerations were completely absent.  Nevertheless, the medical community was becoming increasingly specialised and all too often the human being was becoming lost in the detail.

I’m not going to argue that this process shouldn’t have happened.  The tendency to specialise and focus on diseases as distinct entities and specific parts of the body has given us incalculable benefits.  If faced with the prospect of a Tuberculosis outbreak, I’ll take the scientific explanation and subsequent course of action over that of a humoral practitioner any day.  Similarly, if I ever need complex cardiac surgery and I’m given the option of a surgeon who is warm, kind and empathetic with an above average mortality rate, or a sociopath with a rude, unpleasant bedside manner who has a very low mortality rate, I’ll take the latter every time.  Of course, it would be much better if I could have a surgeon who combines the best of both.  This might be a bit idealised, and it would be unrealistic to expect every practitioner to experience and radiate the same levels of empathy, just as one could not expect every surgeon to have the same technical skills, but that does not mean it shouldn’t be aimed for.  A surgeon would probably think nothing of the notion that they should always endeavour to become as technically skilled as they possibly can be, but the suggestion that the same principle should apply to bedside manner might not always be met with enthusiasm.  I think that it’s an oversimplification to claim that patients are viewed merely as objects rather than individuals, but there is some truth to this, as demonstrated by this very funny video (which is only funny because there is more than a whiff of truth, and familiarity, for anybody who works in an operating department).  I’ve even fallen into such language myself:

‘Are we doing the abscess next?’

Although I’m glad to say that, in my experience, I’ve always (rightly) been pulled up on such utterances by a colleague:

‘We are not “doing an abscess”; we are treating a person who has an abscess.’

I think that there have been many improvements – from wider environmental and lifestyle concerns to the understanding that our physical and psychological conditions cannot always (if at all) be considered in isolation from each other.  Pain management services (in Britain, at least) are a good example of where services are being integrated, but there is still a long way to go.  The provision for the psychological well-being of those staying in hospital, for example, is often inadequate (a situation potentially made worse if you also happen to suffer from a mental illness) – of course a positive emotional experience will not fix that broken hip, but it may well assist in your recovery and help prevent you from developing a new-found, and avoidable, phobia of hospitals.  There are obviously financial and logistical factors at play here, which can be hard to overcome – but this is not an excuse for the wider needs of patients to be neglected.

Modern medicine has many advantages over humoral medicine.  It is demonstrably more effective at preventing and treating disease and it is not based upon such dogmatic appeals to authority.  Clearly, there is dogma and there are appeals to authority, but due to the requirements for evidence and expectations of innovation, such dogmas are short-lived – perhaps lasting a generation or so, but falling far short of the 1000 years that Medieval medical orthodoxy managed to exist.  However, the shift away from the old ideas probably went too far and our focus became too narrow, meaning that, in some respects, we have spent the last 200 or so years rediscovering some of the valuable ideas which had become obscured.  The Middle Way Philosophy is unapologetically inspired by many, sometimes apparently incompatible, sources; a ‘magpie’s nest of influences’ made up from those aspects of other ideas which, after critical analysis, have been deemed useful.  Good science and, by extension, good medicine also does this, but all too often there is hesitation, often borne from suspicion of ideas that do not fit neatly into the current orthodoxy.  There are plenty of ‘alternative/complementary’ therapies that are widely popular and don’t hold up to scientific scrutiny.  To dismiss them all, in their entirety, because of this may be a mistake.  Yes, such and such therapy might not treat what it says it treats, in the way that it claims, but that doesn’t mean there is no value to be found.  If a GP prescribes a contraceptive pill, it will almost certainly work (if used correctly).  As far as I know there is not a homeopathic equivalent to the contraceptive pill, but the extended consultation that one is likely to receive from a homeopath could provide many other benefits that a GP could not hope to achieve in a 5-10 minute consultation.
We shouldn’t be uncritically open to all ideas that come our way, or to the ones that are currently in vogue, but neither should we dismiss them out of hand (even if one aspect has already proven unhelpful).  This is not easy to do and we will continue to take wrong turns, just as we have in the past.  However, in general, I believe that we will continue to move in something like the right direction, albeit in a haphazard, uneven and uncertain fashion.  I also believe that the five principles of the Middle Way, and the wider philosophy that emerges from them, are well placed to help us avoid many of the hindrances of the past.

Picture: The four elements, four qualities, four humours, four season. From Wellcome Library, London (CC BY 4.0), via Wikimedia Commons

 

From Conflict to Integration – Social Change in 19th Century Britain

The industrial revolution of Britain was not just technological in character – there was also massive social and political upheaval, in which we are still engaged today. I have selected several examples to try and argue that this era provides a vivid example of how the integration of desires and beliefs can not only be of significant benefit to society as a whole, but provides the most effective framework from which to navigate seemingly incompatible ideas. This is written from my own, British, perspective and as such is focused on British politics and history; however, I suspect that these events had significant consequences around the world and I am also sure that similar examples could easily be found in other societies. I will refer here to several individuals and ideologies, narrowly focusing on specific features; in no way do I intend to provide a full representation of any of them. Figures such as Adam Smith were hugely influential for many reasons – of which several volumes could be written. Having said that, I sincerely hope that I have not misrepresented any individual or event in my brief summary of the economic conditions, social issues, and ideologies that have led – in large part – to the world that we live in today.

Clearly, there has been much social and political conflict throughout the history of the British Isles, but the basic Norman social structure – of a strict hierarchical pyramid, with the vast majority at the very bottom with little concept of social mobility – survived, for centuries, in one form or another – with little significant change for the majority of the population.  Then, as technology rapidly changed throughout the late 18th and early 19th century, so did the living and working conditions of much of the working population, as they moved in ever-increasing numbers to simultaneously spectacular and monstrous industrialised cities, such as Manchester – whose population had increased threefold during the first half of the 19th Century.  Life had never been easy for those living in the murky, parasite-ridden depths of society.  Yet, with severe overcrowding, increased disease and dangerous, perpetually uncertain working conditions, things must have felt as if they were worse than they had ever been – a feeling that would only have been magnified by gazing, in a rare moment of free time, at the lucky people bathing in the rapidly improving, but ultimately out of reach, shallows.

Those merchants, manufacturers, professionals and political classes were not blind to the suffering of their less fortunate contemporaries – and nor were they entirely unsympathetic. However, while, from the perspective of 21st Century Britain, I can’t help but find the general consensus for solutions, and the subsequent treatment of the labouring classes, callous and barbaric, at that time most of the people who could make a difference believed that not actively alleviating suffering was the kindest – and indeed, only legitimate – course of action. During the first half of the 19th Century it was widely believed that economic conditions should be allowed to operate freely, without restriction. This extended to fluctuations in wages and job availability – if the market demanded that wages drop, or a large proportion of the work force be laid off, then employers must be allowed to act accordingly. The belief seemed to be that, left unmolested – and pursued for one’s own individual benefit – the rapid rise in manufacture and the massive profits that this generated would not only benefit the individual but, eventually, the whole of society too. Any suffering caused along the way was regarded as unfortunate but necessary ‘collateral damage’. Displays of undue compassion and generosity might temporarily alleviate some suffering, but the collapse in the market which would surely follow would be a disaster: causing misery and suffering for an even greater number of people.

These economic ideas (which I think form part of Classical Liberal Economics) were inspired by earlier thinkers such as Adam Smith, who seemed to believe that a free market operated under the influence of absolute natural laws which were themselves regulated by a metaphysical ‘invisible hand’. Other ideas were being formed during this period, and chief among these were those of Friedrich Engels and Karl Marx. I’m not sure that either of these men disputed the claims of liberal economists, but it is clear that they anticipated a much different outcome. After visiting Manchester in 1842, and being deeply affected by what he saw, Engels first developed his ideas – one of which was the perceived inevitability of the working class rising up in violent revolution and consequently replacing the capitalist system with a new, fairer, socialist society. In the following years Engels and Marx would greatly develop this idea and, as serious and violent attempts at revolution swept across the rest of Europe in 1848, they must have felt that they were right. These revolutions were not directly influenced by the writings of Marx and Engels (who probably exerted their greatest influence during the 20th century); there were probably many causes, often specific to different areas of Europe. Nevertheless, the desire for greater equality and political reform seems to have been a common theme and this often manifested in a demand for universal suffrage (although this did not yet include women).

The events in Europe were not spontaneous; the turbulence was manifest in the decades leading up to this wave of disruption, and Britain was no exception: from the Peterloo Massacre of 1819, to the action conducted and influenced by the Chartists between 1838 and 1850. The Chartists, who sought universal suffrage and the improvement of working and living conditions, not only inspired and encouraged industrial action but also published the People’s Charter, which consisted of six primary demands for political change.  Yet despite their popularity with the working classes, and the disruption that they inspired, they were largely ignored by the political and industrial elite, considered only as a mere nuisance. In 1848, although there had been several Factory Acts that had legislated many improvements to working conditions (especially for women and children), there was little change to the social and economic structure; there was no revolution and the Chartists had failed – condemned to limp ineffectively along before disbanding two years later.

So what did happen, and how does this all relate to the Middle Way? With the violence and turmoil erupting over the channel, the obvious suffering of a great number of people on this side of the water, and the signs of growing impatience from the labourers (whose efforts formed the unwashed foundations of the wealth of the few), many, from the circles of society that could make an actual difference, began to question the status quo. There have been philanthropists and socially conscious do-gooders throughout history – with evangelical Christians and Quakers, whose efforts may have helped to inspire the thoughts of others, deserving special mention during this period of British history.  However, it was only as ideas of altruism came from wider sources that real change began to occur.

Philosophers such as Jeremy Bentham and John Stuart Mill (who has featured in Robert M Ellis’s ‘Middle Way Thinkers’ series) were both early proponents of what seems to me to be an extreme free-market system – yet both became important critics who argued for increased state intervention and rights for workers (as an aside, Bentham had always been opposed to Smith’s idea of natural laws governing economic systems).  Additionally, writers such as Charles Dickens and Elizabeth Gaskell were also increasingly outspoken about social conditions, both in their writings and – especially in Dickens’s case – their public engagements. As the voices of dissent, from the mouths and pens of so-called reputable sources, increased, so did the interest of parliament – after all, these voices actually had a vote. Consequently, some politicians began to wonder if technological/economic progress couldn’t also be conducted with, what we would call today, a social conscience – which would not just benefit the hypothetical population of a utopian future, but also the very population that was toiling relentlessly to make such progress possible at all.

Parliament fiercely debated these issues and what began to emerge, as a continuation of the earlier Factory Acts, was an integration of competing desires. Starting from accepted economic dogma, having also considered the possibility of total and violent social revolution, and musing over the possibility of altruistic policy, a series of reforms continued to increase the electoral franchise and improve the living and working conditions of the previously unheard majority – with the development of universally available civic amenities, such as public libraries. The European social revolutions of the 19th century were largely unsuccessful, with things returning to much as they were before – perhaps even worse – and the economic and industrial systems did not collapse with the increase of certain state interventions. That is not to say that the suffering stopped; we are still navigating through these extremes and sometimes veer a little too close to the edge, yet we have so far steered the ship in a largely progressive and beneficial direction – things in Britain are much better than in the 19th Century and I am confident that we will continue precariously in this direction.

In many parts of the world, however, these issues are painfully relevant, and it appears that the free market is pursued at the tragic expense of a silent majority. Perhaps, given time, this system will provide significant benefits for all involved, but is the suffering of today worth the prize? I don’t think so, but nor do I wish for a radical overthrow of the whole system. Of the six demands made by the Chartists, and ignored by everybody who could have made a difference at the time, five have since been passed into law (and in fact society has gone much further, the right of women to vote being one important example). If those in power had not been so enamoured of a dogmatic status quo, and had been willing to consider the views of those who opposed it, these reforms might have happened not only sooner but more rapidly.

I also believe there are lessons here for the leaders of multinational corporations and of the (rapidly) developing nations – if better working conditions did not cause the collapse of the Victorian economy, why should they cause the collapse of theirs? It is easy to feel helpless when faced with the plight of many of our international neighbours, but as individuals and consumers we can make choices; like the manufacturers and merchants of the 19th century, it is we who now benefit from the suffering of others. Perhaps by making the right kinds of choices we can play a small part in encouraging, not a radical overthrow of the system, but the nurturing of a Middle Way in which profits can still be made and, more importantly, social conditions can be wilfully improved. Of course, the political leaders of the more economically developed nations can exert an obvious influence too.

Another lesson that I see here is this: as voters and participants in the political system, we should use our hard-won privileges to ensure that domestic politics does not fall into a stagnant status quo. I am deeply suspicious of the so-called ‘centre ground’, which is sold to us as a kind of Middle Way: not too far right and not too far left. As our politicians all scrabble around for this Goldilocks politics, those who do not conform are pushed out of the system, and as parliament appears increasingly bland, people are, understandably, attracted to the voices calling from the peripheries – these are often unpalatable, but others (as I think the Chartists were) might be making important and useful points. It is not the job of parliament to tell us where the centre is; it is the job of parliament to take a representative selection of views – often seemingly diametrically opposed – and navigate its way between them.

If a more explicit Middle Way approach had been employed in the eighteen-hundreds, then perhaps progress would have been sooner in coming – we may never know – but we can try to apply the Middle Way to contemporary issues, both at home and further away. However, as with politics, we should not expect to find some mythical centre; rather, we should navigate between the extremes as best we can.

Images courtesy of Wikimedia Commons

The Trouble with Revisionism

Almost everything we do is in some way an attempt to improve on what went before. Even tidying up a room involves what we see as an improvement on its previous state. When we consider traditions of human thought and activity, too, each new development of a tradition tries to address a new condition of some kind and thus also remedy a defect: for example, the Reformation was a response to dogmatic limitations and perceived abuses in the Catholic church, and new artistic movements respond to what they see as the aesthetic limitations of the previous movements that inspired them.

In many ways, then, it’s not surprising that both individuals and groups gradually evolve new ways of doing things in response to past tradition or custom. What creates a problem, though, is when we essentialise that tradition and try to appropriate its whole moral weight to justify our current approach: believing that we have found the ultimately right solution, the true answer, or the ultimately correct interpretation of that tradition. When we do that, we’re not just contributing to a new development that we acknowledge to be different from what went before, but also imposing that development on the past. In effect, we’re projecting the present onto the past. This is an approach to things for which ‘revisionism’ seems to be a good label, though it’s most typically been used for those who more formally impose their preconceptions on the interpretation of history, such as Holocaust deniers. The photo here shows such revisionism in action in the Soviet Union: the executed commissar Yezhov removed from a photograph featuring Stalin.

In a sense, we’re all revisionists to some degree, since this tendency to appropriate and essentialise the past is wrapped up in common fallacies and cognitive biases that we might all slip into. We’re especially likely to do this when considering our own past, for example underestimating the extent to which our mature experience differs from our youth and projecting the benefit of hindsight onto our judgements in the past. In working on my next book Middle Way Philosophy 4: The Integration of Belief, I’ve been thinking a lot about these cognitive biases around time recently. There are many concerned with the present and the future, or with non-specific times, as well as the past, so I won’t try to discuss them all, but just a couple that focus particularly on the past.

In terms of Critical Thinking, the fallacy of absolutising the past is equivalent to the Irrelevant Appeal to History or Irrelevant Appeal to Tradition. This is when someone assumes that because something was the case in the past, it must be true or justified now. Simple examples might be “We haven’t admitted women to the club in the hundred years of our existence – we can’t start now! It would undermine everything we stand for!” Or “When we go to the pub we always take turns to pay for a round of drinks. When it’s your round you have to pay – it’s as simple as that.”

A common cognitive bias that works on the same basis is the Sunk Cost Fallacy, which Daniel Kahneman writes about. When we’ve put a lot of time, effort, or money into something, even if it’s not achieving what we hoped, we are very reluctant to let go of it. Companies that have invested in big projects which turn out to have large cost overruns and diminishing prospects of return will nevertheless often pursue them, sending “good money after bad”. The massively expensive Concorde project of the 1970s is a classic example of governments doing this too. But as individuals we also have an identifiable tendency to fail to let go of things we’ve invested in: whether houses, relationships, books or business ventures. The Sunk Cost Fallacy involves an absolutisation of what we have done in the past, so that we fail to compare it fairly to new evidence in the present. In effect, we also revise our understanding of the present so that it fits our unexamined assumptions about the value of events in the past.

I think the Sunk Cost Fallacy also figures in revisionist attitudes to religious, philosophical and moral traditions. It’s highly understandable, perhaps, that if you’ve sunk a large portion of your life into the culture, symbolism and social context of a particular religious tradition, and you then encounter a lot of conflicts between the assumptions that dominate that tradition and the conditions that need to be addressed in the present, there is going to be a strong temptation to try to revise that tradition rather than to abandon it. Since that tradition provides a lot of our meaning – our vocabulary and a whole set of ways of symbolising and conceptualising – it’s clear that we cannot just abandon what that tradition means to us. We can acknowledge that, but at the same time I think we need to resist the revisionist impulse that is likely to accompany it. The use and gradual adaptation of meaning from past traditions doesn’t have to be accompanied by claims that we have a new, true, or correct interpretation of that tradition. Instead we should just admit that we have a new perspective, influenced by past traditions but basically an attempt to respond to new circumstances.

That, at any rate, is what I have been trying to do with Middle Way Philosophy. I acknowledge my debt to Buddhism, as well as Christianity and various other Western traditions of thought. However, I try not to slip into the claim that I have the correct or true interpretation of any of these traditions, or indeed the true message of their founders. For example, I have a view about the most useful interpretation of the Buddha’s Middle Way – one that I think Buddhists would be wise to adopt to gain the practical benefits of the Buddha’s insights. However, I don’t claim to know what the Buddha ‘really meant’ or to have my finger on ‘true Buddhism’. Instead, all beliefs need to be judged in terms of their practical adequacy to present circumstances.

This approach also accounts for the measure of disagreement I have had with three recent contributors to our podcasts: Stephen Batchelor, Don Cupitt and Mark Vernon. I wouldn’t want to exaggerate that degree of disagreement, as our roads lie together for many miles, and in each case I think that dialogue with the society, and exploration of the relationship of their ideas to the Middle Way, has been, and may continue to be, fruitful. However, it seems to me on the evidence available that Batchelor, Cupitt and Vernon each want to adopt revisionist views of the Buddha, Jesus and Plato respectively. I’m not saying that any of those revisionist views are necessarily wrong, only that I think it’s a mistake to rely on a reassessment of a highly ambiguous and debatable past as a starting-point for developing an adequate response to present conditions. In each case, we may find elements of inspiration or insight in the ‘revised’ views – but please let’s try to let go of the belief that ‘what they really meant’ is in any sense a useful thing to try to establish. In the end, this attachment to ‘what they really meant’ seems to be largely an indicator of sunk costs on our part.

The MWS Podcast: Episode 24, Paul Teed on the study of history

In this latest member profile, Paul Teed, a professor of history at Saginaw Valley State University, tells us why he joined the society, what history means to him and why it matters. We also discuss objectivity, how to critically assess history, what he thinks of the film ‘12 Years a Slave’, the importance of ‘telling a story’, and how all this relates to his understanding of the Middle Way.

MWS Podcast 24: Paul Teed as audio only:
Download audio: MWS_Podcast_24_Paul_Teed

Click here to view other podcasts