All in the Mind Computing Power Dynamics

Our Emerging Brave New World

The road to hell is paved with good intentions

Mental Health Mantra

If you believe the vocal lobbies, we can never devote enough resources to tackling our ongoing mental health crisis. Politicians of all hues like to champion the right of mental health patients to better care. They try to score points on the perceived lack of funding for mental health services. The subtext is that we should treat mental health just like physical health, and that it is thus the business of healthcare providers not only to check your blood pressure and heart rate, but to analyse your state of mind.

Unlike physical health, mental health is highly subjective. What kinds of moods and behavioural patterns are so dysfunctional or antisocial that they merit the proactive intervention of third-party supervisors, whether in the guise of counsellors, social workers, psychiatric nurses or psychologists? This paternalistic approach raises many questions about personal independence and freedom. Until recently we just assumed that happiness was a mere expression of satisfaction with life. Yet it is hard to detect any correlation between prosperity and happiness, except in a looser relative sense. Above all people need security, a sense of belonging and some love and affection. We often substitute the ephemeral pleasures of temporary stupefaction or indulgence for true contentment gradually won through hard work. The abundance of consumer goods and a generous welfare state have jointly undermined the great art of delayed gratification and replaced it with a sense of entitlement that can often create an emotional void and an insatiable demand for more and better.

It seems only fair to care for vulnerable members of our community. If we were talking about paraplegics, everyone would understand why their disability, paralysis of the lower body, merits some help from the rest of us. Indeed, with assistive technology most paraplegics can lead fruitful lives. However, few would choose to be cripples and most would welcome medical breakthroughs to help them walk again. If the incidence of paraplegia were to double every twenty years, we would seriously have to address the root causes, for society relies on the able-bodied to assist the physically disabled. If we are unable to look after ourselves unassisted, we inevitably depend on the goodwill of others to act in our best interests. Our personal freedom is ultimately limited by our dependence on others for our basic needs. These days few of us could be truly self-sufficient, unless we adapted to a humble existence as subsistence farmers, so a paraplegic is only relatively more at the mercy of external agencies than your average able-bodied citizen. Arguably a talented cripple able to work remotely as a writer, designer or programmer may contribute more to society than an able-bodied drug addict who cannot hold down a menial job. However, by promoting the concept of mental ill-health we greatly widen the range of people unable to fend for themselves without intrusive help.

Subjective criteria

Who exactly decides who is and who is not mentally fit? What criteria do we apply? If you can only run a hundred metres before running out of breath, are you physically disabled? Of course not, though you may be relatively unfit and should probably get some more exercise. Your doctor would probably advise you not to overdo it, to set simple attainable goals and to adapt your lifestyle slowly. However, if you fractured your spine in a horrific workplace accident, you may well lose control of your legs, and suddenly countless everyday tasks like getting dressed or going to the bathroom become almost impossible to accomplish without some help. You are not simply unfit, but genuinely disabled. A disability, by its customary definition, prevents you from accomplishing essential life-sustaining tasks. It is not a relative handicap. If you're tone-deaf, but able to speak and understand a human language, you are not disabled; you just have a relative weakness in one facet of human creativity. Musical aptitude is certainly a nice-to-have and arguably gives you an advantage in natural selection, but many tone-deaf people have led fruitful lives without requiring any special help. Tone-deafness is also a rather relative concept, as are relative intellectual deficits in mathematics, literacy or dexterity. While we may debate the causes of our relative strengths and weaknesses, modern society relies on functional and intellectual diversity. We cannot all be playwrights, musicians or comedians, but society would be dull without artistic creativity. However, it would cease to function without farmers, builders, engineers, plumbers, toilet cleaners or nurses. We can only relax and have fun once we have provided all the infrastructure, food, clean water, shelter and other amenities essential to comfortable human existence. Technological progress and societal pressures have redefined our concept of comfort. Recent technological and economic trends have revealed two paradoxes.
First, automation and globalisation have displaced millions of manual workers, increasing competitiveness and lowering wages at the bottom end of the labour market. Second, as material living standards have risen, our emotional well-being has not. Greater labour mobility may have boosted the economy, but it has led to greater job insecurity at a time when most women and men are expected to participate in the financial economy. Our personal worth is no longer measured by the roles we play in our family and community, but by our utility as a player in a dynamic consumption-driven market economy. Since the 1970s in much of Western Europe we've seen a gradual shift from practical trades to abstract tertiary-sector roles involved in endless lifestyle and product promotion as well as the micromanagement of every aspect of human interaction. The UK now has more social workers than farmers, more accountants than carpenters and more IT recruiters than software developers. Yet we all need food, furniture and mobile communication. As we lose touch with the fruits of our endeavours, we begin to lose our sense of purpose in life other than the mere acquisition of money as a means of ersatz self-validation.

Not only is employment less secure, but human relationships are more volatile and communities more fluid and transient than ever before. By most measures material living standards have never been so high, yet people are not only more indebted but, in the absence of paid employment or welfare payments, only a few pay cheques away from financial ruin, with little means to survive in the wild.

Extreme interdependence

Our current obsession with mental health is the result of extreme interdependence. A quick glance at the commonest professions in the UK reveals a rather disquieting picture. Fewer and fewer workers have any direct relationship to the production and maintenance of essential goods and services, except as managers, sales personnel or hauliers. In the UK over six million people are employed in mainly administrative roles, some requiring limited technical expertise or prior hands-on experience; over three million are employed in sales, marketing and business presentation, with only 300,000 employed in farming and fishing and around one million in manufacturing. The biggest growth sectors, however, are personal care and surveillance. The last-named sector encompasses not just policing, but social work and psychiatric services. An ageing population and technological innovation can partly explain this phenomenon, but not entirely, especially as older people are now fitter and many can live independently well into their 80s. A growing proportion of working-age adults require assistance as a result of a learning disability, mood or personality disorder.

The Human Spectrum

Until the mid 1980s psychiatric disorders only referred to extreme cases of dysfunctional behaviour. Much of the literature on the relative merits of psychotherapy or pharmacological treatment relates to individuals who posed a direct threat to themselves and/or to wider society. They accounted for under 1% of the general population, and as therapeutic care improved most could rejoin the community as normal citizens. Psychiatry had been tarnished by its association with authoritarian regimes, not least in Nazi Germany, where schizophrenics were euthanised alongside the mentally handicapped, but also extensively in the Soviet Union, where dissidents were routinely treated in psychiatric institutions. Freedom meant above all the freedom to be yourself, to be the master of your feelings and to act as an autonomous player in a wider social reality. Of course personal behaviour is regulated by social mores and a fine balance between rights and responsibilities that we learn from our family and community. However, as we gained more free time, we could unleash our individuality and creativity in more expressive ways. Not surprisingly many of the mental ailments now falling under the broad umbrella of mental illness were first observed among the professional classes. The working classes were until recently too busy working to indulge in the kind of fantasies that would preoccupy early psychotherapists. Alcohol remained the main release valve for emotional insecurity, and deviant behaviour was either managed within the community or treated as criminality.

To gain greater public acceptance, psychiatry needed a complete rebrand. As the age of self-centred narcissism deepened its roots in North American society, people became more preoccupied with their moods and feelings. New selective serotonin reuptake inhibitors such as fluoxetine, better known as Prozac, proved a huge marketing success. By the late 1990s mood-enhancing medications had not just become socially acceptable; they had helped blur the boundaries between the normal range of human emotions and psychopathology. Meanwhile concerned parents and teachers began referring boisterous children unable to pay attention in class for diagnosis with Attention Deficit Hyperactivity Disorder, treated with a seemingly tailor-made drug, methylphenidate, better known as Ritalin. In the same period we saw a rapid rise in the diagnosis of hitherto rare neurological disorders on the autistic spectrum. This craze for psychiatric labelling spread to Europe, usually accompanied by awareness-raising campaigns. Psychiatry had now donned the clothes of the progressive left, championing the cause of sufferers of these new labels and thus creating new victim groups demanding special treatment. More and more young people began to contextualise their problems in terms of a psychiatric diagnosis.

Marketing Personality Disorders

The more troublesome behavioural disorders that would have merited a psychiatric diagnosis did not lend themselves to marketing, but only to occasional awareness-raising initiatives. Nobody could claim pride in psychopathic madness or subnormal idiocy. However, people can be persuaded to claim pride in geekishness, hyperactivity, obsession, sudden mood swings or certain learning challenges if celebrities share some of these traits. Indeed many high-profile media personalities have publicised their diagnoses of OCD, bipolar disorder, ADHD, Asperger's Syndrome and even learning disabilities. These traits may have their challenges, but also their advantages, especially in creative professions. Other luminaries of the past have been posthumously diagnosed: Albert Einstein is claimed to have suffered or benefited from Asperger's Syndrome, and it's even been claimed that the multibillionaire IT entrepreneur Bill Gates has the syndrome too. As the mental health industry widens the diagnostic criteria for personality disorders, we begin to uncover traits common in almost all of us. Excellence in any endeavour is impossible without focussing on the task at hand. It's thus absurd to claim that a special interest in a circumscribed subject is in any way pathological. It may be relatively dysfunctional if it prevents us from doing more important things essential to our wellbeing, but we would have made little technological or social progress if some people had not dedicated their professional lives to specialist subjects that few others understand. Our complex high-tech society depends on hyper-specialisation, but as noted elsewhere, most specialists are involved in various aspects of communication, administration and supervision rather than in the hard science that makes our modern lives possible. By promoting the concept of neurological diversity, the authorities can now treat different groups of people in different ways.

Inevitably, some readers will feel a little confused. Most of us have friends or family members who face significant personal challenges. You may have had episodes of emotional distress yourself. Indeed one may argue that if you have never experienced sorrow, rejection or isolation, you have led a very sheltered life and will probably struggle to understand the real-life experiences of most members of our society. Should we help an anorexic girl starving herself to death for fear of becoming morbidly obese, a severely depressed teenager confined to his bedroom or a troubled young man plotting to save humanity from a contagious virus by killing his next-door neighbour because he works in a pharmaceutical testing laboratory? Of course, but we need to understand the true causes of such seemingly illogical behaviour, e.g. is the rise in eating disorders related to our obsession with perfect bodies, advertising, size-zero models and the media's obsession with obesity?

Alphas, Betas, Gammas, Deltas and Epsilons

In Aldous Huxley's prescient vision of a distant technocratic future, humanity had ceased to procreate naturally and was socially and biologically organised in five distinct castes, ranging from high-IQ but potentially moody Alphas to low-IQ but happy Epsilons. However, everyone took pride in their own caste identity rather than fretting about their relative social or intellectual status. In Huxley's Brave New World every aspect of life from conception to death was micromanaged and any psycho-social tensions were managed by the wonder potion Soma (Sanskrit for the body, as distinct from the soul, mind or psyche) and recreational sex. Today's Soma takes various forms. Besides obvious analogies with antidepressants and other psychoactive drugs, the mass entertainment business and recreational stimulants play an important role in managing the general population, turning us into compliant consumers and loyal team players rather than awkward free agents. Increasingly, political opinions at variance with the neoliberal globalist orthodoxy are associated with maverick personality types, i.e. rather than tackle a philosophical viewpoint head-on, the new establishment will parody it and insinuate that proponents of such views suffer from some form of paranoid delusion. Democracy thus serves no longer to reflect the true will of citizens, but to manage different groups of people in order to manufacture consent for political agendas promoted by powerful lobbies.

Joining the Dots

We should view the neurological categorisation of human beings alongside other trends such as cosmetic surgery, assisted fertilisation, gender reassignment and the potential for artificial intelligence to empower the technocratic elite. Now, under the pretext of combating childhood depression and/or bullying, the authorities feel empowered to subject all children to mandatory mental health screening, while simultaneously encouraging non-traditional family structures, facilitating fertility treatment, now available on the NHS irrespective of relationship status, and heavily subsidising mothers going to work, even if their earnings are less than the equivalent cost of childcare. All these phenomena remove children from traditional biological families and transfer responsibility for their socialisation away from parents to corporate institutions. Natural variations in human behaviour are analysed in detail to identify individuals who fail to respond to mainstream socialisation and psychological conditioning techniques and may thus become, in the authorities' eyes, troublemakers.

Concern about mental health, while often well-intentioned, provides the ultimate pretext to expand the surveillance state. As the saying goes, the road to hell is paved with good intentions.

Power Dynamics

Should we still call the global lingua franca English?

In more innocent times we associated a language with its national community. For much of history nations and languages had a symbiotic relationship. Language is the ultimate propagator of the cultural traits that hold together communities and enable the development of hierarchical command structures. A multilingual country is effectively an empire, for it has to unite peoples unable to communicate easily except through the medium of a common higher-register language that is not their own. In a simplified multipolar world, each country would have its own language and a set of shared customs, e.g. in Denmark one speaks Danish and in France one speaks French, both languages inextricably bound to their motherlands. Admittedly French serves as a lingua franca in much of Northwestern and Central Africa and even Danish acts as a colonial language in Greenland. French is also spoken in Quebec, Wallonia, Western Switzerland and a few French overseas territories dotted around the globe, but most native speakers live in metropolitan France. By contrast, only 8-9% of native English speakers live in England. The ethnic English proportion may be a little higher if we include the greater diaspora in Canada, Australia and South Africa who still identify as English, but most native English speakers are North American, and many more live in Australasia, Southern Africa and elsewhere.

It's hard to measure just how many people speak English worldwide as a second language. Estimates range from as many as three billion, if we include everyone who has learned some basic English at school or at work, to as few as 500 million, if we restrict the total to those who speak the language with a high degree of proficiency and, most importantly, retain full mutual intelligibility with native English speakers. Other estimates relate to varying degrees of fluency and may apply different criteria, e.g. the number of school leavers with a basic English language qualification, or a random sample of the general population in which participants have to engage in conversations of varying levels of difficulty (ranging from basics such as asking for directions to more challenging topics such as politics). As a rule, English as a lingua franca is much more widely spoken in cosmopolitan cities and by members of the better-educated professional classes. However we measure it, recent technological and cultural changes have vastly expanded our need to communicate with people from other language communities. Global English, for all its defects, not least its inconsistent pronunciation and orthography, has succeeded where Esperanto and a handful of other neutral artificial lingua francas failed. As the pace of globalisation and cultural change accelerates, the core of native and near-native English speakers will find themselves outnumbered by those who speak the language in wildly divergent and creative ways with little reference to the original variant of English that first migrated from the British Isles in the 17th century. Indeed it was not until the mid 19th century that English gained the upper hand over French, Spanish, Portuguese, Arabic, Russian or Chinese.
Although France lost the Seven Years' War in 1763, having to cede Quebec and most of its Indian territories to Great Britain, and its hopes of European supremacy were dashed at the Battle of Waterloo in 1815, French remained the preferred language of diplomacy and the language of greatest prestige in Europe well into the early 20th century.

English means different things to different people. To the English, it may still be a symbol of ethnic identity if spoken in its insular form with its odd colloquialisms and regional pronunciations. Today you will seldom hear the clever melange of Anglo-Saxon and Norman French that characterised Shakespeare's works, but rather a mishmash of vernacular British English, Americanisms and branded neologisms interspersed with politically correct Newspeak and catchphrases popularised by TV personalities. The Scottish and Irish tend to have a more pragmatic view of the language, but take pride in their local dialects. To Nigerians or Indians, English is the high register of their commercial lingua franca. The subtleties of regional English dialects or the latest suburban slang from Merseyside or Hampshire are of little interest to your average African or Asian business person, for whom English is a vehicle of communication and expression, but not a badge of tribal identity. To continental Europeans, English was, until recently, just another foreign language, but it has now become a gateway to participation in the globally integrated business world, academia and youth culture, especially of the kind that global entertainment businesses most heavily promote. At times it seems that everywhere global English trumps native languages, even where they remain strong. Yet to view this as a triumph of English culture over the rest of the world is, in my humble judgement, to misunderstand the far-reaching consequences of rapid global cultural convergence. Indeed traditional British English may well be a victim of its own apparent success, submerged by a rapidly morphing global lingua franca that owes as much to Bangalore, Berlin and Beijing as it does to Birmingham, Brisbane and Boston. If a Briton from the 1950s could, through the magic of a time machine, experience the linguistic reality of modern Britain, she would be very confused.
While superficially many common words would be much the same, many old terms and phrases have acquired new meanings or been superseded by more politically correct neologisms. Much discourse would be unintelligible without detailed knowledge of the last 50 years of technologically driven culture replete with brand names, acronyms and adapted foreign recipes. Back in the 1950s most Britons did not even have a phone or a television set, let alone an iPhone.

Opinions on the role of global English vary. Robert Phillipson's theory of linguistic imperialism, set out in his book of the same name, is a must-read for anyone interested in cultural change. While I find many aspects of this perspective persuasive, especially in the context of cultural imperialism, in my experience abroad the key drivers behind linguistic homogenisation are not native English speakers at all, but international business. British imperialism and later US economic supremacy merely set the stage for English to expand way beyond its core of native speakers (still only 6.5% of the world's population). I find Jean-Paul Nerrière's concept of Globish, popularised in his 2009 book of the same name, much closer to the emerging linguistic reality, although I do not share his optimism that American and British English will retain their privileged status, which will wane with their relative economic and cultural decline. While I found much of the historical research in Nicholas Ostler's The Last Lingua Franca of great interest, I cannot support his conclusion that automated simultaneous translation technology will supplant the need for global English and let everyone cultivate their own vernacular. I've no doubt natural language processing will sooner or later let us translate human speech into a machine representation intelligible to computers, but it will be some time before computers can interpret the full range of nuances of colloquial human speech. Like it or not, cultural convergence is the order of the day, so now the French have to learn Globish while the Brits have had to discard feet, pounds and pints in favour of metric units.

I taught English as a foreign language for three years and soon learned that English syntax has more exceptions than rules. As soon as I explained a rule, some bright spark would cite an exception, often from William Shakespeare, Jane Austen, Charles Dickens or whichever pre-20th-century English authors happened to be on their reading list. However, the biggest stumbling blocks for my German and later Italian students were pronunciation, especially understanding authentic native speakers, and literal translations from their own language. In the pre-Internet era my best advice was to acquire English-language movies with the original soundtrack subtitled in English. Most could read the language much better than they could speak it. If you attempt to read subtitles in your own language, you will miss the subtleties and flavour of the source tongue. At the time the received wisdom was that English is on the whole much easier than the other main European languages. The English-is-easy meme has become a self-reinforcing mantra, which in my experience as both a language learner and teacher is more attributable to the language's cultural ubiquity and prestige than to any intrinsic qualities. On the surface English grammar is very simple, with no confusing grammatical genders (e.g. the Sun is masculine in Italian and French but feminine in German), a limited range of verb conjugations (I do, he does, I did etc. as opposed to faccio, fai, fa, facciamo, fate, fanno, ho fatto, feci, facevo, farò etc.), only a few dozen common irregular verbs, very uniform plurals with a few exceptions, of course, undeclined adjectives and just a barebones case system. One wonders how Czech children can cope with seven grammatical cases and three grammatical genders, but they do. Indeed even Old English had four cases and three genders, much like modern Icelandic or German.
However, by this metric, the easiest language in the world must be Chinese, in which verbs, nouns and adjectives are never suffixed and relationships between words are either implied by word order and context or emphasised with helper words. Native English syntax is not as simple as many continental learners of the language would like to believe. Word order plays a much more important role in English than it does in languages with a more clearly defined case system like German or Polish. English has special interrogative auxiliary verbs to maintain its default subject-verb-object (SVO) word order (e.g. When did you live in Italy? but How many people live in Venice?) and a vast array of verbal tenses formed with auxiliary words (such as I do, am doing, will do, am going to do, have done, have been doing, did, was doing, used to do, had done, had been doing etc.). While these verbal forms serve useful semantic functions for native users, their utility is lost on speakers of other languages. In Germany, the Low Countries, France and Northern Italy, past actions are typically expressed with a tense we confusingly call the present perfect, e.g. I have done (j'ai fait, ich habe gemacht, ho fatto etc.), while English reserves the simple past for actions completed at a definite time (e.g. I ate an apple five minutes ago, but I've never eaten a horse). English also distinguishes continuous from simple verbal forms, e.g. I drink tea (i.e. I'm a tea drinker), but I'm drinking orange juice (at the moment). In many other languages, the same verbal form would be used in both cases.

While English syntax may be a tad quirky, the biggest challenge for most learners is pronunciation. I once suggested the best international language would be written more or less like English, but pronounced as if it were Italian or Spanish. Naturally, some sounds are easier for speakers of some languages than others. Castilian Spanish, Greek and Arabic have the dental fricatives /θ/ and /ð/, as in theft or then, often a source of ridicule for French, German and Italian speakers of English. However, consonants only form the outer shell of syllables. Vowels and stress add colour to our speech and help us distinguish thousands of short words that would otherwise be homophones. Moreover, English vowels are notorious for their indistinctness. Most languages use variants of the five cardinal vowels /a/, /e/, /i/, /o/ and /u/, with a few diphthongs and possibly a few extra vowels. English, by contrast, has a complex system of short, long and gliding vowels that sit midway between cardinal vowels. The cardinal /a/ may be confused with the North American rendition of the short o in hot, the Southern English version of the short u or /ʌ/ in hut, the long /ɑ:/ in heart (the r is usually suppressed in modern Southern English) or the Northern and Midlands English pronunciation of hat. In unstressed positions most vowels become either a schwa /ə/ or a short i /ɪ/, e.g. comfortable may be phonetically transcribed as /ˈkʌmfətəbəl/. Indeed if English adopted phonetically accurate spelling, many would mistake it for a quaint Scandinavian dialect with a few extra letters and diacritical symbols.

Speech patterns are learned in early childhood. Each dialect has a repertoire of sounds it must distinguish to facilitate communication. Our ears are fine-tuned to differentiate the phonemes particular to our linguistic environment. Exposure to other dialects enables us to remap these phonemes to other variants. As the pronunciation of English differs quite markedly from its spelling, native speakers will often associate different sounds with the same written form. Over time common terms tend to be shortened, while ambiguous short words may need a companion word to emphasise their meaning or may give way to less ambiguous alternatives, e.g. the Old English term wifman became woman. Nonetheless, some languages tend towards abbreviation much more than others. In Italy the term scontrino fiscale amused me: why would shopkeepers have to keep reminding me that the small paper receipt they had just given me was for tax purposes? Many linguistic communities prefer more complete and semantically correct terminology for cultural reasons. If we had retained the Victorian attitude to word formation, many common English-medium neologisms would be much longer. The first high-capacity horse-drawn coaches were commonly known as omnibuses, Latin for "for all", and only later shortened to bus. Terseness is not always an advantage: as I find in my day job as a programmer, longer descriptive names are easier to interpret than concise but ambiguous ones. The term iPad is the patented creation of a marketing department. It owes its success to its extreme simplicity. Yet pad has many other meanings: anything from a soft wad of material, a booklet of writing paper as in notepad, a flat-topped structure such as a launchpad or heli(copter)pad, a flat area of a circuit board or a small city apartment. The correct term for a device like an iPad or a Kindle Fire, both ephemeral devices, is electronic tablet, but tablet alone has plenty of other meanings.
Smartphone may be more neutral than iPhone, a trademark, but it is itself a neologism that fails to adequately describe the device's true nature. Indeed the forerunner of the modern smartphone was the personal digital assistant or PDA, which is admittedly not quite as catchy. These new coinages rely heavily on their neurolinguistic impact. They must be short, relatively easy to pronounce and distinguishable from their technical predecessors. If you want to sell a new kind of coffee, a descriptive Anglo-Saxon concoction like concentrated coffee with frothy milk would be bad marketing; cappuccino sounds much better to your average English speaker.
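The trade-off between terseness and descriptiveness mentioned above is one programmers face daily. As a minimal sketch (the functions and names below are invented purely for illustration, in Python), compare an ambiguous identifier, suffering from the same overloading as pad, with a self-describing one:

```python
# Hypothetical illustration: terse names save keystrokes but invite
# misreading, while descriptive names state their purpose outright.

# Terse and ambiguous: is "pad" a noun or a verb? Padding what, with what?
def pad(s, n):
    return s + " " * (n - len(s))

# Descriptive: the name alone tells the reader what the function does.
def pad_string_to_width(text, width):
    """Append spaces so that text occupies at least `width` characters."""
    return text + " " * (width - len(text))

print(len(pad_string_to_width("bus", 8)))  # prints 8
```

Both functions behave identically; only the second can be understood at a glance from a call site, without consulting its definition.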