Tag Archives: nazism
I have just finished reading Hitler, Ian Kershaw’s brilliant two-volume biography of Adolf Hitler. Over the course of 1,432 pages, Kershaw uncovers why Hitler, a man not altogether dissimilar from other tyrants in history, has become synonymous with evil.
Kershaw also reveals the gap between Hitler’s public image and private personality: the difference between the rabble-rouser capable of captivating the masses by exploiting their fears, prejudices, and desires, and the lacklustre reality. Kershaw shows how Hitler transformed Nazism into a national religion – complete with its own songs, fables, and rituals – and how he transformed himself into its demagogue.
Hitler projected a persona that embodied all the ideals of German nationalism. He presented himself as the archetype of German pride, efficiency, and self-discipline. In Hitler, the German people found the living embodiment of their fears and aspirations.
Furthermore, Hitler presented himself as the saviour of a nation on the brink of ruin. This was not entirely his doing: by the early thirties, things had grown so dire in Germany that people were willing to throw their lot in with anyone promising to restore law, order, and honour. Hitler promised all that and more. Utilising what we today would recognise as identity politics, Hitler promised to restore national pride and wreak vengeance on Germany’s enemies. He divided the world into victims (the German people), perpetrators (international Jewry and Marxists), and saviours (the Nazis).
It would be far too simplistic, however, to conclude that Hitler brainwashed the German people. Rather, Hitler and the German people became intertwined in the same unconscious conspiracy. Hitler may have been the one to espouse the kind of murderous ideas that led to Auschwitz and Stalingrad, but it was the German people who gave those ideas their full, unconscious support. As time marched on, Hitler’s sycophancy was taken as political genius.
By telling the German people what they wanted to hear, Hitler was able to present himself as a national saviour. The reality was far different. He was a man with virtually no personality. He had no connection whatsoever with ordinary people: he never held an ordinary job, never had children, and only married his mistress, Eva Braun, the day before his suicide. Albert Speer, the Nazi architect and one of the few men Hitler counted as a friend, described him as a duplicitous, insecure individual who surrounded himself with shallow and incompetent people, laughed at the misfortunes of others, and retreated into “fantastic mis-readings” of reality.
Furthermore, whilst Hitler presented himself as the hardworking political demagogue of unmatched genius, he was, in reality, a lazy, egotistical man whose rise to power rested on the cynical manipulation of national institutions. Far from being the tireless worker he presented himself to be, Hitler actually proved unable to deal with numerous major crises during the War because he was still asleep. He saw his role as being the creator of Nazi ideology. The actual running of Germany he left to his functionaries.
When Hitler toured Paris following the fall of France in 1940, he made a special visit to the tomb of Napoleon Bonaparte. Saluting the Emperor’s marble tomb, Hitler commented, in typically egotistical style, that, like Napoleon’s, his own tomb would bear only the name “Adolf” because “the German people would know who it was.”
He was not entirely wrong. The name Adolf Hitler is remembered today. However, far from being remembered as the founder of a thousand-year Reich, he is remembered as a genocidal fruitcake whose legacy is as inglorious as his life. Hitler and Napoleon may have been similar in many ways (both were foreigners to the countries they would end up ruling, both reigned for a short period of time, and both significantly altered the course of history), but where Napoleon left a legacy that is still very much with us today, Hitler failed to leave anything of lasting significance.
But perhaps that is precisely what Hitler wanted. Carl Jung had a dictum: if you want to understand someone’s motivation for doing something, look at the outcome and infer the motivation. In his brief twelve years in power, Hitler led the German people into a war that cost fifty million lives, presided over a Holocaust that murdered eleven million people, and oversaw the destruction of the old Europe. If Adolf Hitler could be summarised in a single quote, the line from the ancient Hindu text, the Bhagavad Gita, would prove sufficient: “Now I am become death, the destroyer of worlds.”
There is an old adage which states that you do not know how big a tree is until you try to cut it down. Today, as cultural forces slowly destroy it, we are beginning to understand that the same can be said of personal responsibility.
Society no longer believes that people ought to bear their suffering with dignity and grace. Rather, it now believes that the problems of the individual ought to be made the problems of the community. Individual problems are no longer the consequence of individual decisions, but come as the result of race, gender, class, and so forth.
The result of this move towards collective responsibility has been the invention of victim culture. According to this culture, non-whites are the victims of racism and white privilege, women are the victims of the patriarchy, homosexuals are the victims of a heteronormative society.
The 20th century is a perfect example of what happens when responsibility is taken from the hands of the individual and placed in the hands of the mob. The twin evils of communism and Nazism – which blamed the problems of the individual on economic and racial factors, respectively – led to the deaths of tens of millions of people.
Furthermore, such ideologies led otherwise decent individuals to commit acts of unspeakable violence. Whilst observing the trial of Adolf Eichmann, a former SS officer who had been one of the architects of the Holocaust, the writer Hannah Arendt was struck by the “banality of evil” that had characterised German war atrocities. Arendt noted that the men who conspired to commit genocide were not raving lunatics foaming at the mouth, but dull individuals inspired to commit evil by a sense of duty to a toxic and corrupt ideology.
The Bolsheviks taught the Russian people that their misfortune had been caused by the wealthy, and that this wealth had been gained through theft and exploitation. Likewise, the Nazis convinced the German people that their problems could be blamed on the Jews. It is not difficult to see how such philosophies led, step by step, to the gulags and the concentration camps.
The same thing is happening today. The only difference is that those who play this game have become more sophisticated. Today people are encouraged to identify with identity groups ranked by so-called social privilege. Then they are taught to despise those with more social privilege than themselves.
Under this philosophy, crime is caused not by the actions of the individual, but by social forces like poverty, racism, and upbringing. Advocates claim that women should not be forced to take responsibility for their sexual behaviour, and so should be allowed to essentially murder their unborn children. Sexually transmitted diseases like HIV are blamed on homophobia rather than on immoral and socially irresponsible behaviour. And alcoholism and drug addiction are treated as diseases rather than as behaviours the addict is supposed to take responsibility for. The list is endless.
Personal responsibility helps us take control of our lives. It means that the individual can take a certain amount of control over his own life even when the obstacles he is facing seem insurmountable.
No one, least of all me, is going to argue that individuals don’t face hardships that are not their fault. What I am going to argue, however, is that other people will respect you more if you take responsibility for your problems, especially if those problems are not your fault. Charity for AIDS sufferers, the impoverished, or reformed criminals is perfectly acceptable. But we only make their plight worse by taking their personal responsibility from them.
Responsibility justifies a person’s life and helps them find meaning in their suffering. Central to the Christian faith is the idea that individuals are duty bound to bear their suffering with dignity and grace and to struggle towards being a good person. To force a man to take responsibility for himself is to treat him as one of God’s creations.
You cannot be free if other people have to take responsibility for your decisions. When you take responsibility from the hands of the individual you tarnish his soul and steal his freedom.
Freedom from responsibility is slavery, not freedom. Freedom is the ability to make decisions according to the dictates of one’s own conscience and to live with the consequences of those decisions. Freedom means having the choice to engage in the kind of immoral behaviour that leads to an unwanted pregnancy or AIDS. What it does not do is absolve you of responsibility for those actions. Slavery disguised as kindness and compassion is still slavery.
In March of this year, the vlogger Mark Meechan was convicted in a Scottish Court of violating the Communications Act 2003 for a video he had uploaded to YouTube in April 2016. The video, which Meechan claimed had been produced for comedic purpose (he claimed he wanted to annoy his girlfriend), featured a pug dog making Hitler salutes with its paw, responding to the command “gas the Jews” by tilting its head, and watching a Nazi rally at the 1936 Berlin Olympics.
The Scottish Court that convicted Meechan (who is much better known as ‘Count Dankula’) concluded that he had been motivated to produce the video by religious prejudice. Perhaps without realising it, by convicting Meechan, the Scottish legal system has illustrated the importance of free speech and the threat that political correctness poses to it.
Unfortunately, legally and politically incited attacks against both free speech and comedy are not limited to the United Kingdom. In Canada, attempts inspired by political correctness to silence comedians have been instantiated into law. In one alarming case, the Quebec Human Rights Commission awarded Jeremy Gabriel, a disabled former child star, $35,000 in damages after he was ridiculed in a comedy routine by Mike Ward.
It is little wonder, then, that some comedians have seen cause for alarm. Some, like Chris Rock, now refuse to perform on college campuses because of the oversensitivity of some of the students. Others, like legendary Monty Python star John Cleese, have warned that comedians face an “Orwellian nightmare.”
Political correctness is the antithesis of comedy. It is not that comedians have been prevented from practising their craft, but that the pressures political correctness places on them make it difficult to do so. The comedian feels pressured to censor himself because of the way words are categorised by their supposed offensiveness or inoffensiveness. And he finds himself fearful of having his words twisted and misinterpreted to mean something other than what he intended.
Much of the problem arises from a culture that has elevated politics to something approximating religion. And, like all zealots, the fanatics of this new religion have attempted to conform every aspect of society to their new faith. It is the job of the comedian to make me laugh. It is not his job, as some would have you believe, to play the role of political activist.
Unfortunately, that view is not one held by many on the radical left. In an article for the Sydney Morning Herald, Judith Lucy opined that people wanted to “hear people talk about politics or race.” And it seems that there are people who agree with Lucy. Comedy is not to be used to bring joy to people, but as a platform to espouse politics. Comedy has become a form of propaganda. And it is the liberal agenda that determines what is considered funny and what isn’t.
What the politically correct offer instead of genuinely funny comedy is comedy as a form of political activism. Comedy is to be used to spread progressive ideas, and political correctness is to be used to silence whatever opposes those ideas. Take, for example, Tim Allen’s sitcom Last Man Standing, which revolved around a conservative protagonist and was cancelled by the American Broadcasting Company despite its popularity.
And nowhere can this trend of comedy as political activism be seen more readily than in the current incarnations of late-night television. Legendary comics like Johnny Carson and David Letterman established late-night television as a form of light-hearted entertainment that sent its audience off to bed with a smile, and it was not afraid of offending people in order to do so. Today, however, this willingness to offend seems only to be targeted at those on the right of the political spectrum. It is as though the late-night comedian has decided to use his position to preach progressive politics to his audience rather than using his talent to make insightful and hilarious observations about the world around us. The result is that the late-night host places commenting on political or social matters above entertaining his audience.
It is as though the late-night host has replaced humour with indignation. The “jokes” (in reality they are tirades) contain more than a modicum of vitriol and resentment. Samantha Bee referred to Ivanka Trump as a “feckless cunt”, Stephen Colbert accused President Trump of being Vladimir Putin’s “cock holster”, and so on and so forth.
While this may seem alarming, it is precisely what happens when comedians see themselves as activists rather than entertainers. As Danna Young, Associate Professor of Communication at the University of Delaware, commented:
“When comics abandon humour and go with anger instead, they become just another ‘outrage’ host. Now, if that’s cool with them, great. But if they are looking to capitalise on the special sauce of humour, then they’ll need to take their anger and use it to inform their craft, but not have it become their craft.”
Fortunately, there are a host of comedians who refuse to conform their comedy to the mores of political correctness and progressive politics. Numerous comedians have denigrated political correctness as the “elevation of sensitivity over truth” (Bill Maher) and “America’s newest form of intolerance” (George Carlin). Jerry Seinfeld, a man whose comedy routines are considered among the least offensive in comedy, referred to political correctness as “creepy” on Late Night with Seth Meyers. Bill Burr has accused social justice warriors of being bullies. Likewise, Ricky Gervais has tweeted: “if you don’t believe in a person’s right to say things you find ‘grossly offensive’, you don’t believe in free speech.”
None of this is to say that political correctness has destroyed genuinely funny comedy, either. Netflix has spent a great deal of money producing comedy specials that are, in many cases, far from inoffensive. Ricky Gervais’ comedy special Humanity features jokes about rape, cancer, transgenderism, AIDS, and the Holocaust.
Comedy has been threatened by both progressive politics and political correctness. Mark Meechan may have found himself running afoul of the politically correct left, but as long as there are people who stand committed to free speech and comedians prepared to make offensive jokes, the laughter will continue.
Society has a problem with politically-motivated violence. At a protest in Charlottesville, Virginia, a man with Nazi sympathies drove his car into a crowd of protestors, killing one and injuring many others.
Likewise, the so-called anti-fascists, Antifa (they are, of course, nothing of the sort), have resorted to violent and intimidatory tactics at numerous protests and rallies.
Needless to say, such occurrences raise serious questions about the consequences of political polarisation, the loss of community sentiment, and the incivility it brings.
One of the defining features of the 2010s has been the increase in political polarisation. As people become more willing to identify themselves by their political ideology, the tendency to view one’s political opponents as extremists has likewise increased. Consequently, it has become easier and easier for people to demonise others simply because they do not hold the same political views.
Such polarisation, of course, has been fuelled by a biased and segregated news media system. The online video and podcast revolution, combined with a mainstream media that heavily slants towards the left, has meant that people are often only exposed to those views that match their own. As such, the right has been manipulated into believing that all on the left are social justice warriors, protestors, and radical feminists, whilst those on the left have been manipulated into seeing all on the right as Nazis, race baiters, white supremacists, and alt-righters.
To a large degree, political polarisation has come as a consequence of the loss of a sense of community. People no longer associate with their neighbours, and, as a result, they have come to see each other as potential enemies rather than potential friends. Under such conditions, it becomes very easy to see another person as evil when their political views do not complement your own.
The loss of community has occurred for three major reasons. First, the advent of social media, online shopping, video subscription services, and smartphones has meant that people are no longer required to venture out into society and interact with others. It is no longer necessary for a consumer to interact with shop staff, for instance, because they can shop in the solitude of their own living room. Modern technology, for all its benefits, has provided us with a faux sense of sociability. A kind of sociability that allows us to communicate with others but does not require genuine human interaction.
Second, pastimes that were once considered neutral have been co-opted to spread politically-charged messages. People can no longer go to a football game, watch a movie, or listen to music without having political ideology preached to them. As a consequence, society lacks the entities that once allowed people to bond with one another despite differences in their political beliefs.
Third, engagement with the community has declined. People are no longer engaged with the community in the same way that their grandparents were. In the past, social clubs, community groups, sports clubs, and religious institutions provided a space where people of diverse beliefs, values, and opinions could come together. As a consequence, such entities promoted a degree of social unity and social cohesion. Today, however, people are becoming more and more willing to self-segregate. They isolate themselves, choosing only to socialise with friends and family.
What all this has amounted to is a loss of civility. It is very easy to justify all manner of bad behaviour when one sees their opponent as a threat to their very existence. Our modern society shuns manners and dismisses common courtesy and is surprised to find self-centredness and vulgarity in its wake.
Since the Industrial Revolution, scientific and technological development has progressed at an unfathomable rate. In a little over a quarter-of-a-millennium, the Western world has gone from a superstitious, agrarian society to a scientifically and technologically sophisticated one. The price of this remarkable achievement has been our alienation from the ‘dream world.’ We have lost our sense of wonder, our sense that there is something more substantial to existence than just mere crude matter.
The lack of spirituality among modern man is largely the result of an overreliance on materialism. For the philosophically challenged, ‘materialism’ is not a reference to consumerism, but to the philosophical position that regards physical matter as the fundamental substance of nature. Philosophical materialism posits that everything, including human thought and the course of history, comes as the result of physical forces. This is a philosophy which has no room for the soul, for divinity, or for God.
Philosophical materialism harkens back to the atomists of Ancient Greece. Epicurus, for example, believed the universe consisted of invisible and indivisible free-falling atoms that randomly collide with one another. For all intents and purposes, however, it is the Ancient Greek philosopher Democritus (c. 460 BC – c. 370 BC) who is credited with the invention of philosophical materialism within the Western tradition. Democritus formulated the theory that the world was composed of ‘atoms’ – invisible chunks of matter – existing in empty space. He theorised that these microscopically small atoms interact with one another by impacting or hooking up, and that change occurs when the configuration of these atoms is altered.
In modern philosophy, materialism refers to a category of metaphysical theories. The French philosopher Baron d’Holbach (1723 – 1789), in his book Système de la Nature ou Des Loix du Monde Physique et du Monde Moral (1770) (The System of Nature, or the Laws of the Moral and Physical World), argued that everything that occurs, down to human thought and moral action, comes as the result of a causal chain that has its roots in atomic motion. The book was condemned by King Louis XVI (1754 – 1793), who tasked the hangman with locating every copy and cutting it to pieces.
The modern world likes to see itself as fundamentally materialistic. Being seen as “practical”, “realistic”, or “down to earth” is considered by many to be a great compliment. However, this view is largely mistaken. In reality, it is ideas – the ability to think, feel, and imagine – and the ability to implement them that have truly made the human race what it is. In a letter to Guillaume Gibieuf (c. 1583 – 1650), the French philosopher René Descartes (1596 – 1650) wrote: “I am certain I have no knowledge of what is outside me except by means of the ideas I have within me.”
It would be a great mistake, then, to suppose that human beings are naturally rational or civilised creatures. In reality, people are far more irrational, crazy, and destructive than we like to think. Modern science is really only a few hundred years old, having its roots in the work of Francis Bacon (1561 – 1626), René Descartes, and Isaac Newton (1642 – 1727). The basis for modern society is therefore not, as often supposed, science, but religion. This is evidenced by two facts. First, the existence over thousands of years of civilisations that had their basis in religion, not science. These societies and their corresponding religions include the Japanese and Shintoism, the Chinese and Buddhism, the Middle East and Islam, and the West and Christianity. And second, the numerous anti-science movements (most notable in today’s world are the social constructionists) that have come to the public’s attention in recent years. We are able to live in an orderly and rational manner because we live in a society that has moral rules and legal boundaries, not because it is something that comes naturally to us.
The relationship between the mental and physical worlds was of great interest to the Swiss psychologist, Carl Jung (1875 – 1961). Influenced by the German Idealist School, Jung believed that “metaphysical assertions… are statements of the psyche.” He would comment: “it is the soul which, by the divine creative power inherent in it, makes the metaphysical assertion; it posits the distinction between metaphysical entities. Not only is it the condition of all metaphysical reality, it is that reality.” The central idea behind Jung’s metaphysical system was that:
“The premise that all psychological processes are necessarily conditioned on innate universal structures of subjectivity that allow for human experiences to transpire, and that these processes participate of a greater cosmic organising principle that transcends all levels of particularity or individuality.”
– Jon Mills, Jung’s Metaphysics
In his function as a psychotherapist, Carl Jung observed that western men and women often suffered from debilitating feelings of inadequacy, hopelessness, and insignificance. He believed that this was caused by a kind of spiritual problem that, even today, threatens the stability and liberty of our society. The result is that we limit ourselves only to what is socially and economically attainable. As Carl Jung wrote:
“Man feels isolated in the cosmos. He is no longer involved in nature and has lost his emotional participation in natural events, which hitherto had symbolic meaning for him. Thunder is no longer the voice of a god, nor is lightning his avenging missile. No river contains a spirit, no tree makes a man’s life, no snake is the embodiment of wisdom and no mountain still harbours a great demon. Neither do things speak to him nor can he speak to things, like stones, springs, plants and animals.”
– Carl Jung, The Earth Has a Soul
Jung noted that this problem, and the consequences associated with it, correlated with the declining influence of Christianity in the Western world and the rise of mass urbanisation that came as a result of the Industrial Revolution. As the individual surrounded himself with more and more people, his feelings of insignificance increased. The result is individuals who are highly insecure, unstable, and highly suggestible. Furthermore, the rational and scientific mindset that rose to prominence during the Enlightenment has also fooled many politicians and social reformers into believing that the same measures can be used to address social and political problems. The existence of the totalitarian systems such as fascism and communism, genocides, and mass murders that characterised the Twentieth Century stand as testaments to this reality.
The problems the West faced during the twentieth century were almost entirely spiritual in nature. The communists killed tens of millions of people in an attempt to achieve a workers’ paradise, the Cold War was as much a battle between opposing worldviews as it was one of political and economic differences, and one would have to be blind not to notice the religious overtones present in Nazism. Even today, the conflict between Western civilisation and fundamentalist Islam can be seen as having profound religious overtones.
Jung believed that the unconscious mind could be split into two distinct categories: the personal unconscious and the collective unconscious. The personal unconscious merely refers to the memories, emotions, and knowledge that were once conscious but have become repressed over time. By contrast, the collective unconscious, one of Jung’s most misunderstood concepts, is not a personal acquisition: it comprises both instincts (Triebe) and archetypal or primordial images. It symbolises universal culture: the anthropological images, practices, edicts, traditions, mores, social norms, values, and beliefs that embody a culture or mythos. The collective unconscious, therefore, symbolises the space in which human beings exist.
Dreams are considered to have great psychological significance. They use mythological narratives to allow us to naturally express our unconscious fears and desires. The average person dreams between three and six times per night, with each dream lasting between five and twenty minutes. Jeffrey Sumber, a clinical psychologist who spent years studying dream mythology at Harvard University and Jungian dream interpretation at the Jung Institute in Zurich, Switzerland, argues that dreams bridge the unconscious mind with the conscious mind. “Dreaming is non-essential when it comes to survival as a body”, Sumber concluded, “but is essential with regards to our development and evolution as metaphysical beings.”
Active imagination exists to give a voice to the anima, animus, shadow, and other areas of the personality that do not typically hold our attention. When the individual engages his active imagination, let’s say through painting or writing, there is a transformation of consciousness. As Carl Jung wrote in The Conjunction:
“Although, to a certain extent, he looks on from outside, partially, he is also an acting and suffering figure in the drama of the psyche. This recognition is absolutely necessary and marks an important advance. So long as he simply looks at the pictures he is like the foolish Parsifal, who forgot to ask the vital question because he was not aware of his own participation in the action. But if you recognise your own involvement you yourself must enter into the process with your personal reactions, just as if you were one of the fantasy figures, or rather, as if the drama being enacted before your eyes were real.”
The collective unconscious manifests itself most clearly through mankind’s proclivity for telling stories. As the clinical psychologist and cultural critic Jordan Peterson (1962 – ) explains, story-telling is an ancient and innate aspect of human nature:
“You know, we’ve been collecting stories as people we don’t know how long – hundred thousand years, maybe. There’s been creatures like us, indistinguishable from us, for a hundred thousand years. And we know that societies that appear more or less as archaic as those old societies tell stories, have rituals, have mythology. What do they mean? What are they good for? Well, imagine this: you tell a story to your husband or your wife about something interesting that you saw. Well, imagine that you could collect a thousand of the most interesting stories. And then imagine that you were some kind of literary genius like Shakespeare and you could take those thousand interesting stories and boil them down to a hundred really interesting stories. And then imagine that you had ten thousand years to gather up those most interesting stories and average them and you could come out with one perfect story: the best story, the most interesting story you could possibly tell. Well, that’s what a myth is. It’s the most interesting story you could possibly tell. Virtually every story you ever see has a mythological structure, that’s why it’s compelling to you. And when you meet someone who is charismatic or who holds your attention or who you’re interested in, the probability that they’re acting out a mythological fragment is very, very high. That’s why it is that your attention is captivated by them.”
Myths are really psychological in nature, even though they are typically misread as biographical or even historical. Myths, much like dreams, emanate from unconscious thoughts and emotions and give voice to the innermost fears and desires that underlie much of our behaviour.
The purpose of mythology is to provide the individual with a mirror he can use to assess himself and his relationship with the wider world. It exists to provide the individual with a sense of history and of his place within the cosmos. Whereas fiction works within an alternative reality whose facts are considered irrefutable and correct, mythology works by taking the metaphorical or metaphysical-cum-spiritual truths of existence and giving them voice and meaning through the medium of a story. It is not, therefore, how factually true a mythological story may or may not be that is important, but the metaphorical truths it imparts to the reader.
There can be little doubt that the modern world has produced marvels. The price of these remarkable achievements has been a form of perverse arrogance in which modern man likes to believe he is somehow a different, more rational, creature than his ancestors. The price of our arrogance has been the loss of our sense of something more substantial and wondrous than ourselves. As a result, people limit themselves only to that which is socially and economically attainable. Seeing ourselves as eminently rational and pragmatic creatures, we have managed to produce a world in which the individual feels worthless and insignificant. What is required is a revitalisation of the dream world: a return to the knowledge that it is ideas, our ability to give voice to those aspects of our personalities that lie dormant, and our willingness to venture out into the chaotic unknown and return triumphantly, that make human beings great.
There has been an alarming trend in modern culture: numerous political and social activist groups have been attempting to use the pernicious and false doctrines of political correctness, tolerance, and diversity to silence those they disagree with. Many of these groups have sought the passage of so-called “hate speech” laws designed to silence voices of dissent.
At public colleges and universities, places where free speech and open debate should be actively encouraged, the measures taken to suppress voices of dissent – including protests, disruption, and, in some cases, outright violence – have become tantamount to Government censorship. This censorship prevents students from inviting the speakers they wish to hear and from debating speech they disagree with. Eva Fourakis, the editor-in-chief of The Williams Record (the student newspaper of Williams College), wrote an editorial, later recanted, commenting that “some speech is too harmful to invite to campus.” The editorial went on to say: “students should not face restrictions in terms of the speakers they bring to campus, provided of course that these speakers do not participate in legally recognised forms of hate speech.”
The University of California, Berkeley, is famous for sparking the free speech movement of the 1960s. Today, however, it has become a haven for radical, anti-free-speech Neo-Marxists and social justice warriors. Not only have many Republican students had their personal property destroyed, but numerous conservative speakers have had their talks disrupted and, in some cases, halted altogether. In February, Antifa – so-called anti-fascists – set fires and vandalised buildings during a speech by the controversial journalist, Milo Yiannopoulos (1984 – ). In April, threats of violence aimed at members of the Young America’s Foundation forced political commentator, Ann Coulter (1961 – ), to cancel her speech. A speech by David Horowitz (1939 – ), founder and president of the David Horowitz Freedom Center, was cancelled after organisers discovered that the event would take place during normal class times (for safety, or so they claimed). Finally, the conservative journalist, Ben Shapiro (1984 – ), was forced to spend US$600,000 on security for his speech at UC Berkeley. These events show that those who wish to use disruption, vilification, threats, and outright violence to silence others can be, and often are, successful in doing so.
Like most of the principles of classical liberalism, free speech developed through centuries of political, legal, and philosophical progress. And like many Western ideas, its development can be traced back to the Ancient Greeks. During his trial in Athens in 399BC, Socrates (470BC – 399BC) expressed the belief that the ability to speak was man’s most divine gift. “If you offered to let me off this time on condition I am not any longer to speak my mind”, Socrates stated, “I should say to you, ‘Men of Athens, I shall obey the Gods rather than you.’”
Sixteen hundred years later, in 1215, the Magna Carta became the founding document of English liberty. In 1516, Desiderius Erasmus (1466 – 1536) wrote in The Education of a Christian Prince that “in a free state, tongues too should be free.” In 1633, the astronomer Galileo Galilei was put on trial by the Catholic Church for refusing to retract his claim of a heliocentric solar system. In 1644, the poet John Milton (1608 – 1674), author of Paradise Lost, warned in Areopagitica that “he who destroys a good book kills reason itself.” Following the overthrow of King James II (1633 – 1701) by William III (1650 – 1702) and Mary II (1662 – 1694) in 1688, the English Parliament passed the English Bill of Rights, which guaranteed free elections, regular parliaments, and freedom of speech in Parliament.
In 1789, the French Declaration of the Rights of Man and of the Citizen, an important document of the French Revolution, provided for freedom of speech (needless to say, Robespierre and company were not very good at actually promoting this ideal). The philosopher Voltaire (1694 – 1778) had earlier given the ideal its most famous expression: “I detest what you write, but I would give my life to make it possible for you to continue to write.” Over in the United States, in 1791, the First Amendment of the US Bill of Rights guaranteed freedom of religion, freedom of speech, freedom of the press, and the right to assemble:
ARTICLE [I] (AMENDMENT 1 – FREEDOM OF SPEECH AND RELIGION)
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.
During the 19th century, the British philosopher, John Stuart Mill (1806 – 1873) argued for toleration and individuality in his 1859 essay, On Liberty. “If any opinion is compelled to silence”, Mill warned, “that opinion may, for aught we can certainly know, be true. To deny this is to presume our own infallibility.” Mill believed that all doctrines, no matter how immoral or offensive, ought to be given public exposure. He stated in On Liberty:
“If the arguments of the present chapter are of any validity, there ought to exist the fullest liberty of professing and discussing, as a matter of ethical conviction, any doctrine, however immoral it may be considered.”
Elsewhere in On Liberty, Mill warned that the suppression of one voice was as immoral as the suppression of all voices:
“If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.”
Centuries later, in 1948, the Universal Declaration of Human Rights, adopted by the United Nations General Assembly, urged member states to promote civil, human, economic, social, and political rights – including freedom of expression and religion.
Within the American justice system, numerous Supreme Court cases have created judicial protections for freedom of speech. In National Socialist Party of America v. Village of Skokie (1977), the Supreme Court upheld the right of neo-Nazis to march through a village with a large Jewish population while wearing Nazi insignia. The Justices found that the promotion of religious hatred was not a sufficient reason to restrict free speech.
In the city of St. Paul during the early 1990s, a white teenager was arrested under the “Bias-Motivated Crime Ordinance” after he burnt a cross made from a broken chair in the front yard of an African American family (cross-burning is commonly used by the Ku Klux Klan to intimidate African Americans). The Court ruled that the city’s ordinance was unconstitutional. Justice Antonin Scalia (1936 – 2016) noted that the purpose of restricting fighting words was to prevent civil unrest, not to ban the content or message of the speaker’s words. Scalia wrote in R.A.V. v. City of St. Paul (1992):
“The ordinance applies only to ‘fighting words’ that insult, or provoke violence, ‘on the basis of race, colour, creed, religion or gender.’ Displays containing abusive invective, no matter how vicious or severe, are permissible unless they are addressed to one of the specified disfavored topics. Those who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or homosexuality—are not covered. The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects.”
In the Matal v. Tam case (2017), the Supreme Court found that a provision within the Lanham Act prohibiting the registration of trademarks that disparaged persons, institutions, beliefs, or national symbols violated the First Amendment. Justice Samuel Alito (1950 – ) opined:
“[The idea that the government may restrict] speech expressing ideas that offend … strikes at the heart of the First Amendment. Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful; but the proudest boast of our free speech jurisprudence is that we protect the freedom to express ‘the thought that we hate’.”
Justice Anthony Kennedy (1936 – ) opined:
“A law found to discriminate based on viewpoint is an “egregious form of content discrimination,” which is “presumptively unconstitutional.” … A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government’s benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society.”
In recent years, numerous calls to ban speech have been justified on the basis that it is “hateful.” Much of this has come from the political left who (in what one may cynically regard as having more to do with silencing voices of dissent than with protecting vulnerable groups) argue that restrictions on hate speech must occur if minorities are to be given equal status with everyone else.
That certain types of speech can be offensive, and that some of that speech is aimed at particular groups of people, is undeniable. Hate speech has even been criticised for undermining democracy. In one article, Alexander Tsesis, Professor of Law at Loyola University, wrote: “hate speech is a threatening form of communication that is contrary to democratic principles.” Some have even argued that hate speech violates the Fourteenth Amendment to the US Constitution, which guarantees equal protection under the law:
Article XIV (AMENDMENT 14 – RIGHTS GUARANTEED: PRIVILEGES AND IMMUNITIES OF CITIZENSHIP, DUE PROCESS, AND EQUAL PROTECTION)
1: All persons born or naturalised in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny any person within its jurisdiction the equal protection of the laws.
That there is a historical basis for restricting hate speech is undeniable. Slavery, Jim Crow, and the Holocaust, among other atrocities, were all preceded by violent and hateful rhetoric. (Indeed, incitement to genocide is considered a war crime and a crime against humanity under international law.) However, what proponents of hate speech laws fail to realise is that the countries that perpetrated these atrocities did not extend the freedom to speak to the groups they were targeting. Joseph Goebbels (1897 – 1945), the Nazi minister for public enlightenment and propaganda, for example, held such an iron grip on Germany’s media that no voice contradicting the Nazis’ anti-Semitic propaganda had any opportunity to be heard.
But who, exactly, supports hate speech laws? Analysis of survey data from the Pew Research Center and YouGov reveals that it is primarily non-white, millennial Democrats. In terms of age, the Pew Research Center found that forty percent of millennials supported Government censorship of hate speech, compared to twenty-seven percent of Gen X-ers, twenty-four percent of baby boomers, and only twelve percent of the silent generation.
In terms of race, research by YouGov reveals that sixty-two percent of African Americans support Government censorship of hate speech, followed by fifty percent of Hispanics, and thirty-six percent of White Americans.
In terms of political affiliation, research from YouGov taken in 2015 found that fifty-one percent of Democrats supported restrictions on hate speech, compared to thirty-seven percent of Republicans, and only thirty-five percent of independents.
The primary issue with hate speech is that determining what it does and does not constitute is very difficult. (The cynic may argue, fairly, that hate speech begins whenever a speaker states a fact or expresses an opinion that another person does not want others to hear.) As Christopher Hitchens (1949 – 2011) pointed out, the central problem is that someone has to make that determination.
The second issue with hate speech laws is that they can easily be used by one group to silence another. Often this kind of censorship is aimed at particular groups of individuals purely for ideological and/or political purposes, often with the justification that such actions increase the freedom and equality of the people the advocates claim to represent.
In Canada, Bill C-16 has sought to outlaw “hate propaganda” aimed at members of the community distinguishable by their gender identity or expression. The Bill originated with a policy paper by the Ontario Human Rights Commission, which sought to determine what constituted discrimination against gender identity and expression. This included “refusing to refer to a person by their self-identified name and proper personal pronoun.” Supporters of Bill C-16 see it as an important step towards the creation of legal protections for historically marginalised groups. Detractors, however, have expressed concern that the Bill sets a precedent for Government-mandated speech.
The Canadian clinical psychologist and cultural critic, Professor Jordan Peterson (1962 – ), first came to public attention when he posted a series of YouTube videos warning of the dangers of political correctness and criticising Bill C-16. In his videos, Professor Peterson warned that the law could be used to police speech and compel individuals to use ‘transgender pronouns’ (terms like ‘ze’ and ‘zer’, among others). For his trouble, Peterson was accused of violence by a fellow panellist on The Agenda with Steve Paikin, received two warning letters from the University of Toronto in 2016, and was denied a social research grant from Canada’s Social Sciences and Humanities Research Council.
Europe has been experiencing similar attempts to silence speech. A law passed by the Bundestag this year will force social media companies operating in Germany to delete racist or slanderous comments and posts within twenty-four hours or face fines of up to €50 million. Additionally, numerous public figures have found themselves charged with hate speech crimes merely for pointing out the relationship between the large influx of non-European migrants and high crime rates, particularly in terms of rape and terrorism. One politician in Sweden was prosecuted for daring to post immigrant crime statistics on Facebook.
In Great Britain, Freedom of Information documents reveal that around twenty thousand adults and two thousand children have been investigated by the police for comments they made online. In politics, the British politician Paul Weston (1965 – ) found himself arrested after he quoted a passage on Islam written by Winston Churchill (1874 – 1965). In Scotland, a man was charged under the Communications Act 2003 with the improper use of electronic communications after he filmed his dog making a Hitler salute.
In Australia, Herald Sun columnist Andrew Bolt (1959 – ) was found to have contravened section 18C of the Racial Discrimination Act after he published articles accusing fair-skinned Aborigines of using their racial status for personal advantage. The law firm Holding Redlich, acting for a group of Aboriginal persons, demanded that the Herald Sun retract two Andrew Bolt articles, written in April and August of 2009, and restrain Bolt from writing similar articles in the future. Joel Zyngier, who acted for the group pro bono, told Melbourne’s The Age:
“We see it as clarifying the issue of identity—who gets to say who is and who is not Aboriginal. Essentially, the articles by Bolt have challenged people’s identity. He’s basically arguing that the people he identified are white people pretending they’re black so they can access public benefits.”
Judge Mordecai Bromberg (1959 – ) found that the people targeted by Bolt’s articles were reasonably likely to have been “offended, insulted, humiliated, or intimidated.”
We need speech to be as free as possible because it is speech that allows us to exchange and critique information. It is through free speech that we are able to keep our politicians and public officials in check, to critique public policy, and to disseminate information. As the Canadian cognitive psychologist Steven Pinker (1954 – ) observed: “free speech is the only way to acquire knowledge about the world.” Measures taken to restrict free speech, whether the criminalisation of hate speech or anything else, are a direct contradiction of the principles upon which free Western democracies are founded.
On November 20th, 1945, twenty-four leaders of the defeated Nazi regime filed into Courtroom 600 of Nuremberg’s Palace of Justice to be tried for some of the most reprehensible crimes ever committed. Over the next ten months, the world would be shocked to learn of the depth and extent of the Nazi regime’s mechanised horrors. By the end of the trial, twelve of the defendants would be sentenced to death, seven would be sentenced to periods of imprisonment, and three would be acquitted.
Contrary to what one may believe, the perpetrators of the Holocaust were not psychopaths, sadists, or otherwise psychologically disturbed individuals. Rather, their actions arose, as the psychologist Gustave Gilbert (1911 – 1977) concluded, from a culture which valued obedience. The observation that mass horror is more likely to be committed by normal men and women influenced by social conformity would later be characterised by Hannah Arendt (1906 – 1975) as the ‘banality of evil.’
This shouldn’t come as too much of a surprise. After all, human beings are hard-wired to obey orders from people they deem superior to themselves. In 1961, the Yale psychologist Stanley Milgram (1933 – 1984) carried out a famous experiment exploring the conflict between authority and personal conscience. Milgram’s experiment was inspired by an interview with the Commandant of Auschwitz, Rudolf Höss (1900 – 1947). Höss was asked how it was possible to be directly involved in the deaths of over a million people without suffering emotional distress. Chillingly, Höss answered that he was merely following orders.
The process of the experiment was simple. Two participants, one of whom was actually a researcher, would draw lots to decide who would take the role of teacher and who would take the role of student. (The draw, needless to say, was rigged to ensure the actual participant took the role of teacher.) The teacher and student were then separated, and the teacher was taken to a room with an electric shock generator consisting of a row of switches ranging from fifteen to four-hundred-and-fifty volts. Supervising the teacher was an experimenter in a grey lab coat (in reality, an actor). Throughout the experiment, the teacher was to ask the student questions and administer an electric shock every time the student got a question wrong. As the experiment continued, the student would deliberately give wrong answers. As the shocks grew more severe, the student would scream and beg for mercy. When the teacher expressed concern, however, the experimenter would insist that the experiment continue. By the end, Milgram found that all participants continued to three hundred volts, whilst two-thirds continued to the full four-hundred-and-fifty volts when pressed.
The Nazis were able to create such obedience through a well-calculated propaganda campaign. Hitler outlined the principles of this campaign in Mein Kampf:
- Keep the dogma simple. One or two points only.
- Be forthright and powerfully direct – tell or order why.
- Reduce concepts down to black and white stereotypes.
- Constantly stir people’s emotions.
- Use repetition.
- Forget literary beauty, scientific reasoning, balance, or novelty.
- Focus solely on convincing people and creating zealots.
- Find slogans which can be used to drive the movement forward.
Similarly, Hitler’s speeches also followed a very specific and calculated formula:
- Hitler would unify the crowd by pointing out some form of commonality.
- Hitler would stir up feelings of fear and anger by pointing out some kind of existential threat.
- Hitler would invoke himself as the agent of a higher power.
- Hitler would present his solution to the problem.
- Hitler would proclaim the utilisation of the solution as a victory for both the higher power and the commoners.
In essence, the Nazi propaganda machine facilitated feelings of group identity and then used conformity to gain control over that group. The Nazis gambled that the majority of people preferred being beholden to a group over identifying as individuals.
If there is any lesson to be derived from the Holocaust, it is that the distance between good and evil is shorter than we like to believe. As the clinical psychologist Jordan Peterson is fond of pointing out, if the Holocaust was perpetrated by ordinary people and you’re an ordinary person, the only logical conclusion is that you too are capable of horrendous evil. It is not enough to be critical of those in power; eternal vigilance means being critical of our own need to conform and obey. Our freedom depends upon it.
This week for our theological article, we will be examining Friedrich Nietzsche’s (1844 – 1900) infamous statement, “God is dead.”
Friedrich Wilhelm Nietzsche (pronounced ‘knee-cha’) was born in Röcken, near Leipzig, on October 15th, 1844. His father, Karl Ludwig Nietzsche (1813 – 1849), was a Lutheran pastor and former teacher, and his mother was Franziska Oehler (1826 – 1897). The Nietzsche family quickly grew to include a daughter, Elisabeth (1846 – 1935), and another son, Ludwig Joseph (1848 – 1850). Unfortunately, the family would be beset by tragedy. In 1849, when Nietzsche was four years old, Karl Nietzsche suffered a devastating brain haemorrhage and died. Then, as if to rub salt in the family’s wounds, the infant Ludwig Joseph died unexpectedly shortly after.
Nietzsche was educated at the prestigious Schulpforta school near Naumburg, where he received an education in theology, classical languages, and the humanities. After graduating, the young Nietzsche attended the University of Bonn before moving to the University of Leipzig. During his time there, Nietzsche became acquainted with the philosophy of Arthur Schopenhauer (1788 – 1860), whose work The World as Will and Representation (1818) would have a tremendous influence on him. Then, aged only twenty-four, Nietzsche was awarded the position of professor of Greek language and literature at the University of Basel in Switzerland. He had never written a doctoral dissertation.
Nietzsche left academia briefly to serve as a medical orderly in the Franco-Prussian War (1870 – 1871), but was discharged due to poor health. He returned to Basel, where he became acquainted with the cultural historian Jacob Burckhardt (1818 – 1897) and the composer Richard Wagner (1813 – 1883). Wagner’s influence on Nietzsche can most readily be seen in The Birth of Tragedy.
During the late 1870s, Nietzsche became increasingly beset with debilitating health problems: digestive problems, poor eyesight, and migraines. He was forced to spend months off work, and eventually agreed to retire with a modest pension. Nietzsche was only thirty-four years old.
From there, Nietzsche devoted the rest of his life to the study and writing of philosophy. Between 1870 and 1889, Nietzsche wrote nineteen books, including: The Birth of Tragedy (1872), Philosophy in the Tragic Age of the Greeks (1873), Human, All Too Human (1878), The Gay Science (1882), Thus Spake Zarathustra (1883), Beyond Good and Evil (1886), On the Genealogy of Morals (1887), Twilight of the Idols (1888), Ecce Homo (1888), and The Will to Power (1901; technically unpublished manuscripts published by his sister, Elisabeth).
In 1889, in Turin, Italy, Nietzsche suffered a mental breakdown after seeing a horse being flogged in the Piazza Carlo Alberto. In the following days, Nietzsche sent a series of ‘madness letters’ to Cosima Wagner (1837 – 1930) and Jacob Burckhardt in which he signed his name ‘Dionysos’, claimed to be ‘the crucified one’, and asserted that he was the creator of the world. It was quickly agreed that Nietzsche should be brought back to Basel, where he was briefly admitted to a psychiatric clinic before being transferred to one in Jena.
In 1890, Nietzsche’s mother, Franziska, brought him home to Naumburg where she looked after him until her death in 1897. From there, Nietzsche was cared for by his sister, Elisabeth, in Weimar. He died on August 25th, 1900 at the age of fifty-five.
The statement “God is dead” is Nietzsche’s most memorable and provocative. (Of course, he wasn’t the first to coin the phrase – that was Heinrich Heine (1797 – 1856); Nietzsche merely philosophised it.) It first appeared in The Gay Science in a parable entitled ‘The Parable of the Madman’. In the parable, the madman asks, ‘where is God?’, only to be informed that God has been killed by man:
“God is dead. God remains dead. And we have killed him. How shall we, murderer of all murderers, console ourselves? That which was holiest and mightiest of all that the world has yet possessed has bled to death under our knives. Who will wipe the blood off us? With what water could we purify ourselves?”
Of course, Nietzsche wasn’t talking about the literal death of God (he was, after all, an atheist). Instead, he was referring to the death of the concept or idea of God. The statement was meant as a reference to the decline of traditional and metaphysical doctrines that had dominated European thought and culture for centuries.
Nietzsche observed, correctly, that western morality was predicated on the presumption of the truth of Judeo-Christian values. Christianity had become infused in European culture and thought. Philosophers and scientists like Copernicus (1473 – 1543), René Descartes (1596 – 1650), Isaac Newton (1643 – 1727), Saint Thomas Aquinas (1225 – 1274), George Berkeley (1685 – 1753), Saint Augustine (354-430AD), Gottfried Wilhelm Leibniz (1646-1716), and more were all deeply influenced by their belief in God. Culturally, Handel’s (1685 – 1759) Messiah, Da Vinci’s (1452 – 1519) the Last Supper, and Michelangelo’s (1475 – 1564) Statue of David are all infused with religious themes.
The decline of Christianity’s supremacy in society began with the Enlightenment. Science replaced scripture. During this time, the belief in a universe governed by God was replaced by governance through the laws of physics, the divine right to rule was replaced with rule by consent, and morality no longer had to emanate from a loving and omniscient God.
The legacy of the Enlightenment, Nietzsche rightly observed, was that Christianity lost its central place in Western culture. (Of course, it can also be argued that Christianity’s central doctrines and tenets have been so thoroughly absorbed by society that people no longer recognise their influence.) Science, replete with its elaborate depictions of physical reality, ultimately replaced religious truth.
Nietzsche’s assertion is often seen as triumphal or victorious. However, analysis reveals that Nietzsche did not necessarily see the death of God as a good thing. He recognised that as society moved closer to secularisation, the order and meaning religion gave to society would fall by the wayside. People would no longer base their lives on their religious beliefs, but on other factors; their lives would no longer be grounded in anything. As Nietzsche wrote in Twilight of the Idols:
“When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet. This morality is by no means self-evident… Christianity is a system, a whole view of things thought out together. By breaking one main concept out of it, the faith in God, one breaks the whole.”
Nietzsche believed the solution to the problem would be for each of us to create our own, individual values. Christian morality (derided by Nietzsche as ‘slave morality’) would be replaced by ‘master morality.’ Human beings would strive to become Übermenschen, or overmen.
The problem with Nietzsche’s suggestion is that it is virtually impossible to keep society ordered when everyone’s values are different. Furthermore, as Carl Jung (1875 – 1961) pointed out, it is impossible for us to create our own values. Most of us can’t keep our New Year’s resolutions, let alone create a value system that will bring order to society.
Nietzsche, along with the Russian novelist Fyodor Dostoevsky (1821 – 1881), predicted that the 20th century would be characterised either by apocalyptic nihilism or by equally apocalyptic ideological totalitarianism. In the end, the world experienced both. The wake of the Great War (1914 – 1918) saw Europe plagued by communism, fascism, Nazism, and quasi-religious nationalism. In Russia, communism, through which a person’s value was derived from his labour, arose under the Bolsheviks. In Italy, fascism, through which a person’s value was derived from his nationality, arose under Benito Mussolini (1883 – 1945). In Germany, Nazism, through which a person’s value was derived from his race, arose under Adolf Hitler (1889 – 1945). All of these systems attempted to give people’s lives meaning by replacing God with the state.
In the end, the 20th Century would be the deadliest and most destructive in human history. The legacy of two world wars, nuclear weapons, communism, and fascism has been millions of painful and unnecessary deaths. This is what we get when we remove God from society: needless pain and suffering.
The evolutionary biologist E.O. Wilson referred to war as “humanity’s hereditary curse.” It has become infused in our collective and individual psyches. The Iliad tells the story of the Trojan War, Shakespeare’s Henry V is centred around the Battle of Agincourt, and All Quiet on the Western Front tells of the experiences of young German soldiers on the Western Front.
The purposes of war can be split into two fields: the philosophical and the pragmatic. Most modern wars are fought for ideological, and therefore philosophical, reasons: capitalism versus communism, fascism versus democracy, and so forth. Richard Ned Lebow, a political scientist at the University of London, hypothesised that nations go to war for reasons of ‘national spirit.’ Institutions and nation-states may not have psyches per se, but the individuals who run them do, and it is natural for these individuals to project the contents of their psyches onto the institutions and nation-states they are entrusted with.
Rationalists offer another perspective. War, they argue, is primarily used by nations to increase their wealth and power: it allows them to annex new territories, take control of vital resources, pillage, rape, and so forth. Bolshevism arose amid the political instability and food shortages of World War One Russia. The Nazis used the spectre of Germany’s humiliating defeat in the Great War, and its treatment under the Treaty of Versailles, as a stepping stone to political power. In the Ancient World, Sargon of Akkad (2334 – 2279BC) used war to forge the Akkadian Empire, and then to quell invasions and rebellions. Similarly, Philip II of Macedonia (382BC – 336BC) used war to unify the city-states of Ancient Greece.
Another explanation may be that we engage in war because we are naturally inclined to. War speaks to our need for group identity and to our deep predilection for conflict, and it should come as no surprise that the two are not mutually exclusive. Our strong attachment to our own group not only makes us more willing to help other members of that group; it also makes us more willing to commit evil on its behalf. Chimpanzees have been known to invade neighbouring communities of chimps and go on killing sprees, the obvious intention being to increase territory and decrease intra-sexual competition. Similarly, our own evolutionary and primitive past is fraught with violence and conflict. It should not escape our attention that history abounds with examples of invading soldiers slaughtering men and raping women.
Like all the profound aspects of culture, war gives form to a facet of a deeper truth. It has been central to our history and culture, capturing both the more heroic and the more frightening aspects of our individual and collective psyches. We both influence and are influenced by war.
A CBS report has claimed that Iceland has virtually eradicated Down syndrome births through its prenatal screening programs and pro-abortion policies.
According to the CBS report, virtually all Icelandic women whose unborn children test positive for Down syndrome opt to have their pregnancy terminated. Icelandic law allows for abortion after sixteen weeks if the fetus is found to be suffering from a deformity. This includes Down syndrome.
Iceland introduced prenatal screening tests in the early 2000s. While these tests are optional, the Icelandic Government requires that all pregnant women be informed of them. According to the Landspitali University Hospital in Reykjavik, between eighty and eighty-five percent of women opt to undergo the screening. These screenings use the mother's age, ultrasounds, and blood tests to determine the likelihood that a child will be born with a chromosomal abnormality.
Glenn Beck, a conservative political commentator, slammed Iceland's virtual eradication of Down syndrome births as eugenics. "That's eugenics", Beck said, "that is Margaret Sanger's most base dream: get rid of the undesirables. Get rid of people who can't really work for a living, don't really have any quality of life." Likewise, the political humorist Jim Treacher tweeted: "Later in the show, we'll look at the looming Nazi menace. But first: ain't eugenics great?" Similarly, Everybody Loves Raymond star Patricia Heaton tweeted: "Iceland isn't actually eliminating Down syndrome. They're just killing everybody that has it. Big difference."
A counsellor at an Icelandic hospital commented:
We don’t look at abortion as a murder, we look at it as a thing that we ended. We ended a possible life that may have had a huge complication . . . preventing suffering for the child and for the family. And I think that is more right than seeing it as a murder — that’s so black and white. Life isn’t black and white. Life is grey.
Except that in this case, it is. Aborting unborn babies purely because they have Down syndrome (or any other condition, for that matter) is evil. It is nothing more than social cleansing. As Dr. Peter McParland, an obstetrician at the National Maternity Hospital in Dublin, commented:
“In Britain, 90% of babies with Down’s Syndrome are aborted before birth. In Iceland, every single baby, 100% of all those diagnosed with Down’s Syndrome, is aborted. There hasn’t been a baby with Down’s Syndrome born in Iceland in the past five years. Denmark is following suit, and is expected to be ‘Down’s Syndrome-free’ by 2030, and these cold and chilling statistics show us exactly where legal abortion is leading the rest of Europe. Legal abortion is leading us to a ‘Down’s Syndrome-free’ world. I can barely type the words. It is utterly heartbreaking. Little wonder that, in Britain, Lord Shinkwin – a member of the House of Lords who has a congenital disability – last week gave a powerful speech pointing out, ‘the writing is on the wall for people like me. People with congenital disabilities are facing extinction’.”
It is morally repugnant to base a person's right to life on their genetic status or on how 'normal' they are. Every life is sacred and deserves protection, not just the lives of those fortunate enough to be born without problems.