Tag Archives: 20th century
There is an old adage which states that you do not know how big a tree is until you try to cut it down. Today, as cultural forces slowly destroy it, we are beginning to understand that the same can be said of personal responsibility.
Society no longer believes that people ought to bear their suffering with dignity and grace. Rather, it now believes that the problems of the individual ought to be made the problems of the community. Individual problems are no longer the consequence of individual decisions, but come as the result of race, gender, class, and so forth.
The result of this move towards collective responsibility has been the invention of victim culture. According to this culture, non-whites are the victims of racism and white privilege, women are the victims of the patriarchy, and homosexuals are the victims of a heteronormative society.
The 20th century is a perfect example of what happens when responsibility is taken from the hands of the individual and placed in the hands of the mob. The twin evils of communism and Nazism – which blamed the problems of the individual on economic and racial factors, respectively – led to the deaths of tens of millions of people.
Furthermore, such ideologies led otherwise decent individuals to commit acts of unspeakable violence. Whilst observing the trial of Adolf Eichmann, a former SS officer who had been one of the architects of the Holocaust, the political theorist Hannah Arendt was struck by the “banality of evil” that had characterised German war atrocities. Arendt noted that the men who conspired to commit genocide were not raving lunatics foaming at the mouth, but dull individuals inspired to commit evil by a sense of duty to a toxic and corrupt ideology.
The Bolsheviks taught the Russian people that their misfortune had been caused by the wealthy, whose riches had been gained through theft and exploitation. Likewise, the Nazis convinced the German people that their problems could be blamed on the Jews. It is not difficult to see how this philosophy led, step by step, to the gulags and the concentration camps.
The same thing is happening today. The only difference is that those who preach this philosophy have become more sophisticated. Today people are encouraged to identify with identity groups ranked by so-called social privilege. They are then taught to despise those with more social privilege than themselves.
Under this philosophy, crime is caused not by the actions of the individual but by social forces like poverty, racism, and upbringing. Advocates claim that women should not be forced to take responsibility for their sexual behaviour, and so permit them, in essence, to murder their unborn children. Sexually transmitted diseases like HIV are blamed on homophobia rather than on immoral and socially irresponsible behaviour. And alcoholism and drug addiction are treated as diseases rather than as behaviours the addict is supposed to take responsibility for. The list is endless.
Personal responsibility helps us take control of our lives. It means that the individual can take a certain amount of control over his own life even when the obstacles he is facing seem insurmountable.
No one, least of all me, is going to argue that individuals don’t face hardships that are not their fault. What I am going to argue, however, is that other people will respect you more if you take responsibility for your problems, especially if those problems are not your fault. Charity for AIDS sufferers, the impoverished, or reformed criminals is perfectly acceptable. But we only make their plight worse by taking their personal responsibility from them.
Responsibility justifies a person’s life and helps them find meaning in their suffering. Central to the Christian faith is the idea that individuals are duty bound to bear their suffering with dignity and grace and to struggle towards being a good person. To force a man to take responsibility for himself is to treat him as one of God’s creations.
You cannot be free if other people have to take responsibility for your decisions. When you take responsibility from the hands of the individual you tarnish his soul and steal his freedom.
Freedom from responsibility is slavery, not freedom. Freedom is the ability to make decisions according to the dictates of one’s own conscience and to live with the consequences of those decisions. Freedom means having the choice to engage in the kind of immoral behaviour that leads to an unwanted pregnancy or AIDS. What it does not do is absolve you of responsibility for those actions. Slavery disguised as kindness and compassion is still slavery.
Our society appears to be suffering a terminal decline. At least, that is the conclusion traditionalists and devout Christian believers like myself have been forced to draw. As the old world withers and vanishes, a culture of selfishness, moral relativism, and general immorality has been allowed to grow in its place. The culture that produced Vivaldi, Dickens, Shakespeare, and Aristotle has been replaced by one whose major ambassadors are the likes of Kim Kardashian and Justin Bieber.
The first clue that a monumental change had taken place came in the guise of Princess Diana’s farce of a funeral in 1997: an event that was cynically exploited by politicians and celebrities and recorded for public consumption by round-the-clock news coverage (her funeral would be watched by two and a half billion people). As Gerry Penny of The Conversation noted, Diana’s death marked the beginning of the ‘mediated death’: a death covered by the mass media in such a way that it attracts as much public attention, and therefore revenue, as possible.
Compared to Princess Diana’s, Winston Churchill’s funeral in 1965 was a spectacle of old-world pomp and ceremony. After lying in state for three days, Churchill’s coffin was borne on a gun carriage through the historic streets of London to Saint Paul’s Cathedral. His procession was accompanied by Battle of Britain aircrews, Royal Marines, Life Guards, the three chiefs of staff, Lord Mountbatten, and his own family. The silence that filled the air was broken only by a funerary march and the occasional honorary gunshot.
Much as at Diana’s funeral, tens of thousands of people came to witness Churchill’s. But unlike Diana’s mourners, who did everything they could to draw attention to themselves, Churchill’s mourners were silent and respectful. They realised that the best way to commemorate a great man was to afford him the respect his legacy deserved.
Cynics would dismiss Churchill’s funeral as nothing more than a ridiculous display of pomp and ceremony. However, these events serve an important cultural purpose by connecting the individual with his community, his culture, and his heritage. In doing so, they bring about order and harmony.
Winston Churchill was the great Briton of the 20th century. Just as Horatio Nelson had saved Britain from Napoleonic invasion in the early 19th century, Churchill’s leadership saved Britain from Nazi invasion, and his strength and resolve gave ordinary Britons the courage to endure the worst periods of the War.
And, understandably, many Britons felt something approximating personal gratitude towards him. A gratitude deep enough that when he died, many felt it their duty to file reverently past his body as it lay in state or to stand in respectful silence as his funeral procession passed. Churchill’s state funeral gave ordinary people the opportunity to pay their own respects and to feel that they had played a part, if only in a minute way, in the celebration of his life.
Winston Churchill’s funeral and Princess Diana’s funeral represent eras that are as foreign to one another as Scotland is to Nepal. While Churchill’s funeral represented heritage and tradition, Princess Diana’s funeral symbolised mass nihilism and self-centredness.
But why has this happened? I believe the answer lies in the dual decline of Western culture and Christianity.
The French philosopher Chantal Delsol described modern Western culture as akin to Icarus, had he survived his fall. (Icarus, of course, being the figure in Greek mythology whose wax wings melted when he flew too close to the sun.) Where once our culture was strong, resolute, and proud, it has become weak, dejected, disappointed, and disillusioned. We have lost confidence in our own traditions and ideals.
Of course, the decline of Western culture has a direct correlation with the more consequential decline of Christianity. It is faith that informs culture and creates civilisation, and the faith that has informed the West has been Christianity. It is the moral ideals rooted in the Judeo-Christian tradition – that I love my neighbour, that my behaviour in this life will determine my fate in the next, that I should forgive my enemies – that form the axiomatic principles that undergird Western civilisation.
This faith has been replaced by an almost reverent belief in globalism, feminism, environmentalism, diversity, equality, and human rights. Our secularism has made us believe that those who came before us were ignorant, superstitious, and conformist. And what has the result of this loss of mass religiosity been? Mass nihilism and a decline in moral values.
But when faith falls so too does culture and civilisation. If we are to revive our civilisation, we must be prepared to acknowledge that tradition, heritage, and religion are not only integral, but vital.
Since the Industrial Revolution, scientific and technological development has progressed at an unfathomable rate. In a little over a quarter of a millennium, the Western world has gone from a superstitious, agrarian society to a scientifically and technologically sophisticated one. The price of this remarkable achievement has been our alienation from the ‘dream world.’ We have lost our sense of wonder, our sense that there is something more substantial to existence than mere crude matter.
The lack of spirituality among modern man is largely the result of an overreliance on materialism. For the philosophically challenged, ‘materialism’ is not a reference to consumerism, but to the philosophical position that regards physical matter as the fundamental substance of nature. Philosophical materialism posits that everything, including human thought and the course of history, comes as the result of physical forces. This is a philosophy which has no room for the soul, for divinity, or for God.
Philosophical materialism harkens back to the pre-Socratic philosophers. Within the Western tradition, it is the Ancient Greek philosopher Democritus (c. 460BC – c. 370BC) who is credited with its invention. Democritus formulated the theory that the world was composed of ‘atoms’ – invisible chunks of matter – existing in empty space. He theorised that these microscopically small atoms interact with one another by colliding or hooking together, and that change occurs when their configuration is altered. Later, Epicurus (341BC – 270BC) developed the idea further, holding that the universe consisted of invisible and indivisible atoms falling freely through the void and occasionally swerving to collide with one another.
In modern philosophy, materialism refers to a category of metaphysical theories. The French philosopher Baron d’Holbach (1723 – 1789), in his book Système de la Nature ou Des Loix du Monde Physique et du Monde Moral (1770) (The System of Nature, or the Laws of the Physical and Moral World), argued that everything that occurs, down to human thought and moral action, comes as the result of a causal chain that has its roots in atomic motion. The book was condemned by the French authorities and ordered to be burned by the public hangman.
The modern world likes to see itself as fundamentally materialistic. Being seen as “practical”, “realistic”, or “down to earth” is considered by many to be a great compliment. However, this view is largely mistaken. In reality, it is ideas – our ability to think, feel, and imagine – and our ability to implement them that have truly made the human race what it is. In a letter to Guillaume Gibieuf (c. 1583 – 1650), the French philosopher René Descartes (1596 – 1650) wrote: “I am certain I have no knowledge of what is outside me except by means of the ideas I have within me.”
It would be a great mistake, then, to suppose that human beings are naturally rational or civilised creatures. In reality, people are far more irrational, crazy, and destructive than we like to think. Modern science is only a few hundred years old, having its roots in Francis Bacon (1561 – 1626), René Descartes, and Isaac Newton (1642 – 1727). The basis for modern society is therefore not, as is often supposed, science, but religion. This is evidenced by two facts. First, civilisations have existed for thousands of years with their basis in religion, not science; these societies and their corresponding religions include the Japanese and Shintoism, the Chinese and Buddhism, the Middle East and Islam, and the West and Christianity. Second, numerous anti-science movements (most notable in today’s world are the social constructionists) have come to the public’s attention in recent years. We are able to live in an orderly and rational manner because we live in a society that has moral rules and legal boundaries, not because such order comes naturally to us.
The relationship between the mental and physical worlds was of great interest to the Swiss psychologist, Carl Jung (1875 – 1961). Influenced by the German Idealist School, Jung believed that “metaphysical assertions… are statements of the psyche.” He would comment: “it is the soul which, by the divine creative power inherent in it, makes the metaphysical assertion; it posits the distinction between metaphysical entities. Not only is it the condition of all metaphysical reality, it is that reality.” The central idea behind Jung’s metaphysical system was that:
“The premise that all psychological processes are necessarily conditioned on innate universal structures of subjectivity that allow for human experiences to transpire, and that these processes participate of a greater cosmic organising principle that transcends all levels of particularity or individuality.”
– Jon Mills, Jung’s Metaphysics
In his practice as a psychotherapist, Carl Jung observed that Western men and women often suffered from debilitating feelings of inadequacy, hopelessness, and insignificance. He believed this was caused by a kind of spiritual problem that, even today, threatens the stability and liberty of our society. The result is that we limit ourselves only to what is socially and economically attainable. As Jung wrote:
“Man feels isolated in the cosmos. He is no longer involved in nature and has lost his emotional participation in natural events, which hitherto had symbolic meaning for him. Thunder is no longer the voice of a god, nor is lightning his avenging missile. No river contains a spirit, no tree makes a man’s life, no snake is the embodiment of wisdom and no mountain still harbours a great demon. Neither do things speak to him nor can he speak to things, like stones, springs, plants and animals.”
– Carl Jung, The Earth Has a Soul
Jung noted that this problem, and its consequences, correlated with the declining influence of Christianity in the Western world and with the mass urbanisation that came as a result of the Industrial Revolution. As the individual surrounded himself with more and more people, his feelings of insignificance increased. The result is individuals who are insecure, unstable, and highly suggestible. Furthermore, the rational and scientific mindset that rose to prominence during the Enlightenment has fooled many politicians and social reformers into believing that the same rational and scientific measures can be used to solve social and political problems. The totalitarian systems of fascism and communism, and the genocides and mass murders that characterised the twentieth century, stand as testaments to this reality.
The problems the West faced during the twentieth century were almost entirely spiritual in nature. The communists killed tens of millions of people in an attempt to achieve a workers’ paradise, the Cold War was as much a battle between opposing worldviews as it was one of political and economic differences, and one would have to be blind not to notice the religious overtones present in Nazism. Even today, the conflict between Western civilisation and fundamentalist Islam carries a profound religious dimension.
Jung believed that the unconscious mind could be split into two distinct categories: the personal unconscious and the collective unconscious. The personal unconscious refers to the memories, emotions, and knowledge that were once conscious but have been repressed or forgotten over time. By contrast, the collective unconscious – one of Jung’s most misunderstood concepts – is not a personal acquisition at all; its contents comprise the instincts (Triebe) and the archetypal or primordial images. It symbolises universal culture: the anthropological images, practices, edicts, traditions, mores, social norms, values, and beliefs that embody a culture or mythos. The collective unconscious, therefore, symbolises the space in which human beings exist.
Dreams are considered to have great psychological significance. They use mythological narratives to allow us to naturally express our unconscious fears and desires. The average person dreams between three and six times per night, with each dream lasting between five and twenty minutes. Jeffrey Sumber, a clinical psychologist who has spent years studying dream mythology at Harvard University and Jungian dream interpretation at the Jung Institute in Zurich, Switzerland, argues that dreams bridge the unconscious mind with the conscious mind. “Dreaming is non-essential when it comes to survival as a body”, Sumber concludes, “but is essential with regards to our development and evolution as metaphysical beings.”
Active imagination exists to give a voice to the anima, animus, shadow, and other areas of the personality that do not typically hold our attention. When the individual engages his active imagination, let’s say through painting or writing, there is a transformation of consciousness. As Carl Jung wrote in The Conjunction:
“Although, to a certain extent, he looks on from outside, impartially, he is also an acting and suffering figure in the drama of the psyche. This recognition is absolutely necessary and marks an important advance. So long as he simply looks at the pictures he is like the foolish Parsifal, who forgot to ask the vital question because he was not aware of his own participation in the action. But if you recognise your own involvement you yourself must enter into the process with your personal reactions, just as if you were one of the fantasy figures, or rather, as if the drama being enacted before your eyes were real.”
The collective unconscious manifests itself most clearly through mankind’s proclivity for telling stories. As the clinical psychologist and cultural critic Jordan Peterson (1962 – ) explains, story-telling is an ancient and innate aspect of human nature:
“You know, we’ve been collecting stories as people we don’t know how long – hundred thousand years, maybe. There’s been creatures like us, indistinguishable from us, for a hundred thousand years. And we know that societies that appear more or less as archaic as those old societies tell stories, have rituals, have mythology. What do they mean? What are they good for? Well, imagine this: you tell a story to your husband or your wife about something interesting that you saw. Well, imagine that you could collect a thousand of the most interesting stories. And then imagine that you were some kind of literary genius like Shakespeare and you could take those thousand interesting stories and boil them down to a hundred really interesting stories. And then imagine that you had ten thousand years to gather up those most interesting stories and average them and you could come out with one perfect story: the best story, the most interesting story you could possibly tell. Well, that’s what a myth is. It’s the most interesting story you could possibly tell. Virtually every story you ever see has a mythological structure, that’s why it’s compelling to you. And when you meet someone who is charismatic or who holds your attention or who you’re interested in, the probability that they’re acting out a mythological fragment is very, very high. That’s why it is that your attention is captivated by them.”
Myths are really psychological in nature, even though they are typically misread as biographical or even historical. Much like dreams, myths emanate from unconscious thoughts and emotions and give a voice to the innermost fears and desires that underlie much of our behaviour.
The purpose of mythology is to provide the individual with a mirror he can use to assess himself and his relationship with the wider world. It exists to provide him with a sense of history and of his place within the cosmos. Whereas fiction works in an alternative reality whose facts are taken as irrefutable and correct, mythology takes the metaphorical or metaphysical-cum-spiritual truths of existence and gives them voice and meaning through the medium of a story. It is therefore not how factually true a mythological story may or may not be that matters, but the metaphorical truths it imparts to the reader.
There can be little doubt that the modern world has produced marvels. The price of these remarkable achievements has been a form of perverse arrogance in which modern man likes to believe he is somehow a different, more rational creature than his ancestors. The cost of that arrogance has been the loss of our sense of something more substantial and wondrous than ourselves. As a result, people limit themselves only to that which is socially and economically attainable. Seeing ourselves as eminently rational and pragmatic creatures, we have managed to produce a world where the individual feels worthless and insignificant. What is required is a revitalisation of the dream world: a return to the knowledge that it is ideas, our ability to give a voice to those aspects of our personalities that lie dormant, and our willingness to venture out into the chaotic unknown and return triumphant that make human beings great.
This is our weekly theological article.
If there is any philosophical or moral principle that can be credited with the prosperity of Western capitalist societies, it would have to be the Protestant work ethic. This ethic asserts that a person’s success in this life is a visible sign of their salvation in the next. As a result, the Protestant work ethic encourages hard work, self-reliance, literacy, diligence, frugality, and the reinvestment of profits.
Prior to the Reformation, not much spiritual stock was placed in labour. The Roman Catholic Church placed more value on monastic prayer than on manual work. Much would change when the German monk Martin Luther (1483 – 1546) nailed his ninety-five theses to the door of All Saints’ Church in Wittenberg. Luther railed against the Catholic Church’s sale of indulgences as a means of avoiding purgatorial punishment. He asserted faith over works, believing that a person could be set right with God through faith alone. It was Luther’s opinion that an individual should remain in the vocation to which God had called him, and should work to earn an income rather than to accumulate wealth. This belief stood in stark contrast to the Catholic Church’s teaching that relief from eternal torment could be earned through good works. The second great Protestant reformer, John Calvin (1509 – 1564), went further, believing that faith and hard work were inextricably linked. Calvin’s theory grew out of his revolutionary doctrine of predestination, which asserted that only certain people were called into grace and salvation. It is from this that the Protestant work ethic is born.
As a consequence, many Protestants worked hard to prove to themselves that they had been preselected for a seat in heaven. The result of this extreme predilection for hard work was an increase in economic prosperity.
The French sociologist Emile Durkheim (1858 – 1917) believed that capitalism was built on a system that encouraged a strong work ethic and delayed gratification. Similarly, the German sociologist Max Weber (1864 – 1920) argued in The Protestant Ethic and the Spirit of Capitalism (1905) that America’s success boiled down to the Protestant work ethic: the key idea that encouraged individuals to move up the social ladder and achieve economic independence. Weber noted that Protestants – particularly Calvinists – were largely responsible for early twentieth-century business success.
The Protestant work ethic is credited with the United States’ economic and political rise in the 19th and 20th centuries. As the political thinker Alexis de Tocqueville (1805 – 1859) wrote in Democracy in America (1835):
“I see the whole destiny of America contained in the first Puritan who landed on its shore. They will to their descendants the most appropriate habits, ideas, and mores to make a republic.”
A study in the American Journal of Economics and Sociology found that nations with majority Protestant populations enjoy higher rates of employment. The economist Horst Feldmann analysed data from eighty countries and found that countries with majority Protestant populations – America, the United Kingdom, Denmark, Sweden, and Norway – had employment rates six percent higher than countries where other religions predominate. (Furthermore, the female employment rate in Protestant countries is eleven percent higher.) Feldmann explained how the legacy of Protestantism led to increased prosperity:
“In the early days, Protestantism promoted the virtue of hard and diligent work among its adherents, who judged one another by conformity to this standard. Originally, an intense devotion to one’s work was meant to assure oneself that one was predestined for salvation. Although the belief in predestination did not last more than a generation or two after the Reformation, the ethic of work continued.”
The Protestant work ethic is one of those Christian ideas that have helped create Western capitalist democracies in all their glory. It is yet another example of the influence that Christianity has had on the modern world.
There has been an alarming trend in modern culture: numerous political and social activist groups have been attempting to use the pernicious and false doctrines of political correctness, tolerance, and diversity to silence those they disagree with. Many of these groups have sought the passage of so-called “hate speech” laws designed to silence voices of dissent.
At public colleges and universities, places where free speech and open debate should be actively encouraged, the measures taken to suppress voices of dissent – including protests, disruption, and, in some cases, outright violence – have become tantamount to government censorship. This censorship prevents students from inviting the speakers they wish to hear and from debating speech they disagree with. Eva Fourakis, the editor-in-chief of The Williams Record (the student newspaper of Williams College), wrote an editorial, later recanted, commenting that “some speech is too harmful to invite to campus.” The editorial went on to say: “students should not face restrictions in terms of the speakers they bring to campus, provided of course that these speakers do not participate in legally recognised forms of hate speech.”
The University of California, Berkeley, is famous for sparking the free speech movement of the 1960s. Today, however, it has become a haven for radical, anti-free-speech Neo-Marxists and social justice warriors. Not only have many Republican students had their personal property destroyed, but numerous conservative speakers have had their talks disrupted and, in some cases, halted altogether. In February, Antifa – so-called anti-fascists – set fires and vandalised buildings during a speech by the controversial journalist Milo Yiannopoulos (1984 – ). In April, threats of violence aimed at members of Young America’s Foundation forced the political commentator Ann Coulter (1961 – ) to cancel her speech. A speech by David Horowitz (1939 – ), founder and president of the David Horowitz Freedom Center, was cancelled after organisers discovered that the event would take place during normal class times (for safety, or so they claimed). Finally, the conservative journalist Ben Shapiro (1984 – ) was forced to spend US$600,000 on security for his speech at UC Berkeley. These events show that those who use disruption, vilification, threats, and outright violence to silence others can be, and often are, successful in doing so.
Like most of the principles of classical liberalism, free speech developed through centuries of political, legal, and philosophical progress. And like many Western ideas, its development can be traced back to the Ancient Greeks. During his trial in Athens in 399BC, Socrates (470BC – 399BC) expressed the belief that the ability to speak was man’s most divine gift. “If you offered to let me off this time on condition I am not any longer to speak my mind”, Socrates stated, “I should say to you, ‘Men of Athens, I shall obey the Gods rather than you.’”
Sixteen hundred years later, in 1215, the Magna Carta became the founding document of English liberty. In 1516, Desiderius Erasmus (1466 – 1536) wrote in the Education of a Christian Prince that “in a free state, tongues too should be free.” In 1633, the astronomer Galileo Galilei was put on trial by the Catholic Church for refusing to retract his claim of a heliocentric solar system. In 1644, the poet John Milton (1608 – 1674), author of Paradise Lost, warned in Areopagitica that “he who destroys a good book kills reason itself.” Following the overthrow of King James II (1633 – 1701) by William III (1650 – 1702) and Mary II (1662 – 1694) in 1688, the English Parliament passed the English Bill of Rights, which guaranteed free elections, regular parliaments, and freedom of speech in Parliament.
In 1789, the French Declaration of the Rights of Man and of the Citizen, an important document of the French Revolution, provided for freedom of speech (needless to say, Robespierre and company were not very good at actually upholding this ideal). The philosopher Voltaire (1694 – 1778) had earlier expressed the sentiment famously attributed to him: “I detest what you write, but I would give my life to make it possible for you to continue to write.” Over in the United States, in 1791, the First Amendment of the US Bill of Rights guaranteed freedom of religion, freedom of speech, freedom of the press, and the right to assemble:
ARTICLE [I] (AMENDMENT 1 – FREEDOM OF SPEECH AND RELIGION)
“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”
During the 19th century, the British philosopher John Stuart Mill (1806 – 1873) argued for toleration and individuality in his 1859 essay On Liberty. “If any opinion is compelled to silence”, Mill warned, “that opinion may, for aught we can certainly know, be true. To deny this is to assume our own infallibility.” Mill believed that all doctrines, no matter how immoral or offensive, ought to be given public exposure. He stated in On Liberty:
“If the arguments of the present chapter are of any validity, there ought to exist the fullest liberty of professing and discussing, as a matter of ethical conviction, any doctrine, however immoral it may be considered.”
Elsewhere in On Liberty, Mill warned that the suppression of one voice was as immoral as the suppression of all voices:
“If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.”
Nearly a century later, in 1948, the Universal Declaration of Human Rights, adopted by the United Nations General Assembly, urged member states to promote civil, human, economic, social, and political rights – including freedom of expression and religion.
Within the American justice system, numerous Supreme Court cases have created judicial protections for freedom of speech. In National Socialist Party of America v. Village of Skokie (1977), the Supreme Court upheld the right of neo-Nazis to march, wearing Nazi insignia, through a village with a large Jewish population. The Justices found that the promotion of religious hatred was not a sufficient reason to restrict free speech.
In the city of St. Paul during the early 1990s, a white teenager was arrested under the city’s “Bias-Motivated Crime Ordinance” after he burnt a cross made from a broken chair (cross-burning is commonly used by the Ku Klux Klan to intimidate African Americans) in the front yard of an African American family. The Court ruled that the Ordinance was unconstitutional. Justice Antonin Scalia (1936 – 2016) noted that the purpose of restricting fighting words was to prevent civil unrest, not to ban the content or message of the speaker’s words. Scalia wrote in the case of R.A.V. v. City of St. Paul (1992):
“The ordinance applies only to ‘fighting words’ that insult, or provoke violence, ‘on the basis of race, color, creed, religion or gender.’ Displays containing abusive invective, no matter how vicious or severe, are permissible unless they are addressed to one of the specified disfavored topics. Those who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or homosexuality—are not covered. The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects.”
In the Matal v. Tam case (2017), the Supreme Court found that a provision within the Lanham Act prohibiting the registration of trademarks that disparaged persons, institutions, beliefs, or national symbols violated the First Amendment. Justice Samuel Alito (1950 – ) opined:
“[The idea that the government may restrict] speech expressing ideas that offend … strikes at the heart of the First Amendment. Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful; but the proudest boast of our free speech jurisprudence is that we protect the freedom to express ‘the thought that we hate’.”
Justice Anthony Kennedy (1936 – ) opined:
“A law found to discriminate based on viewpoint is an “egregious form of content discrimination,” which is “presumptively unconstitutional.” … A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government’s benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society.”
In recent years, numerous calls to ban speech have been justified on the basis that it is “hateful.” Much of this has come from the political left who (in what one may cynically regard as having more to do with silencing voices of dissent than with protecting vulnerable groups) argue that restrictions on hate speech must occur if minorities are to be given equal status with everyone else.
That certain types of speech can be offensive, and that some of that speech may be aimed at certain groups of people, is undeniable. Hate speech has even been criticised for undermining democracy. Alexander Tsesis, Professor of Law at Loyola University, has written that “hate speech is a threatening form of communication that is contrary to democratic principles.” Some have even argued that hate speech violates the Fourteenth Amendment to the US Constitution, which guarantees equal protection under the law:
Article XIV (AMENDMENT 14 – RIGHTS GUARANTEED: PRIVILEGES AND IMMUNITIES OF CITIZENSHIP, DUE PROCESS, AND EQUAL PROTECTION)
1: All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.
That there is a historical basis for restricting hate speech is undeniable. Slavery, Jim Crow, and the Holocaust, among other atrocities, were all preceded by violent and hateful rhetoric. (Indeed, incitement to genocide is considered a serious war crime and crime against humanity under international law.) However, what proponents of hate speech laws fail to realise is that the countries that perpetrated these atrocities did not extend the freedom to speak to the groups they were targeting. Joseph Goebbels (1897 – 1945), the Nazi minister for public enlightenment and propaganda, for example, had such an iron grip on Germany’s media that any voice contradicting the Nazis’ anti-Semitic propaganda had no opportunity to be heard.
But who, exactly, supports hate speech laws? Analysis of survey data from the Pew Research Center and YouGov reveals that it is primarily non-white, millennial Democrats. In terms of age, the Pew Research Center found that forty percent of millennials supported government censorship of hate speech, compared to twenty-seven percent of Gen Xers, twenty-four percent of baby boomers, and only twelve percent of the silent generation.
In terms of race, research by YouGov reveals that sixty-two percent of African Americans support government censorship of hate speech, followed by fifty percent of Hispanics, and thirty-six percent of white Americans.
In terms of political affiliation, research from YouGov taken in 2015 found that fifty-one percent of Democrats supported restrictions on hate speech, compared to thirty-seven percent of Republicans, and only thirty-five percent of independents.
The primary issue with hate speech is that determining what does and does not constitute it is very difficult. (The cynic may argue, fairly, that hate speech begins whenever a speaker expresses a view, or states a fact, that another person does not want others to hear.) As Christopher Hitchens (1949 – 2011) pointed out, the central problem is that someone has to make that determination.
The second issue with hate speech laws is that they can easily be used by one group to silence another. Such censorship is often aimed at particular groups of individuals for purely ideological or political purposes, with the justification that restricting their speech increases the freedom and equality of the people the advocates claim to represent.
In Canada, Bill C-16 has sought to outlaw “hate propaganda” aimed at members of the community distinguishable by their gender identity or expression. The Bill originated with a policy paper by the Ontario Human Rights Commission which sought to determine what constituted discrimination against gender identity and expression. This included “refusing to refer to a person by their self-identified name and proper personal pronoun.” Supporters of Bill C-16 see it as an important step towards the creation of legal protections for historically marginalised groups. Detractors, however, have expressed concern that the Bill creates a precedent for government-mandated speech.
The Canadian clinical psychologist and cultural critic Professor Jordan Peterson (1962 – ) first came to public attention when he posted a series of YouTube videos warning of the dangers of political correctness and criticising Bill C-16. In his videos, Professor Peterson warned that the law could be used to police speech and compel individuals to use ‘transgender pronouns’ (terms like ‘ze’ and ‘zer’, among others). For his trouble, Peterson has been accused of violence by a fellow panellist on The Agenda with Steve Paikin, received two warning letters from the University of Toronto in 2016, and was denied a social research grant from Canada’s Social Sciences and Humanities Research Council.
Europe has been experiencing similar attempts to silence speech. A law passed in the Bundestag this year will force social media companies operating in Germany to delete racist or slanderous comments and posts within twenty-four hours or face fines of up to €50 million. Additionally, numerous public figures have found themselves charged with hate speech crimes for merely pointing out the relationship between the large influx of non-European migrants and high crime rates, particularly in terms of rape and terrorism. One politician in Sweden was prosecuted for daring to post immigrant crime statistics on Facebook.
In Great Britain, Freedom of Information documents reveal that around twenty thousand adults and two thousand children have been investigated by the police for comments they made online. In politics, the British politician Paul Weston (1965 – ) found himself arrested after he quoted a passage on Islam written by Winston Churchill (1874 – 1965). In Scotland, a man was charged under the Communications Act 2003 with the improper use of electronic communications after he filmed his dog making a Hitler salute.
In Australia, the Herald Sun columnist Andrew Bolt (1959 – ) was found to have contravened section 18C of the Racial Discrimination Act after he published articles accusing fair-skinned Aborigines of using their racial status for personal advantage. The law firm Holding Redlich, acting for a group of Aboriginal people, demanded that the Herald Sun retract two Andrew Bolt articles, written in April and August of 2009, and restrain Bolt from writing similar articles in the future. Joel Zyngier, who acted for the group pro bono, told Melbourne’s The Age:
“We see it as clarifying the issue of identity—who gets to say who is and who is not Aboriginal. Essentially, the articles by Bolt have challenged people’s identity. He’s basically arguing that the people he identified are white people pretending they’re black so they can access public benefits.”
Judge Mordecai Bromberg (1959 – ) found that the people targeted by Bolt’s articles were reasonably likely to have been “offended, insulted, humiliated, or intimidated.”
We need speech to be as free as possible because it is what allows us to exchange and critique information. It is through free speech that we are able to keep our politicians and public officials in check, critique public policy, and disseminate information. As the Canadian cognitive psychologist Steven Pinker (1954 – ) observed: “free speech is the only way to acquire knowledge about the world.” Measures taken to restrict free speech, whether the criminalisation of hate speech or any other, are a direct contradiction of the principles upon which free Western democracies are founded.
Next Monday will mark fifty-five years since the Cuban Missile Crisis. For thirteen days, the world held its collective breath as tensions between the United States of America and the Union of Soviet Socialist Republics reached boiling point. Whoever averted the crisis would be glorified in the annals of history; whoever escalated it would be responsible for the annihilation of life on earth.
Our story begins in July 1962, when Cuban dictator Fidel Castro (1926 – 2016) and Soviet premier Nikita Khrushchev (1894 – 1971) came to a secret agreement to deter another US-backed invasion attempt (the US had previously backed the disastrous Bay of Pigs operation, and was planning another invasion called ‘Operation Mongoose’) by planting nuclear missiles on Cuban soil. On September 4th, routine surveillance flights discovered the general build-up of Soviet arms, including Soviet IL-28 bombers. President John F. Kennedy (1917 – 1963) issued a public warning against the introduction of offensive weapons in Cuba.
Another surveillance flight on October 14th discovered the existence of medium-range and intermediate-range ballistic nuclear weapons in Cuba. President Kennedy met with his advisors to discuss options and direct a course of action. Opinions seemed to be divided between sending strong warnings to Cuba and the Soviet Union and using airstrikes to eliminate the threat, followed by an immediate invasion. Kennedy chose a third option. He would use the navy to ‘quarantine’ Cuba – a term used to legally distinguish the action from a blockade (an act of war).
Kennedy then sent a letter to Khrushchev stating that the US would not tolerate offensive weapons in Cuba and demanding the immediate dismantling of the sites and the return of the missiles to the Soviet Union. Finally, Kennedy appeared on national television to explain the crisis and its potential global consequences to the American people. Directly echoing the Monroe doctrine, he told the American people: “It shall be the policy of this nation to regard any nuclear missile launched from Cuba against any nation in the Western Hemisphere as an attack by the Soviet Union on the United States, requiring a full retaliatory response upon the Soviet Union.” The Joint Chiefs of Staff then declared a military readiness level of DEFCON 3.
On October 23rd, Khrushchev replied to Kennedy’s letter claiming that the quarantining of Cuba was an act of aggression and that he had ordered Soviet ships to proceed to the island. When another US reconnaissance flight reported that the Cuban missile sites were nearing operational readiness, the Joint Chiefs of Staff responded by upgrading military readiness to DEFCON 2. War involving Strategic Air Command was imminent.
On October 26th, Kennedy complained to his advisors that it appeared only military action could remove the missiles from Cuba. Nevertheless, he continued to pursue a diplomatic resolution. That afternoon, ABC News correspondent, John Scali (1918 – 1995), informed the White House that he had been approached by a Soviet agent who had suggested that the Soviets were prepared to remove their missiles from Cuba if the US promised not to proceed with an invasion. The White House scrambled to determine the validity of this offer. Later that evening, Khrushchev sent Kennedy a long, emotional message which raised the spectre of nuclear holocaust and suggested a resolution similar to that of the Soviet agent: “if there is no intention to doom the world to the catastrophe of thermonuclear war, then let us not only relax the forces pulling on the ends of the rope, let us take measures to untie the knot. We are ready for this.”
Hope was short-lived. The next day Khrushchev sent Kennedy another message demanding the US remove its Jupiter missiles from Turkey as a part of any resolution. That same day, a U2 Spy Plane was shot down over Cuba.
Kennedy and his advisors now planned for an immediate invasion of Cuba. Nevertheless, slim hopes for a diplomatic resolution remained. It was decided to respond to Khrushchev’s first message. In his message, Kennedy suggested possible steps towards the removal of the missiles from Cuba, proposed that the whole business take place under UN supervision, and promised that the US would not invade Cuba. Meanwhile, Attorney General Robert Kennedy (1925 – 1968) met secretly with the Soviet Ambassador to America, Anatoly Dobrynin (1919 – 2010). Attorney General Kennedy indicated that the US was prepared to remove its Jupiter missiles from Turkey, but that this could not be part of any public resolution.
On the morning of October 28th, Khrushchev issued a public statement. The Soviet missiles stationed in Cuba would be dismantled and returned to the Soviet Union. The United States continued its quarantine of Cuba until the missiles had been removed, and withdrew its Navy on November 20th. In April 1963, the US removed its Jupiter missiles from Turkey. The world breathed a sigh of relief.
The Cuban Missile Crisis symbolises both the terrifying spectre of nuclear holocaust and the power of diplomacy in resolving differences. By creating an intolerable situation, the presence of nuclear weapons forced Kennedy and Khrushchev to favour diplomatic, rather than militaristic, resolutions. Ultimately, it must be acknowledged that nuclear weapons, and the knowledge and technology to produce them, will always exist. The answer, therefore, cannot be to rid the world of nuclear weapons, but to learn to live peacefully in a world that has them.
This week for our cultural article, we will be examining Robert Frost’s (1874 – 1963) poem, The Road Not Taken.
First appearing in Frost’s poetry collection, Mountain Interval, in 1916, The Road Not Taken is one of America’s most enduring poems. It has become a part of our cultural lexicon, appearing in numerous films and books, among other media – most notably Dead Poets Society (1989) – as well as in advertisements for Nicorette, Mentos, AIG, Ford, and more.
Robert Lee Frost was born in San Francisco, California, on March 26th, 1874, to William Prescott Frost, Jr. (185- – 1885), a journalist, and Isabella Moodie (1844 – 1900). William Frost died of tuberculosis when Frost was eleven years old. Shortly afterwards, Frost moved with his mother and younger sister, Jeanie, to Lawrence, Massachusetts.
It was during high school that Frost first developed an interest in poetry and literature. In 1892, Frost enrolled at Dartmouth College in Hanover, New Hampshire. He dropped out after only two months and took a series of menial jobs – teacher, cobbler, and editor of the Lawrence Sentinel, among others – to support himself. Later he would attend Harvard University but would drop out due to poor health.
Robert Frost published his first poem, The Butterfly, in the New York newspaper The Independent in 1894. On December 19th, 1895, Frost married Elinor Miriam White (1873 – 1938), with whom he had shared valedictorian honours in high school. Together, the couple would have six children, only two of whom would live to see old age. Elliot Frost, born 1896, died of cholera in 1900. Carol Frost, born 1902, committed suicide in 1940. Marjorie Frost, born 1905, died in childbirth in 1935. Elinor Frost, born 1907, died in infancy. Only Leslie Frost, born 1899, and Irma Frost, born 1903, would live to see old age.
After failing to generate enough income as farmers in New Hampshire, the Frosts emigrated to England in 1912. There Robert Frost befriended, and drew inspiration from, various British poets and writers. Among these were Edward Thomas (1878 – 1917), Rupert Brooke (1887 – 1915), Robert Graves (1895 – 1985), and Ezra Pound (1885 – 1972) – who helped Frost publish and promote his poetry. The Frosts returned to America in 1915. By this time, Robert Frost had published two collections of his poetry: A Boy’s Will, published in 1913, and North of Boston, published in 1914.
By the 1920s, Robert Frost had become the most celebrated poet in America. He received more and more accolades, which included Pulitzer prizes, with every collection of poetry he published.
In 1938, Robert Frost was widowed when his wife, Elinor, lost her battle with breast cancer. He never remarried. Between 1958 and 1959, Frost served as the consultant for poetry at the Library of Congress. Robert Frost died in Boston, Massachusetts, on January 29th, 1963. He was eighty-eight years old.
Kofi Annan, the former Secretary-General of the United Nations, has stated that disagreeing with globalism is like disagreeing with “the laws of gravity.” Similarly, new French President, Emmanuel Macron, another supporter of globalism, wishes to deregulate France’s ailing industry and boost freedom of movement and trade. Donald Trump’s election to the US Presidency, and the UK’s decision to leave the European Union, however, have challenged the presumed supremacy of globalism as a political force.
The roots of globalism can be traced back to the 2nd Century BC when the formation of the Silk Road facilitated the trade of silk, wool, silver, and gold between Europe and China. It wasn’t until the 20th century, however, that the idea gathered momentum. Following the Second World War, world power was to be split between America, representing the capitalist west, and the Union of Soviet Socialist Republics, representing the communist east. Following the collapse of the Soviet Union in 1991, America took it upon herself to create an undivided, democratic, and peaceful Europe.
Of course, the aim for an undivided Europe, indeed an undivided world, existed long before the collapse of the Soviet Union. In 1944, Allied delegates met at Bretton Woods, New Hampshire, to establish an economic system based on open markets and free trade. Their idea gathered momentum. Today, the International Monetary Fund, the World Bank, and the World Trade Organization all exist to unite the various national economies of the world into a single, global economy.
In 1950, the French foreign minister, Robert Schuman, proposed pooling Western Europe’s coal and steel producing countries together. Originally, Schuman’s objective had been to unite France with the Federal Republic of Germany. In the end, however, the Treaty of Paris would unite Belgium, France, West Germany, Italy, Luxembourg, and the Netherlands in the European Coal and Steel Community. By 1957, the Treaty of Rome had been used to create the European Economic Community.
Globalism is an ideology which seeks to form a world where nations base their economic and foreign policies on global, rather than national, interests. It can be viewed as a blanket term for various phenomena: the pursuit of classical liberal and free market policies on the world stage, Western dominance over the political, cultural, and economic spheres, the proliferation of new technologies, and global integration.
John Lennon’s Imagine, speaking of ‘no countries’, ‘no religion’, and a ‘brotherhood of man’, acts as an almost perfect anthem for globalism. Your individual views on globalism, however, will depend largely on your personal definition of a nation. If you support globalism it is likely you believe a nation to be little more than a geographical location. If you are a nationalist, however, it is likely you believe a nation to be the accumulation of its history, culture, and traditions.
Supporters of John Lennon’s political ideology seem to suffer from a form of self-loathing. European heritage and culture are not seen as something worth celebrating, but as something to be dismissed. And it appears to be working: decades of anti-nationalist, anti-Western policies have stripped many European nations of their historical and cultural identities. In the UK, there have been calls to remove the statue of Cecil Rhodes – an important, yet controversial, figure. In other countries, certain areas have become so rife with ethnic violence that they are considered ‘no-go’ zones.
Perhaps it is the result of the “white man’s burden”, the subject of Rudyard Kipling’s 1899 poem about the West’s perceived obligation to improve the lot of non-Westerners. Today, many white, middle-class elites echo Kipling’s sentiments, believing it to be their duty to save the world. These people are told at charity events, at protests, at their universities, and by their media of their obligation to their ‘fellow man.’ When it comes to immigration, they believe it to be their responsibility to save the wretched peoples of the world by importing them, and their problems, to the West.
By contrast, nationalism champions the idea that nations, as defined by a common language, ethnicity, or culture, have the right to form communities based on a shared history and/or a common destiny. The phenomenon can be described as consisting of patriotic feelings, principles, or efforts; an extreme form of patriotism characterised by feelings of national superiority; or the advocacy of political independence. It is primarily driven by two factors: first, feelings of nationhood among members of a nation-state, and second, the actions of a state in trying to achieve or sustain self-determination. In simplest terms, nationalism constitutes a form of human identity.
One cannot become a citizen of a nation merely by living there. Citizenship arises from the sharing of a common culture, tradition, and history. As American writer Alan Wolfe observed: “behind every citizen lies a graveyard.” The sociologist Emile Durkheim believed people to be united by their families, their religion, and their culture. In Suicide: a Study in Sociology, Durkheim surmises:
“It is not true, then, that human activity can be released from all restraint. Nothing in the world can enjoy such a privilege. All existence being a part of the universe is relative to the remainder; its nature and method of manifestation accordingly depend not only on itself but on other beings, who consequently restrain and regulate it. Here there are only differences of degree and form between the mineral realm and the thinking person. Man’s characteristic privilege is that the bond he accepts is not physical but moral; that is, social. He is governed not by a material environment brutally imposed on him, but by a conscience superior to his own, the superiority of which he feels.” – Suicide: a Study in Sociology (pg. 277)
Globalism has primarily manifested itself through economic means. In the economic sense, globalism began in the late 19th and early 20th centuries with the invention of the locomotive, the motor-car, the steamship, and the telegraph. Prior to the industrial revolution, a great deal of economic output was concentrated in certain countries. China and India combined produced fifty percent of the world’s economic output, whilst Western Europe produced eighteen percent. It was the industrial revolution of the 19th century, and the dramatic growth of industrial productivity, which caused Western Europe’s economic output to double. Today, we experience the consequences of globalism every time we enter a McDonald’s restaurant, call someone on our mobile phones, or use the internet.
Philip Lowe, the Governor of the Reserve Bank of Australia, told a group of businessmen and women at the Sydney Opera House that Australia was “committed to an open international order.” Similarly, the Nobel Prize-winning economist Amartya Sen argued that globalisation had “enriched the world scientifically and culturally, and benefited many people economically as well.” It is certainly true that globalisation has facilitated the sharing of technological, cultural, and scientific advances between nations. However, as some economists, like Joseph Stiglitz and Ha-Joon Chang, have pointed out, globalisation can also have the effect of increasing rather than reducing inequality. In 2007, the International Monetary Fund admitted that investment in the foreign capital of developing countries and the introduction of new technologies have had the effect of increasing levels of inequality. Countries with larger populations, lower working and living standards, more advanced technology, or a combination of all three, are in a better position to compete than countries that lack these factors.
The underlying fact is that globalism has economic consequences. Under globalisation, there are few to no restrictions on the movement of goods, capital, services, people, technology, and information. Among the things championed by economic globalisation is the cross-border division of labour, whereby different countries become responsible for different forms of labour.
The United Nations has unrealistically asserted globalism to be the key to ending poverty in the 21st century. The Global Policy Forum, an organisation which acts as an independent policy watchdog of the United Nations, has suggested the imposition of global taxes as a means of achieving this: taxes on carbon emissions to slow climate change, taxes on currency trading to ‘dampen instability in the foreign exchange markets’, and taxes to support major initiatives like reducing poverty and hunger, increasing access to education, and fighting preventable diseases.
In one sense, the battle between globalism and nationalism can be seen as a battle between ideology and realism. Globalism appears committed to creating a ‘brotherhood of man.’ Nationalism, on the other hand, reminds us that culture and nationality form an integral part of human identity, and informs us they are sentiments worth protecting. The true value of globalism and nationalism come not from their opposition, but from how they can be made to work together. Globalism has the economic benefit of allowing countries to develop their economies through global trade. It is not beneficial, however, when it devolves into open-border policies, global taxes, or attacks on a nation’s culture or sovereignty. Nationalism, by the same token, has the benefit of providing people with a national and cultural identity, as well as the benefits and protections of citizenship. Nationalism fails when it becomes so fanatical it leads to xenophobia or war. The answer, therefore, is not to forsake one for the other, but to reconcile the two.
On November 20th, 1945, twenty-four leaders of the defeated Nazi regime filed into Courtroom 600 of Nuremberg’s Palace of Justice to be tried for some of the most reprehensible crimes ever committed. Over the next ten months, the world would be shocked to learn of the depth and extent of the Nazi regime’s mechanised horrors. By the end of the trial, twelve of the defendants would be sentenced to death, seven would be sentenced to periods of imprisonment, and three would be acquitted.
Contrary to what one may believe, the perpetrators of the Holocaust were not psychopaths, sadists, or otherwise psychologically disturbed individuals. Rather, their actions arose, as psychologist Gustave Gilbert (1911 – 1977) concluded, from a culture which valued obedience. The observation that mass-horror is more likely to be committed by normal men and women influenced by social conformity would later be categorised by Hannah Arendt (1906 – 1975) as the ‘banality of evil.’
This shouldn’t come as too much of a surprise. After all, human beings are hard-wired to obey orders from people they deem superior to themselves. In 1961, the Yale psychologist Stanley Milgram (1933 – 1984) carried out a famous experiment exploring the conflict between authority and personal conscience. Milgram’s experiment was inspired by an interview with the Commandant of Auschwitz, Rudolf Höss (1900 – 1947). Höss was asked how it was possible to be directly involved in the deaths of over a million people without suffering emotional distress. Chillingly, Höss answered that he was merely following orders.
The process of the experiment was simple. Two participants, one of whom was actually a researcher, would draw lots to decide who would take the role of teacher and who would take the role of student. (The draw, needless to say, was rigged to ensure the actual participant took the role of teacher.) The teacher and student were then separated, and the teacher was taken to a room with an electric shock generator consisting of a row of switches ranging from fifteen to four hundred and fifty volts. Supervising the teacher was an experimenter in a grey lab coat (in reality, an actor). Throughout the experiment, the teacher was to ask the student questions and administer an electric shock every time the student got a question wrong. As the experiment continued, the student would deliberately give wrong answers. As the shocks got more and more severe, the student would scream and beg for mercy. When the teacher expressed concern, however, the experimenter would insist that the experiment continue. By the end, Milgram had found that all participants would continue to three hundred volts, whilst two-thirds would continue to the full voltage when pressed.
The Nazis were able to create such obedience through a well-calculated propaganda campaign. Hitler outlined the principles of this campaign in Mein Kampf:
- Keep the dogma simple. One or two points only.
- Be forthright and powerfully direct – speak only in the telling or ordering form.
- Reduce concepts down to black and white stereotypes.
- Constantly stir people’s emotions.
- Use repetition.
- Forget literary beauty, scientific reasoning, balance, or novelty.
- Focus solely on convincing people and creating zealots.
- Find slogans which can be used to drive the movement forward.
Hitler’s speeches likewise followed a very specific and calculated formula:
- Hitler would unify the crowd by pointing out some form of commonality.
- Hitler would stir up feelings of fear and anger by pointing out some kind of existential threat.
- Hitler would invoke himself as the agent of a higher power.
- Hitler would present his solution to the problem.
- Hitler would proclaim the utilisation of the solution as a victory for both the higher power and the commoners.
In essence, the Nazi propaganda machine cultivated feelings of group identity and then used conformity to gain control over that group. They gambled that most people preferred belonging to a group to standing alone as individuals.
If there is any lesson to be derived from the Holocaust, it is that the distance between good and evil is shorter than we like to believe. As clinical psychologist Jordan Peterson is fond of pointing out, if the Holocaust was perpetrated by ordinary people and you’re an ordinary person, the only logical conclusion is that you too are capable of horrendous evil. It is not enough to be critical of those in power; eternal vigilance means being critical of our own need to conform and obey. Our freedom depends upon it.
This week for our theological article, we will be examining Friedrich Nietzsche’s (1844 – 1900) infamous statement, “God is dead.”
Friedrich Wilhelm Nietzsche (pronounced ‘knee-cha’) was born in Röcken, near Leipzig, on October 15th, 1844. His father, Karl Ludwig Nietzsche (1813 – 1849), was a Lutheran pastor and former teacher, and his mother was Franziska Oehler (1826 – 1897). The Nietzsche family quickly grew to include a daughter, Elisabeth (1846 – 1935), and another son, Ludwig Joseph (1848 – 1850). Unfortunately, the family would be beset by tragedy. In 1849, when Nietzsche was five years old, Karl Nietzsche suffered a devastating brain haemorrhage and died. Then, as if to rub salt in the family’s wounds, the infant Ludwig Joseph died unexpectedly shortly after.
Nietzsche was educated at the prestigious Schulpforta school near Naumburg, where he received an education in theology, classical languages, and the humanities. After graduating, the young Nietzsche attended the University of Bonn before moving to the University of Leipzig. During his time there, Nietzsche became acquainted with the philosophy of Arthur Schopenhauer (1788 – 1860), whose work The World as Will and Representation (1818) would have a tremendous influence on him. Then, aged only twenty-four, Nietzsche was awarded the position of professor of Greek Language and Literature at the University of Basel in Switzerland. He had never written a doctoral dissertation.
Nietzsche left academia briefly to serve as a medical orderly in the Franco-Prussian War (1870 – 1871), but was discharged due to poor health. He returned to Basel, where he became acquainted with the cultural historian Jacob Burckhardt (1818 – 1897) and the composer Richard Wagner (1813 – 1883). Wagner’s influence on Nietzsche can most readily be seen in The Birth of Tragedy.
During the late 1870s, Nietzsche became increasingly beset with debilitating health problems: digestive troubles, poor eyesight, and migraines. He was forced to spend months away from work, and eventually agreed to retire with a modest pension. Nietzsche was only thirty-four years old.
From there, Nietzsche devoted the rest of his life to the study and writing of philosophy. Between 1870 and 1889, Nietzsche wrote numerous books, including: The Birth of Tragedy (1872), Philosophy in the Tragic Age of the Greeks (1873), Human, All Too Human (1878), The Gay Science (1882), Thus Spake Zarathustra (1883), Beyond Good and Evil (1886), On the Genealogy of Morals (1887), Twilight of the Idols (1888), Ecce Homo (1888), and The Will to Power (compiled from his unpublished manuscripts and published by his sister, Elisabeth, in 1901).
In 1889, in Turin, Italy, Nietzsche suffered a mental breakdown after seeing a horse being flogged in the Piazza Carlo Alberto. In the following days, Nietzsche sent a series of ‘madness letters’ to Cosima Wagner (1837 – 1930) and Jacob Burckhardt in which he signed his name ‘Dionysos’, claimed to be ‘the crucified one’, and asserted that he was the creator of the world. It was quickly agreed that Nietzsche should be brought back to Basel; he was later committed to a clinic in Jena.
In 1890, Nietzsche’s mother, Franziska, brought him home to Naumburg, where she looked after him until her death in 1897. Thereafter, Nietzsche was cared for by his sister, Elisabeth, in Weimar. He died on August 25th, 1900, at the age of fifty-five.
“God is dead” is Nietzsche’s most memorable and provocative statement. (Of course, he wasn’t the first to coin the phrase – that was Heinrich Heine (1797 – 1856); Nietzsche merely philosophised it). It first appeared in The Gay Science in a passage entitled ‘The Parable of the Madman.’ In the parable, the madman asks, ‘where is God?’, only to be informed that God has been killed by man:
“God is dead. God remains dead. And we have killed him. How shall we, murderer of all murderers, console ourselves? That which was holiest and mightiest of all that the world has yet possessed has bled to death under our knives. Who will wipe the blood off us? With what water could we purify ourselves?”
Of course, Nietzsche wasn’t talking about the literal death of God (he was, after all, an atheist). Instead, he was referring to the death of the concept or idea of God. The statement was meant as a reference to the decline of traditional and metaphysical doctrines that had dominated European thought and culture for centuries.
Nietzsche observed, correctly, that Western morality was predicated on the presumption of the truth of Judeo-Christian values. Christianity had become infused in European culture and thought. Philosophers and scientists like Copernicus (1473 – 1543), René Descartes (1596 – 1650), Isaac Newton (1643 – 1727), Saint Thomas Aquinas (1225 – 1274), George Berkeley (1685 – 1753), Saint Augustine (354 – 430 AD), Gottfried Wilhelm Leibniz (1646 – 1716), and more were all deeply influenced by their belief in God. Culturally, Handel’s (1685 – 1759) Messiah, Da Vinci’s (1452 – 1519) The Last Supper, and Michelangelo’s (1475 – 1564) Statue of David are all infused with religious themes.
The decline of Christianity’s supremacy in society began with the Enlightenment, when science replaced scripture. The belief in a universe governed by God gave way to belief in a universe governed by the laws of physics, the divine right to rule was replaced with rule by consent, and morality no longer had to emanate from a loving and omniscient God.
The legacy of the Enlightenment, Nietzsche rightly observed, was that Christianity lost its central place in Western culture. (Of course, it can also be argued that Christianity’s central doctrines and tenets have been so absorbed by society that people no longer recognise their influence). Science, replete with its elaborate depictions of physical reality, ultimately replaced religious truth.
Nietzsche’s assertion is often read as triumphal. Closer analysis, however, reveals that Nietzsche did not necessarily see the death of God as a good thing. He recognised that as society moved closer to secularisation, the order and meaning religion gave to society would fall by the wayside. People would no longer base their lives on their religious beliefs, but on other factors; their lives would not be grounded in anything. As Nietzsche wrote in Twilight of the Idols:
“When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet. This morality is by no means self-evident… Christianity is a system, a whole view of things thought out together. By breaking one main concept out of it, the faith in God, one breaks the whole.”
Nietzsche believed the solution to the problem would be to create our own, individual values. Christian morality (derided by Nietzsche as ‘slave morality’) would be replaced by ‘master morality.’ Human beings would strive to become Übermenschen, or overmen.
The problem with Nietzsche’s suggestion is that it is virtually impossible to keep society ordered when everyone’s values are different. Furthermore, as Carl Jung (1875 – 1961) pointed out, it is impossible for us to create our own values. Most of us can’t keep our New Year’s resolutions, let alone create a value system that will bring order to society.
Nietzsche, along with the Russian novelist Fyodor Dostoevsky (1821 – 1881), predicted that the 20th century would be characterised either by apocalyptic nihilism or by equally apocalyptic ideological totalitarianism. In the end, the world experienced both. The wake of the Great War (1914 – 1918) saw Europe plagued by communism, fascism, Nazism, and quasi-religious nationalism. In Russia, communism, through which a person’s value was derived from his labour, arose under the Bolsheviks. In Italy, fascism, through which a person’s value was derived from his nationality, arose under Benito Mussolini (1883 – 1945). In Germany, Nazism, through which a person’s value was derived from his race, arose under Adolf Hitler (1889 – 1945). All of these systems attempted to give people’s lives meaning by replacing God with the state.
In the end, the 20th century would prove the deadliest and most destructive in human history. The legacy of two world wars, nuclear weapons, communism, and fascism has been millions of painful and unnecessary deaths. This is what we get when we remove God from society: needless pain and suffering.