Tag Archives: Wales
Like many people, I was cynically amused to learn that the Duke and Duchess of Sussex were “leaving” the Royal Family. According to an agreement they reached with the Palace on January 18th, they would be free to pursue business opportunities around the world and would “no longer be working members” of the British royal family, though they would lose the right to be styled His or Her Royal Highness.
It’s hardly surprising. Acting, like many art forms, has always attracted the insecure, the sociopathic, and the just plain crazy. And Meghan Markle is an actress. One psychological study found that actors showed significantly higher rates of disordered personality traits than non-actors. The study, which compared 214 professional actors to a cohort of North American non-actors, also found a higher prevalence of anti-social, borderline, narcissistic, schizotypal, and obsessive-compulsive personality traits among actors than among the general population.
People become actors because they like being the centre of attention. They crave the spotlight because it makes them feel validated. The Royal Family, by contrast, performs public service by diverting attention away from themselves and onto the British nation and people. (A role greatly complicated by a news media that treats them as news stories in and of themselves.) Poor Meghan Markle has found herself in a situation where she is not the centre of attention, and she doesn’t like it.
So, what does someone like Meghan Markle do when the spotlight is not on her? Well, the answer in Meghan’s case seems to be: leave the royal family. I will not be at all surprised if Meghan announces some kind of return to acting over the coming year. You cannot turn an actress into a princess any more than you can make a leopard change its spots.
In February 2013, Vassar College student Xialou “Peter” Yu was accused of sexual assault by fellow student Mary Claire Walker. The accusation stemmed from an incident twelve months earlier in which Walker had accompanied Yu back to his dorm room after a party and initiated consensual sex. Walker herself broke off the coitus early. She had decided that it was too soon after ending her relationship with her boyfriend to embark on a sexual relationship with another man. She even expressed remorse for having “led Yu on” and insisted that he had done nothing wrong.
Nevertheless, at some point, Walker decided that she had been sexually assaulted, and Yu was mandated to stand before a college tribunal. At this tribunal, Yu was refused legal representation, had his attempts at cross-examining his accuser repeatedly stymied, and saw potential eyewitness testimony from both his and Walker’s roommates suppressed by the campus gender equality compliance officer, supposedly because they had “nothing useful to offer.” In what can only be described as a gross miscarriage of justice, Yu was found guilty and summarily expelled.
Unfortunately, the kind of show trial that condemned Yu is not entirely uncommon in American colleges and universities (and, like many social diseases, it is starting to infect Australian campuses as well). These trials are the result of years of unchallenged feminist influence on higher education. These institutions have swallowed, hook, line, and sinker, the feminist lie that every single woman who claims to be sexually assaulted must be telling the truth.
The problem begins with those who make public policy. The US Department of Education has been seduced by the ludicrous idea that modern, western societies constitute a “rape culture.” They have bought into the lie that one in five women are sexually assaulted on college campuses, despite the fact that this statistic (which conveniently seems to come up with exactly the same ratio no matter where it’s used) comes from an easily disproven web-based survey.
This survey, which was conducted at two universities in 2006, took only fifteen minutes to complete and drew responses from just 5,466 undergraduate women aged between eighteen and twenty-five. Furthermore, it was poorly formulated: researchers asked women about their experiences and then decided for themselves how many of the respondents had been victims of sexual misconduct.
Regardless, the survey’s lack of credibility did not stop the US Department of Education’s Office for Civil Rights from laying out guidelines for handling reports of sexual misconduct. Among these recommendations was that reports of sexual misconduct should be evaluated on the “preponderance of evidence” standard rather than the more traditional “clear and convincing evidence” standard. This radical shift in the standard of proof means that the accuser only has to show that a sexual assault more likely than not occurred, rather than having to prove it to any higher standard.
It would be an understatement to say the college and university rape tribunals – and the policies that inform them – violate every legal principle and tradition of western law. American colleges and universities have created an environment in which male students can be stigmatised as sexual deviants with little to no evidence aside from an accusation. These tribunals not only violate standards of proof but the presumption of innocence, as well.
That these tribunals have decided to do away with the presumption of innocence should hardly come as a surprise. After all, the very idea of the presumption of innocence is antithetical to human nature. It is natural for human beings to presume that someone is guilty simply because they have been accused of something. As the Roman jurist Ulpian pointed out, the presumption of innocence flies in the face of the seductive belief that a person’s actions always result in fair and fit consequences. People like to believe that someone who has been accused of a crime must have done something to deserve it.
The presumption of innocence is the greatest legal protection the individual has against the state. It means that the state cannot convict anyone unless they can prove their guilt beyond any reasonable doubt. We should be willing to pay any price to preserve it. And we certainly shouldn’t allow extra-legal tribunals to do away with it just to satisfy their ideological proclivities.
Sometimes a civilisation can become so sophisticated that it believes it can overcome truth. We have become one of those civilisations. As a consequence of our arrogance, we have come to believe that we can circumvent some of the most fundamental truths about reality. We blame inequality on the social structure even though most social animals live in hierarchies. We believe that primitive people are noble even though mankind in its primitive state is more violent than at any other stage. And we believe that we can change the way human beings eat despite the fact that it is making us unhappy.
It is our modern obsession over diet and exercise that I would like to focus on. This obsession has arisen from a society that is too safe, too free, and too prosperous for its own good. This is not to say that safety, freedom, and prosperity are bad things. Indeed, we should get down on our knees and thank God every day that we live in a country that has these things. However, it is also true that too much safety, freedom, and prosperity breeds passivity and complacency. The hardships our ancestors faced – war, poverty, disease – are no longer problems for us. Therefore, we lack the meaning that these hardships bring to our life. As a result, we have come to invent problems. Among these has been a tendency to render the consumption of certain food as something unhealthy, unethical, or both.
Our modern obsession with food is causing significant personal problems. On the one hand, the ease with which food, especially food laden with sugar, can be obtained is causing a rise in cases of obesity. (Note: I am using the word ‘obesity’ as a blanket term for people who are overweight.) It is a uniquely modern problem. Our ancestors never battled weight gain because they were only able to find or afford enough food to keep themselves and their families from starving. Now the quantity, cheapness, and, in many cases, poor quality of food means that the fattest amongst us are also often the poorest. But obesity is less a problem that arises out of food and more a problem arising from laziness and gluttony. (Naturally, I am excluding health problems and genetic disorders from this conclusion.)
On the other hand, however, our obsession with being skinny or muscle-bound is also causing problems. I have seen plenty of people who are clearly overweight. In rare cases, I have even seen people who are so morbidly obese that it can only be described as breathtaking. However, I have also seen women (and it is primarily women, by the way) who can only be described as unnaturally thin. It is as though our society, having realised that being overweight is unhealthy, has decided that its opposite must be good. It isn’t. Just right is just right.
And it’s not just individuals who are subjecting themselves to this kind of self-imposed torture. And it’s not limited to people in the here and now, either. In 1998, The Independent reported that many doctors in the United Kingdom were concerned that well-meaning parents were unintentionally starving their children to death by feeding them low-fat, low-sugar diets. These children were said to be suffering from the effects of “muesli-belt nutrition.” They had become malnourished because either they or their parents had become obsessed with maintaining a low-fat, low-sugar, low-salt diet. The article reported: “Malnutrition, once associated with slums, is said to have become an increasing problem for middle-class families in the past fifteen years. The victims of so-called ‘muesli-belt nutrition’ are at risk of stunted growth, anaemia, learning difficulties, heart disease and diabetes.”
Our obsession over diet is really a sign of how well-off our society is. Our ancestors had neither the time nor the resources to adhere to the kind of crazy-strict diets that modern people, in their infinite stupidity, decide to subject themselves to. It is high time we stopped obsessing over food and got a grip.
It has been over fourteen years since David Reimer, the victim of an insane and evil scientific experiment, committed suicide. After his penis had been burnt off in a botched circumcision, David’s parents turned to the infamous sexologist and social constructionist Dr. John Money for help. Following Dr. Money’s advice, David’s parents agreed to allow a sex change operation to be performed on their young son and raised him as a girl.
Despite Dr. Money’s boasting that his experiment had been a success, however, David Reimer did not settle comfortably into his female identity. David tore up his dresses at three, asked if he could have his head shaved like his father, and engaged in all manner of boyish behaviour. David was bullied at school and, upon hitting puberty, decided that he was a homosexual (in reality, of course, he was heterosexual).
Finally, when he was fourteen, David’s parents revealed the truth about his birth. David reverted to his masculine identity, broke off contact with Dr. Money, whom he described as an abusive brainwasher, and received a non-functioning penis through phalloplasty. Unable to cope with the immense psychological damage that had been inflicted upon him, David Reimer blew his brains out with a shotgun at the age of thirty-eight.
For all of human history, boy has meant boy and girl has meant girl. Traditionally, sex was used to refer to the biological markers of gender. If you were born with a penis and an XY chromosome, you were a man. If you were born with a vagina and an XX chromosome, you were a woman. One’s gender expression was thought to complement one’s biological sex. A biological man would have masculine personality traits and a biological female would have feminine personality traits. These complementary characteristics, among them body shape, dress, mannerisms, and personality, were thought to be produced by a mixture of natural and environmental forces.
Recently, however, gender theorists have begun to question the relationship between biological sex and gender identity. They argue that gender, which they see as distinctive from sex, is a social construct. Since gender refers to the expression of masculinity and femininity, gender is something that a person acquires. (Needless to say, this movement is driven by a pernicious post-modern, Neo-Marxist worldview). Under this philosophy, gender expression is the manner in which a person expresses their gender identity. Gender identity is expressed through dress, behaviour, speech, and nothing else besides.
Neuroplasticity provides the gender theorist with perhaps his greatest argument. If underlying brain processes are theoretically strengthened through repetitive use, it follows that gender identity comes from a narrowing down of potential gender categories through the repetitive use of certain brain processes. However, it also reveals a fatal flaw in the gender theorist’s (and social constructionist’s) philosophy. If the human brain is so malleable that an individual’s gender identity is constructed, then why can’t the brain of a transgender person be adapted out of its transgenderism?
The primary problem with gender theory is that it is just plain wrong. The idea that gender is distinct from sex has absolutely no basis in science whatsoever. As Jordan Peterson, the Canadian psychologist, has stated: “the idea that gender identity is independent of biological sex is insane. It’s wrong. The scientific data is clear beyond dispute. It’s as bad as claiming that the world is flat.” Men and women differ at both the cellular and the temperamental level. Unlike men, for example, women menstruate, they can have babies, and they show a slew of personality characteristics that mark them as different from men. David C. Page, the Director of the Whitehead Institute at the Massachusetts Institute of Technology, has even claimed that genetic differences exist at the cellular level, asserting that “throughout human bodies, the cells of males and females are biochemically different.” These differences even affect how men and women contract and fight diseases.
The philosopher Alain de Benoist has also strongly criticised gender theory. De Benoist argued against the scientific errors and philosophical absurdities in his work Non à la théorie de genre (No to Gender Theory).
First, De Benoist pointed out that gender theorists have used the fact that some gender characteristics are socially constructed to argue that all such characteristics are socially constructed.
Second, De Benoist argued that the “hormonal impregnation of the foetus” (as De Benoist puts it) causes the brain to become genderised because it has a “direct effect on the organisation of neural circuits, creating a masculine brain and a feminine brain, which can be distinguished by a variety of anatomical, physiological, and biochemical markers.”
Third, De Benoist argued that biological sex has a profound effect on the way people think, act, and feel. In order to support their theory, gender theorists are forced to deny the natural differences between men and women. De Benoist wrote:
“From the first days of life, boys look primarily at mechanized objects or objects in movement while girls most often search for visual contact with human faces. Only a few hours after birth, a girl responds to the cries of other infants while a boy shows no interest. The tendency to show empathy is stronger in girls than in boys long before any external influence (or “social expectations”) have been able to assert themselves. At all ages and stages of development, girls are more sensitive to their emotional states and to those of others than boys … From a young age, boys resort to physical strategies where girls turn to verbal ones … From the age of two, boys are more aggressive and take more risks than girls.”
Furthermore, gender theory cheapens what it means to be a man or a woman. And, by extension, it denigrates the contributions that each gender has to make to civil society. Gender values give people ideals to strive for and help them determine the rules that govern human interactions. The idea that men and women ought to be treated exactly the same is ludicrous beyond belief. No parent would like to see their son treat a woman the same way he treats his male friends. Men have been taught to be gentlemen and women have been taught to be ladies for a reason.
All of this is not to say, however, that those pushing transgender rights do not have a case. They are right when they claim that the transgender peoples of the world face discrimination, prejudice, and violence. Some countries treat transgenderism as a crime, and it is certainly true that transgender people are more likely to be victims of violence, including murder. A reasonable transgender rights argument would be that transgender people cannot help their affliction and that society ought to treat them with kindness, tolerance, and compassion.
Unfortunately, that is not the argument that gender activists like to make. Rather than focusing on promoting tolerance, gender activists have instead sought to do away with gender distinctions altogether (which is, more likely than not, their actual aim). Using a very tiny minority of the population as their moral basis, the gender activists are attempting to force society to sacrifice its traditional classifications of male and female.
Transgenderism is clearly a mental health disorder. In the past, it was referred to as “gender identity disorder”, considered a mental illness, and treated as such. To assert that transgenderism is a mental health disorder is not a denial of an individual’s integral worth as a human being. It is merely the acknowledgement of the existence of an objective reality in which gender is both binary and distinct. Unfortunately, this is not the attitude of those who influence public opinion. Consequently, programs for LGBTQ youth have seen an increase in youth who identify as transgender. The transgender journalist Libby Down Under has blamed instances of rapid-onset gender dysphoria on the normalisation of transgenderism in the culture. With a slew of celebrities coming out as transgender (former Olympian Bruce Jenner being a primary example), and with transgender characters being featured on numerous television shows, many teens and tweens have suddenly decided that they are transgender despite having no prior history of gender confusion.
Transgender youth increasingly feel that it is their right to express themselves however they please. And they feel that it is their right to silence all who dare to criticise or disagree with that expression. Cross-living, hormone therapy, and sex reassignment surgery are seen as part of this self-expression. Alarmingly, the mainstream response of psychotherapists to these children and adolescents is the “immediate affirmation of [their] self-diagnosis, which often leads to support for social and even medical transition.”
It is a classic case of political posturing overshadowing the pursuit of truth. Most youth suffering from gender dysphoria grow out of their predilection. Dr. James Cantor of the University of Toronto has cited three large-scale studies, along with other smaller studies, to show that transgender children eventually grow out of their gender dysphoria. The Diagnostic and Statistical Manual, 5th Edition claims that desistance rates for gender dysphoria are seventy to ninety percent in “natal males” and fifty to eighty-eight percent in “natal females.” Similarly, the American Psychological Association’s Handbook of Sexuality and Psychology concludes that the vast majority of gender dysphoria-afflicted children learn to accept their gender by the time they have reached adolescence or adulthood.
It is not a secret that transgenderism lends itself to other mental health problems. Forty-one percent of transgender people have either self-harmed or experienced suicidal ideation (this percentage, of course, does not reveal at what stage of transition suicidal ideation or attempts occur). The postmodern, neo-Marxist answer to this problem is that transgender people are an oppressed minority and that they are driven to mental illness as a result of transphobia, social exclusion, bullying, and discrimination.
It is typical of the left to presume that society is to blame for an individual’s suffering. And to a certain extent, they are right. Transgender people are the victims of discrimination, prejudice, and violence. But it is more than likely that these abuses exacerbate their problems rather than cause them. One in eight transgender people, for example, rely on sex work or drug dealing to survive. Is that the fault of society or the fault of the individual? The National Center for Transgender Equality claims that it is common for transgender people to have their privacy violated, to experience harassment and physical and sexual violence, and to face discrimination when it comes to employment. They claim that a quarter of all transgender people have lost their jobs and three-quarters have faced workplace discrimination because of their transgender status.
In Australia, there has been a move to allow transgender children access to hormone-blocking drugs and sex-change surgeries. Australian gender activists – surprise, surprise – support the idea as a way to reduce the rates of suicide among transgender people. The Medical Journal of Australia has approved the use of hormone therapy on thirteen-year-olds despite the fact that the scientific community remained, as of 2018, undecided on whether puberty-blocking drugs are either safe or reversible.
In the United States, a great deal of debate has occurred over transgender rights. In particular, there have been debates over what bathrooms transgender people should be allowed to use, how they should be recognised on official documents, and whether they should be allowed to serve in the military. In 2016, former President Barack Obama ordered public schools to allow transgender students to use whatever bathroom they desire. Similar ordinances have been passed in hundreds of cities and counties across the United States. Seventeen states and the District of Columbia are subject to ‘non-discrimination’ laws which include gender identity and gender expression; these laws cover restrooms, locker rooms, and change rooms.
In March of 2016, North Carolina passed a law which required people in government buildings to use the bathroom appropriate to their biological sex. The US Federal Government decried the decision as bigotry and accused the government of North Carolina of violating the Civil Rights Act. The Federal Government threatened to withhold over US$4 billion in education funding. The government of North Carolina responded by filing suit against the government of the United States, which in turn filed suit against North Carolina. North Carolina received support from Mississippi, Tennessee, and Texas, whilst Washington received support from most of the northern states.
Pro-transgender bathroom policies are not limited to government, however. Many businesses in the United States have similar bathroom policies. Many large corporations, among them Target, allow transgender people to use the bathroom of their choice. And they are perfectly prepared to enforce these policies, as well. A Macy’s employee in Texas was fired after he refused to allow a man dressed as a woman to use the female change rooms. Similarly, Planet Fitness revoked the membership of a woman who complained that a transgender man was in the female change rooms.
The most alarming trend of the gender theory movement is the attempt to indoctrinate children through changes to the education system. In 2013, France unleashed the ABCD de l’égalité (the ABCs of Equality) on six hundred elementary schools. In their own words, the program was designed to teach students that gender was a social construct:
“Gender is a sociological concept that is based on the fact that relations between men and women are socially and culturally constructed. The theory of gender holds that there is a socially constructed sex based on differentiated social roles and stereotypes in addition to anatomical, biological sex, which is innate.”
The creators of the program are smart enough to include the disclaimer: “biological differences should not be denied, of course, but those differences should not be fate.”
Fortunately, it would seem that many people are not taken in by this race to fantasyland. They are not taken in by the idea that the program merely exists to combat gender stereotypes and teach respect, and they have protested. The French Minister of Education dismissed the protestors by saying that they “have allowed themselves to be fooled by a completely false rumour… at school we are teaching little boys to become little girls. That is absolutely false, and it needs to stop.” In America, The Boston Globe dismissed the protests against the program as being motivated by fear. Judith Butler even went as far as to say that France’s financial instability was the true cause of the protests.
And such a profound misuse of the education system isn’t limited to France, either. In Scotland, teachers are given guidance by LGBT Youth Scotland, children are expected to demonstrate “understanding of diversity in sexuality and gender identity”, and children are allowed to identify as either a girl or boy, or neither. The government of the United Kingdom has mandated that transgender issues be taught as part of the sex and relationships curriculum in primary and secondary school. Justine Greening, the education secretary, said: “it is unacceptable that relationships and sex education guidance has not been updated for almost twenty years especially given the online risks, such as sexting and cyberbullying, our children and young people face.”
It is in Australia, however, that there is the most shocking case of gender theory indoctrination. A great deal of controversy has been generated over the Safe Schools program. The program, which was established by the Victorian government in 2010, is supposedly designed to provide a safe, supportive, and inclusive environment for LGBTI students. It states that schools have the responsibility to challenge “all forms of homophobia, biphobia, transphobia, intersexism to prevent discrimination and bullying.”
The Safe Schools program promotes itself as an anti-bullying resource supporting “sexual diversity, intersex and gender diversity in schools.” It requires Victorian schools to eliminate discrimination based on gender identity, intersex, and sexual orientation, including in terms of an inclusive school environment.
The program addresses the issues of sleeping and bathroom arrangements and dress code. In terms of dress code, the program states:
“An inflexible dress code policy that requires a person to wear a uniform (or assume characteristics) of the sex that they do not identify with is likely to be in breach of anti-discrimination legislation including under the Equal Opportunity Act (1984) SA”
Likewise, the program states on the issue of bathrooms and change rooms that “transgender and diverse students should have the choice of accessing a toilet/changeroom that matches their gender identity.” In addition, the program states:
“Schools may also have unisex/gender neutral facilities. While this is a helpful strategy for creating an inclusive school environment for gender diverse students broadly, it is not appropriate to insist that any student, including a transgender student, use this toilet if they are not comfortable doing so.”
The idea that a transgender boy or girl should be allowed to sleep, shower, and defecate in the same place as a group of boys or girls ought to ring alarm bells for everyone. It increases the risk of sexual activity, sexual assault, pregnancy, and the transmission of sexually-transmitted-diseases. There is a reason why schools segregate changerooms, toilets, and dormitories.
The tragedy of David Reimer reveals just how dangerous it is to ignore the truth in favour of a false and malevolent social philosophy. It is one thing to seek tolerance and compassion for those in the community who may be struggling with their identity. It is something else entirely to use the plight of transgender peoples as a means of coercing society into changing the way it categorises gender. And it is completely insane to allow a false philosophy like gender theory to be used as the basis of public policy. If we don’t want more tragedies like David Reimer’s, we should put gender theory out in the trash where it belongs.
Our society appears to be suffering a terminal decline. At least, that is the conclusion traditionalists and devout Christian believers like myself have been forced to draw. As the old world withers and vanishes, a culture of selfishness, moral relativism, and general immorality has been allowed to grow in its place. The culture that produced Vivaldi, Dickens, Shakespeare, and Aristotle has been replaced with one that has as its major ambassadors the likes of Kim Kardashian and Justin Bieber.
The first clue that a monumental change had taken place came in the guise of Princess Diana’s farce of a funeral in 1997, an event cynically exploited by politicians and celebrities and recorded for public consumption by round-the-clock news coverage (her funeral would be watched by two and a half billion people). As Gerry Penny of The Conversation noted, Diana’s death marked the beginning of the ‘mediated death’: a death covered by the mass media in such a way that it attracts as much public attention, and therefore revenue, as possible.
Compared to Princess Diana, Winston Churchill’s funeral in 1965 was a spectacle of old world pomp and ceremony. After lying in state for three days, Churchill’s small coffin was carried by horse-drawn carriage along the historic streets of London to Saint Paul’s Cathedral. His procession was accompanied by Battle of Britain aircrews, royal marines, lifeguards, three chiefs of staff, Lord Mountbatten, and his own family. The silence that filled the air was broken only by a funerary march and the occasional honorary gunshot.
Much like Diana’s funeral, tens of thousands of people came to witness Churchill’s funeral. But unlike Diana’s mourners, who did everything they could to draw attention to themselves, Churchill’s mourners were silent and respectful. They realised, unlike Diana’s mourners, that the best way to commemorate a great man was to afford him the respect that his legacy deserved.
Cynics would dismiss Churchill’s funeral as nothing more than a ridiculous display of pomp and ceremony. However, these events serve an important cultural purpose by connecting the individual with his community, his culture, and his heritage. In doing so, they bring about order and harmony.
Winston Churchill was the great Briton of the 20th century. Like Horatio, Lord Nelson in the early 19th century, Churchill provided the leadership that saved Britain from Nazi invasion, and it was his strength and resolve that gave ordinary Britons the courage to endure the worst periods of the War.
And understandably, many Britons felt something approximating a kind of personal gratitude towards him – a gratitude deep enough that, when he died, many felt it their duty to file reverently past his body lying in state or to stand in respectful silence as his funeral procession passed. What Churchill’s state funeral did was give the ordinary person the opportunity to pay their own respects and feel that they had played a part, if only in a minute way, in the celebration of his life.
Winston Churchill’s funeral and Princess Diana’s funeral represent eras that are as foreign to one another as Scotland is to Nepal. While Churchill’s funeral represented heritage and tradition, Princess Diana’s funeral symbolised mass nihilism and self-centredness.
But why has this happened? I believe the answer lies in the dual decline of Western culture and Christianity.
The French philosopher, Chantal Delsol, described modern Western culture as akin to Icarus had he survived the fall. (Icarus, of course, being the figure in Greek mythology whose wax wings melted when he flew too close to the sun.) Where once it was strong, resolute, and proud, it has become weak, dejected, disappointed, and disillusioned. We have lost confidence in our own traditions and ideals.
Of course, the decline of Western culture has a direct correlation with the more consequential decline of Christianity. It is faith that informs culture and creates civilisation, and the faith that has informed the West has been Christianity. It is the moral ideals rooted in the Judeo-Christian tradition – that I love my neighbour, that my behaviour in this life will determine my fate in the next, that I should forgive my enemies – that form the axiomatic principles that undergird Western civilisation.
This faith has been replaced by an almost reverent belief in globalism, feminism, environmentalism, diversity, equality, and human rights. Our secularism has taught us to regard those who came before us as ignorant, superstitious, and conformist. And what has been the result of this loss of mass religiosity? Mass nihilism and a decline in moral values.
But when faith falls so too does culture and civilisation. If we are to revive our civilisation, we must be prepared to acknowledge that tradition, heritage, and religion are not only integral, but vital.
Margaret Thatcher (1925 – 2013) was a titan of world politics: a conservative heavyweight who effectively championed the conservative ethos in the public sphere and, in doing so, transformed her country for the better.
Margaret Thatcher was born Margaret Hilda Roberts on October 13th, 1925 above a grocer’s shop in Grantham, Lincolnshire. An ambitious and driven student, she won scholarships to Kesteven and Grantham Girls’ School and to Oxford University. After university, Thatcher worked as a research chemist but abandoned chemistry to read for the bar after meeting her husband, Denis Thatcher (1915 – 2003), whom she married in 1951. She qualified as a barrister in 1953 and became the Conservative member for Finchley in 1959.
During her rise to power, Thatcher was not hugely popular. She faced opposition because of her gender – when she was elected she was one of only twenty-four female Parliamentarians (out of more than six hundred members) and, even more unusually, was the mother of twins – and because of her social class, the Conservative Party not having changed its structure since the 19th century. She was often denounced as the “grocer’s daughter”; one Conservative politician even commented that she was “a good-looking woman without doubt, but common as dirt.” In spite of these barriers, Thatcher rose through numerous junior ministerial positions to become shadow education spokeswoman in 1967. She became Secretary of State for Education and Science when Edward Heath (1916 – 2005) became Prime Minister in June 1970, and leader of the Conservative Party in 1975.
Margaret Thatcher was Conservative Prime Minister of Great Britain from 1979 to 1990, and in her time she changed Britain and helped define the era she lived in. Thatcher became Prime Minister after defeating James Callaghan (1912 – 2005) with a seven percent majority. There were many reasons for the Conservative victory, chief among them economic failure and the government’s inability to control the unions. Thatcher was seen as aggressive but also as something of a paradox. She was the first scientist in Downing Street and was enthusiastic about pushing Great Britain’s technological innovations forward, yet she was an anti-counterculture revolutionary who opposed the trade unions and the socialism they represented.
During Thatcher’s first term, however, it was the economy that needed the most attention. By the late 1970s, inflation in Great Britain had peaked at twenty percent due to rising oil prices and wage-push inflation. The once mighty nation had become known as the ‘sick man of Europe’. According to the Organisation for Economic Cooperation and Development, by 1980/81 Britain was suffering from downward trends in employment and productivity. The great industrial cities were in decline. Glasgow, for example, had seen its population fall from 1.2 million following World War One to eight hundred thousand in the early 1980s, and in some areas of the city male unemployment would remain at between sixty and seventy percent throughout the 1980s. Wynne Godley, the director of the Department of Applied Economics, said of the prospect of the 1980s: “it is a prospect so dreadful I cannot really believe there won’t be a sort of political revolution which will demand a basic change to policy.”
Inflation, particularly cost-push inflation, was seen as the biggest enemy, and Thatcher knew that tackling it would require restricting the flow of money and tolerating mass job losses. It was a sacrifice she was willing to make. The government adopted a three-step approach: first, it increased interest rates; second, it reduced the budget deficit by raising taxes and cutting government spending; third, it pursued monetarist policies to control the money supply. Despite great job losses, the economy slowly improved over Thatcher’s first two years in power.
In 1981, however, her policies caused a recession and unemployment peaked at three million. In fact, unemployment would remain a characteristic of the 1980s. Following the recession, Great Britain saw a period of economic growth, with inflation dropping below four percent, although unemployment soared to 3.2 million before easing off a little. It is also of note that despite the mass unemployment, average earnings were in fact rising twice as fast as inflation, and those in employment had it better than ever. The Secretary of State for Transport, David Howell (1936 – ), stated in 1983: “if the conservative revolution has an infantry, it is the self-employed. It is in the growth of the self-employed, spreading out to small family businesses, that the job opportunities of the future are going to come.” Thatcher’s biggest achievement in her first term, and the one which endeared her most to the British public, was the Falklands War. Following the Argentine surrender in 1982, Thatcher stated: “today has put the great back into Britain.” The Falklands War rekindled the British public’s pride in her navy and in the nation itself.
The Conservative Party won the 1983 election by an overwhelming majority, and Thatcher became its uncontested leader and saviour. She used the victory as an opportunity to change the configuration of the Conservative Party and reshape it in her image. She fired the Foreign Secretary, Francis Pym (1922 – 2008), and sent the Home Secretary, William Whitelaw (1918 – 1999), to the House of Lords. Having ended the ancien régime, she refilled the front bench with dedicated Thatcherites. Only one old Etonian remained: Lord Chancellor Hailsham (1907 – 2001), then in his mid-seventies. Thatcher embarked on a policy of privatisation and deregulation with the intention of decreasing dependency on the government and encouraging personal responsibility. Critics accused her of attempting to dismantle the welfare state and of refusing to provide a basic safety net for those down on their luck. Unusually for an opponent of big government, Thatcher abolished the Greater London Council along with six metropolitan councils in an attempt to control local councils from Whitehall.
The Conservatives won the 1987 election with a majority of more than one hundred, despite losing twenty-one seats. Thatcher then focused on social issues, embarking on a seven-step program of social engineering. First, the program encouraged women to stay at home and look after their children rather than join the workforce. Second, it proposed putting the care of the old, the unemployed, and the disabled into the hands of families. Third, it proposed helping parents set up their own schools. Fourth, it proposed supporting schools with a clear moral base, including religious schools. Fifth, it proposed a voucher system to encourage parents to send their children to private schools. Sixth, it proposed training children in the management of pocket money and the setting up of savings accounts. Seventh, it sought to alter the way the public viewed wealth creation so that it would be seen as an admirable pursuit. Thatcher’s tenure as Prime Minister ended when, having failed to secure enough support in the first round of a leadership challenge, she withdrew before the second ballot. She was replaced by John Major (1943 – ).
After leaving office, Thatcher wrote two memoirs: The Downing Street Years (1993) and The Path to Power (1995). She was known as many things, including ‘The Last of the Eminent Victorians’, ‘New Britannia’, and, most famously, ‘The Iron Lady’. Despite her many years in politics and her eleven years as Prime Minister, Thatcher was never a populist, probably because her deep personal convictions were stronger than her fear of the consequences. She did, however, demand and receive respect from the public; satire almost always focused on her husband Denis rather than on her. It is also worth noting that Thatcher never lost an election. As a politician, she revolutionised political debate, transformed the Conservative Party, and altered many aspects of British life that had long been deemed permanent. Paul Johnson (1928 – ), a prominent English journalist, said of Thatcher’s abilities as a politician: “though it is true in Margaret Thatcher’s case, she does have two advantages. She did start quite young. She does possess the most remarkable physical stamina of any politician I’ve come across.” In her time, Thatcher was determined to curb government subsidies to industry and to end the power of the trade unions. She made the trade unions liable for damages if their actions became unlawful and forced the Labour Party to modernise itself. Margaret Thatcher was an impressive and important Prime Minister whose political career and personality helped change Great Britain for the better.
- British Broadcasting Corporation., 2001. Dome Woes Haunt Blair. [Online]
Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/1172367.stm
[Accessed 8 10 2014].
- British Broadcasting Corporation., 2008. 1979: Thatcher Wins Tory Landslide. [Online]
Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393311.stm
[Accessed 10 8 2014].
- British Broadcasting Corporation., 2008. 1983: Thatcher Triumphs Again. [Online]
Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393313.stm
[Accessed 8 10 2014].
- British Broadcasting Corporation., 2008. 1987: Thatcher’s Third Victory. [Online]
Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393315.stm
[Accessed 8 10 2014].
- British Broadcasting Corporation., 2008. 1989: Malta Summit Ends Cold War. [Online]
Available at: http://news.bbc.co.uk/onthisday/hi/dates/stories/december/3/newsid_4119000/4119950.stm
[Accessed 12 10 2014].
- British Broadcasting Corporation., 2008. 1990: Thatcher Quits as Prime Minister. [Online]
Available at: http://news.bbc.co.uk/onthisday/hi/dates/stories/november/22/newsid_2549000/2549189.stm
[Accessed 8 10 2014].
- Chaline, E., 2011. Iron Maiden: Margaret Thatcher. In: History’s Worst Predictions and the People Who Made Them. England: Quid Publishing, pp. 194 – 199.
- Crewe, I and Searing D.D., 1988. Ideological Change in the British Conservative Party. The American Political Science Review, 82(2), pp. 361 – 384.
- Davies, S., 1993. Margaret Thatcher and the Rebirth of Conservatism. [Online]
Available at: http://ashbrook.org/publications/onprin-v1n2-davies/
[Accessed 28 09 2014].
- Elnaugh, R., 2013. Thatcher’s Children: Growing Up in 1980s Britain. [Online]
Available at: http://www.channel4.com/news/thatchers-children-growing-up-in-1980s-britain
[Accessed 5 10 2014].
- Garrett, G., 1992. The Political Consequences of Thatcherism. Political Behaviour, 14(4), pp. 361 – 382.
- Gray, J., 2004. Blair’s Project in Retrospect. International Affairs (Royal Institute of International Affairs 1944 -) , 80(1), pp. 39 – 48.
- Heffer, S., 2013. Kevin Rudd is Just Like Tony Blair. [Online]
Available at: http://www.spectator.co.uk/australia/australia-features/8996621/kevin-rudd-is-just-like-tony-blair/
[Accessed 29 09 2014].
- Jones, M., 1984. Thatcher’s Kingdom: A View of Britain in the Eighties. Sydney: William Collins Pty Ltd.
- King, A., 2002. The Outsider as Political Leader: The Case of Margaret Thatcher. British Journal of Political Science, 32(3), pp. 435 – 454.
- Kirkup J and Prince, R., 2008. Labour Party Membership Falls to Lowest Level Since it was Founded in 1900. [Online]
Available at: http://www.telegraph.co.uk/news/politics/2475301/Labour-membership-falls-to-historic-low.html
[Accessed 8 10 2014].
- Sweet & Maxwell, 2007. Tony Blair’s Legacy: 20% Jump in Amount of Legislation Introduced Per Year. [Online]
Available at: https://www.sweetandmaxwell.co.uk/about-us/press-releases/010607.pdf
[Accessed 8 10 2014].
- Merriam-Webster, 2014. Spin Doctor. [Online]
Available at: http://www.merriam-webster.com/dictionary/spin%20doctor
[Accessed 8 10 2014].
- McSmith, A, Chu, B, Garner, R, and Laurance, J., 2013. Margaret Thatcher’s Legacy: Spilt Milk, New Labour, and the Big Bang – She Changed Everything. [Online]
Available at: http://www.independent.co.uk/news/uk/politics/margaret-thatchers-legacy-spilt-milk-new-labour-and-the-big-bang–she-changed-everything-8564541.html
[Accessed 8 10 2014].
- McTernan, J., 2014. Tony Blair: His Legacy will be Debated But Not Forgotten. [Online]
Available at: http://www.telegraph.co.uk/news/politics/tony-blair/10977884/Tony-Blair-His-legacy-will-be-debated-but-not-forgotten.html
[Accessed 5 10 2014].
- Palmer, A., 1964. Conservative Party. In: The Penguin Dictionary of Modern History. Victoria: Penguin Books, p. 90.
- Palmer, A., 1964. Labour Party. In: The Penguin Dictionary of Modern History 1789 – 1945. Victoria: Penguin Books, pp. 181 – 182.
- Pettinger, T., 2012. UK Economy in the 1980s. [Online]
Available at: http://www.economicshelp.org/blog/630/economics/economy-in-1980s/
[Accessed 5 10 2014].
- Purvis, J., 2013. What was Margaret Thatcher’s Legacy for Women?. Women’s History Review, 22(6), pp. 1014 – 1018.
- Silverman, J., 2007. Blair’s New Look Civil Liberties. [Online]
Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/4838684.stm
[Accessed 8 10 2014].
- Thatcher, M., 1960. Public Bodies (Admission of the Press to Meetings) Bill. [Online]
Available at: http://www.margaretthatcher.org/document/101055
[Accessed 12 10 2014].
- Turner, L., 2011. Chariots of Fire: Tony Blair’s Legacy. [Online]
Available at: http://www.themonthly.com.au/tony-blair-s-legacy-chariots-fire-lindsay-tanner-3183
[Accessed 29 09 2014].
- UK Government., 2014. Baroness Margaret Thatcher. [Online]
Available at: https://www.gov.uk/government/history/past-prime-ministers/margaret-thatcher
[Accessed 29 09 2014].
- UK Government., 2014. Tony Blair. [Online]
Available at: https://www.gov.uk/government/history/past-prime-ministers/tony-blair
[Accessed 29 09 2014].
- Warrell, M., 2013. Margaret Thatcher: An Icon of Leadership Courage. [Online]
Available at: http://www.forbes.com/sites/margiewarrell/2013/04/08/margaret-thatcher-an-icon-of-leadership-courage/
[Accessed 28 09 14].
- Younge, G., 2013. How Did Margaret Thatcher Do It?. [Online]
Available at: http://www.thenation.com/article/173732/how-did-margaret-thatcher-do-it
[Accessed 28 09 2014].
This is our weekly theological article.
If there is any philosophical or moral principle that can be credited with the prosperity of Western capitalist societies, it is the Protestant work ethic. This ethic asserts that a person’s success in this life is a visible sign of their salvation in the next. As a result, the Protestant work ethic encourages hard work, self-reliance, literacy, diligence, frugality, and the reinvestment of profits.
Prior to the Reformation, not much spiritual stock was placed on labour. The Roman Catholic Church placed more value on monastic prayer than on manual work. Much would change when the German monk, Martin Luther (1483 – 1546), nailed his ninety-five theses to the door of All Saints’ Church in Wittenberg. Luther railed against the Catholic Church’s sale of indulgences as a way of avoiding purgatorial punishment. He asserted faith over works, believing that a person could be set right with God through faith alone. It was Luther’s opinion that an individual should remain in the vocation God had called them to and should work to earn an income rather than to accumulate wealth. This belief stood in stark contrast to the Catholic Church’s philosophy that relief from eternal torment came as a Godly reward for good works. By contrast, the second great Protestant reformer, John Calvin (1509 – 1564), believed that faith and hard work were inextricably linked. Calvin’s theory came from his revolutionary idea of predestination, which asserted that only certain people were called into grace and salvation. It is from this that the Protestant work ethic is born.
As a consequence, many Protestants worked hard to prove to themselves that they had been preselected for a seat in heaven. One result of this extreme predilection towards hard work was an increase in economic prosperity.
The French sociologist, Emile Durkheim (1858 – 1917), believed that capitalism was built on a system that encouraged a strong work ethic and delayed gratification. Similarly, the German sociologist, Max Weber (1864 – 1920), argued in The Protestant Ethic and the Spirit of Capitalism (1905) that America’s success boiled down to the Protestant work ethic, the key idea that encouraged individuals to move up the social ladder and achieve economic independence. Weber noted that Protestants – particularly Calvinists – were largely responsible for early twentieth-century business success.
The Protestant work ethic is credited with the United States’ economic and political rise in the 19th and 20th centuries. As the political thinker, Alexis de Tocqueville (1805 – 1859), wrote in Democracy in America (1835):
“I see the whole destiny of America contained in the first Puritan who landed on its shore. They will to their descendants the most appropriate habits, ideas, and mores to make a republic.”
A study in the American Journal of Economics and Sociology found that nations with majority Protestant populations enjoyed higher rates of employment. The economist, Horst Feldmann, analysed data from eighty countries and found that countries with majority Protestant populations – America, the United Kingdom, Denmark, Sweden, and Norway – had employment rates six percent higher than countries where other religious beliefs were practised. (Furthermore, the female employment rate in Protestant countries is eleven percent higher.) Feldmann explained how the legacy of Protestantism led to increased prosperity:
“In the early days, Protestantism promoted the virtue of hard and diligent work among its adherents, who judged one another by conformity to this standard. Originally, an intense devotion to one’s work was meant to assure oneself that one was predestined for salvation. Although the belief in predestination did not last more than a generation or two after the Reformation, the ethic of work continued.”
The Protestant work ethic is one of those Christian ideas that have helped create Western capitalist democracies in all their glory. It is yet another example of the influence that Christianity has had on the modern world.
There has been an alarming trend in modern culture: numerous political and social activist groups have been attempting to use the pernicious and false doctrines of political correctness, tolerance, and diversity to silence those they disagree with. Many of these groups have sought the passage of so-called “hate speech” laws designed to silence voices of dissent.
At public colleges and universities – places where free speech and open debate should be actively encouraged – the measures taken to suppress voices of dissent, including protests, disruption, and, in some cases, outright violence, have become tantamount to government censorship. This censorship prevents students from inviting the speakers they wish to hear and from debating speech they disagree with. Eva Fourakis, the editor-in-chief of The Williams Record (the student newspaper of Williams College), wrote an editorial, later retracted, arguing that “some speech is too harmful to invite to campus.” The editorial went on to say: “students should not face restrictions in terms of the speakers they bring to campus, provided of course that these speakers do not participate in legally recognised forms of hate speech.”
The University of California, Berkeley, is famous for sparking the free speech movement of the 1960s. Today, however, it has become a haven for radical, anti-free-speech Neo-Marxists and social justice warriors. Not only have many Republican students had their personal property destroyed, but numerous conservative speakers have had their talks disrupted and, in some cases, halted altogether. In February, Antifa – so-called anti-fascists – set fires and vandalised buildings during a speech by the controversial journalist, Milo Yiannopoulos (1984 – ). In April, threats of violence aimed at members of the Young America’s Foundation forced the political commentator, Ann Coulter (1961 – ), to cancel her speech. A speech by David Horowitz (1939 – ), founder and president of the David Horowitz Freedom Center, was cancelled after organisers discovered that the event would take place during normal class times (for safety, or so they claimed). Finally, the conservative journalist, Ben Shapiro (1984 – ), was forced to spend US$600,000 on security for his speech at UC Berkeley. These events show that those who wish to use disruption, vilification, threats, and outright violence to silence others can be, and often are, successful.
Like most of the principles of classical liberalism, free speech developed through centuries of political, legal, and philosophical progress. And like many Western ideas, its development can be traced back to the Ancient Greeks. During his trial in Athens in 399BC, Socrates (470BC – 399BC) expressed the belief that the ability to speak was man’s most divine gift. “If you offered to let me off this time on condition I am not any longer to speak my mind,” Socrates stated, “I should say to you, ‘Men of Athens, I shall obey the Gods rather than you.’”
Sixteen hundred years later, in 1215, the Magna Carta became the founding document of English liberty. In 1516, Desiderius Erasmus (1466 – 1536) wrote in The Education of a Christian Prince that “in a free state, tongues too should be free.” In 1633, the astronomer Galileo Galilei was put on trial by the Catholic Church for refusing to retract his claim of a heliocentric solar system. In 1644, the poet John Milton (1608 – 1674), author of Paradise Lost, warned in Areopagitica that “he who destroys a good book kills reason itself.” Following the overthrow of King James II (1633 – 1701) by William III (1650 – 1702) and Mary II (1662 – 1694) in 1688, the English Parliament passed the English Bill of Rights, which guaranteed free elections, regular parliaments, and freedom of speech in Parliament.
In 1789, the French Declaration of the Rights of Man and of the Citizen, an important document of the French Revolution, provided for freedom of speech (needless to say, Robespierre and company were not very good at actually promoting this ideal). The philosopher Voltaire (1694 – 1778) had earlier written, famously: “I detest what you write, but I would give my life to make it possible for you to continue to write.” Over in the United States, in 1791, the First Amendment of the US Bill of Rights guaranteed freedom of religion, freedom of speech, freedom of the press, and the right to assemble:
ARTICLE [I] (AMENDMENT 1 – FREEDOM OF SPEECH AND RELIGION)
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.
During the 19th century, the British philosopher, John Stuart Mill (1806 – 1873) argued for toleration and individuality in his 1859 essay, On Liberty. “If any opinion is compelled to silence”, Mill warned, “that opinion may, for aught we can certainly know, be true. To deny this is to presume our own infallibility.” Mill believed that all doctrines, no matter how immoral or offensive, ought to be given public exposure. He stated in On Liberty:
“If the arguments of the present chapter are of any validity, there ought to exist the fullest liberty of professing and discussing, as a matter of ethical conviction, any doctrine, however immoral it may be considered.”
Elsewhere in On Liberty, Mill warned that the suppression of one voice was as immoral as the suppression of all voices:
“If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.”
In 1948, the Universal Declaration of Human Rights, adopted by the United Nations, urged member states to promote civil, human, economic, social, and political rights – including freedom of expression and religion.
Within the American justice system, numerous Supreme Court cases have created judicial protections for freedom of speech. In National Socialist Party of America v. Village of Skokie (1977), the Supreme Court upheld the right of neo-Nazis to march through a village with a large Jewish population wearing Nazi insignia. The Justices found that the promotion of religious hatred was not a sufficient reason to restrict free speech.
In the city of St. Paul in the early 1990s, a white teenager was arrested under the local “Bias-Motivated Crime Ordinance” after he burnt a cross made from a broken chair (cross-burning being commonly used by the Ku Klux Klan to intimidate African Americans) in the front yard of an African American family. The Supreme Court ruled that the city’s ordinance was unconstitutional. Justice Antonin Scalia (1936 – 2016) noted that the purpose of restricting fighting words was to prevent civil unrest, not to ban the content or message of the speaker’s words. Scalia wrote in R.A.V. v. City of St. Paul (1992):
“The ordinance applies only to ‘fighting words’ that insult, or provoke violence, ‘on the basis of race, colour, creed, religion or gender.’ Displays containing abusive invective, no matter how vicious or severe, are permissible unless they are addressed to one of the specified disfavored topics. Those who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or homosexuality—are not covered. The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects.”
In the Matal v. Tam case (2017), the Supreme Court found that a provision within the Lanham Act prohibiting the registration of trademarks that disparaged persons, institutions, beliefs, or national symbols violated the First Amendment. Justice Samuel Alito (1950 – ) opined:
“[The idea that the government may restrict] speech expressing ideas that offend … strikes at the heart of the First Amendment. Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful; but the proudest boast of our free speech jurisprudence is that we protect the freedom to express ‘the thought that we hate’.”
Justice Anthony Kennedy (1936 – ) opined:
“A law found to discriminate based on viewpoint is an “egregious form of content discrimination,” which is “presumptively unconstitutional.” … A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government’s benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society.”
In recent years, numerous calls to ban speech have been justified on the basis that it is “hateful.” Much of this has come from the political left who (in what one may cynically regard as having more to do with silencing voices of dissent than with protecting vulnerable groups) argue that restrictions on hate speech must occur if minorities are to be given equal status with everyone else.
That certain types of speech can be offensive, and that some of that speech may be aimed at certain groups of people, is undeniable. Hate speech has even been criticised for undermining democracy! In an article, Alexander Tsesis, Professor of Law at Loyola University, wrote: “hate speech is a threatening form of communication that is contrary to democratic principles.” Some have even argued that hate speech violates the fourteenth amendment to the US Constitution which guarantees equal protection under the law:
Article XIV (AMENDMENT 14 – RIGHTS GUARANTEED: PRIVILEGES AND IMMUNITIES OF CITIZENSHIP, DUE PROCESS, AND EQUAL PROTECTION)
1: All persons born or naturalised in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny any person within its jurisdiction the equal protection of the laws.
That there is a historical basis for restricting hate speech is undeniable. Slavery, Jim Crow, and the Holocaust, among other atrocities, were all preceded by violent and hateful rhetoric; indeed, incitement to genocide is considered a serious crime against humanity under international law. However, what proponents of hate speech laws fail to realise is that the countries that perpetrated these atrocities did not extend the freedom to speak to the groups they were targeting. Joseph Goebbels (1897 – 1945), the Nazi minister for public enlightenment and propaganda, for example, had such an iron grip on Germany’s media that any voice contradicting the Nazis’ anti-Semitic propaganda had no opportunity to be heard.
But who, exactly, supports hate speech laws? Analysis of survey data from the Pew Research Center and YouGov suggests that it is primarily non-white, millennial Democrats. In terms of age, the Pew Research Center found that forty percent of millennials supported government censorship of hate speech, compared to twenty-seven percent of gen X-ers, twenty-four percent of baby boomers, and only twelve percent of the silent generation.
In terms of race, research by YouGov reveals that sixty-two percent of African Americans support Government censorship of hate speech, followed by fifty percent of Hispanics, and thirty-six percent of White Americans.
In terms of political affiliation, research from YouGov taken in 2015 found that fifty-one percent of Democrats supported restrictions on hate speech, compared to thirty-seven percent of Republicans, and only thirty-five percent of independents.
The primary issue with hate speech laws is that determining what does and does not constitute hate speech is very difficult. (The cynic may argue, fairly, that hate speech begins whenever a speaker expresses a view, states a fact, or offers an opinion that another person does not want others to hear.) As Christopher Hitchens (1949 – 2011) pointed out, the central problem with banning hate speech is that someone has to decide what it does and does not constitute.
The second issue with hate speech laws is that they can easily be used by one group to silence another. This kind of censorship is often aimed at particular groups of individuals for purely ideological or political purposes, usually with the justification that such actions increase the freedom and equality of the people the advocates claim to represent.
In Canada, Bill C-16 has sought to outlaw “hate propaganda” aimed at members of the community distinguishable by their gender identity or expression. The Bill originated with a policy paper by the Ontario Human Rights Commission which sought to determine what constituted discrimination against gender identity and expression. This included “refusing to refer to a person by their self-identified name and proper personal pronoun.” Supporters of Bill C-16 see it as an important step towards the creation of legal protections for historically marginalised groups. Detractors, however, have expressed concern that the Bill creates a precedent for Government-mandated speech.
The Canadian clinical psychologist and cultural critic, Professor Jordan Peterson (1962 – ), first came to public attention when he posted a series of YouTube videos warning of the dangers of political correctness and criticising Bill C-16. In his videos, Professor Peterson warned that the law could be used to police speech and compel individuals to use ‘transgender pronouns’ (these are terms like ‘ze’ and ‘zir’, among others). For his trouble, Peterson has been accused of violence by a fellow panellist on The Agenda with Steve Paikin, received two warning letters from the University of Toronto in 2016, and was denied a social research grant from Canada’s Social Sciences and Humanities Research Council.
Europe has been experiencing similar attempts to silence speech. A law passed in the Bundestag this year will force social media companies operating in Germany to delete racist or slanderous comments and posts within twenty-four hours or face fines of up to €50 million. Additionally, numerous public figures have found themselves charged with hate speech crimes for merely pointing out the relationship between the large influx of non-European migrants and high crime rates, particularly in terms of rape and terrorism. One politician in Sweden was prosecuted for daring to post immigrant crime statistics on Facebook.
In Great Britain, Freedom of Information documents reveal that around twenty thousand adults and two thousand children have been investigated by the police for comments they made online. In politics, the British politician Paul Weston (1965 – ) found himself arrested after he quoted a passage on Islam written by Winston Churchill (1874 – 1965). In Scotland, a man was charged under the Communications Act 2003 with the improper use of electronic communications after he filmed his dog performing a Nazi salute.
In Australia, the Herald Sun columnist Andrew Bolt (1959 – ) was found to have contravened section 18C of the Racial Discrimination Act after he published articles accusing fair-skinned Aborigines of exploiting their racial status for personal advantage. The law firm Holding Redlich, acting for a group of Aboriginal people, demanded that the Herald Sun retract two Andrew Bolt articles, written in April and August of 2009, and restrain Bolt from writing similar articles in the future. Joel Zyngier, who acted for the group pro bono, told Melbourne’s The Age:
“We see it as clarifying the issue of identity—who gets to say who is and who is not Aboriginal. Essentially, the articles by Bolt have challenged people’s identity. He’s basically arguing that the people he identified are white people pretending they’re black so they can access public benefits.”
Judge Mordecai Bromberg (1959 – ) found that the people targeted by Bolt’s articles were reasonably likely to have been “offended, insulted, humiliated, or intimidated.”
We need speech to be as free as possible because it is free speech that allows us to exchange and critique information. It is through free speech that we are able to keep our politicians and public officials in check, critique public policy, and disseminate information. As the Canadian cognitive psychologist Steven Pinker (1954 – ) observed: “free speech is the only way to acquire knowledge about the world.” Measures taken to restrict free speech, whether the criminalisation of hate speech or any other, contradict the very principles upon which free Western democracies are founded.
This week, for our weekly cultural article, we will be examining David Lean’s (1908 – 1991) 1946 film Great Expectations, considered to be one of the greatest British films ever made. When it was released in 1946, it was met with glowing reviews. Today, more than seventy years later, it has been described by Criterion as “one of the greatest translations of literature into film.”
David Lean’s Great Expectations captures the essence of Charles Dickens’ (1812 – 1870) literary genius by juxtaposing his memorable characters with Lean’s artful direction. Lean’s use of black and white adds to the foreboding and melancholy atmosphere of the film. Then there are the numerous dark, creepy, rundown, and ultimately human locations that burn themselves into the memory: the graveyard where a young Pip first meets the escaped convict Abel Magwitch, the Kentish marshes, Miss Havisham’s dilapidated and macabre home, whose clocks are stopped at the exact time Miss Havisham learnt of her fiancé’s betrayal, 19th century London, Mr. Jaggers’ offices, whose walls are decorated with the death masks of defendants lost to the gallows, the prison where Magwitch dies, and so forth.
Then there is the wealth of memorable characters the film presents to us. The most notable of these is Pip, through whom we see all of the tragedy and injustice of early 19th century England. Pip acts more as an observer of the world around him than as an actual protagonist. We first meet Pip (Anthony Wager, 1932 – 1990) as a young orphan being raised by his overbearing older sister (Freda Jackson, 1907 – 1990) and her kindly blacksmith husband (Bernard Miles, 1907 – 1991). It is during this time that Pip first encounters Abel Magwitch (Finlay Currie, 1878 – 1968), a fearsome yet ultimately decent escaped convict, in the cemetery, and that his heart is first broken by the coquettish Estella (Jean Simmons, 1929 – 2010) and the dishevelled and deranged Miss Havisham (Martita Hunt, 1900 – 1969).
As the film progresses, we see Pip grow into a young man, played by John Mills (1908 – 2005), who, it could be argued, was perhaps a little too old to play Pip in his early twenties. This Pip has been bequeathed a large allowance by an unknown benefactor and travels to London with a view to becoming a gentleman. There he forms a friendship with Herbert Pocket (Alec Guinness, 1914 – 2000), who helps him refine his manners. The adult Pip finds himself corrupted by the pomp and ceremony of the English upper class and is ashamed to admit that he would have paid money to keep Joe Gargery, with his cheap suit and awkward manner, away. Pip is forced to re-examine his views after discovering that his benefactor is none other than the escaped convict Magwitch, who was so struck by Pip’s childhood compassion that he was inspired to make something of himself. Magwitch’s kindness and gratitude cause Pip to regain his lost humanity.
Great Expectations represents a type of film that no longer exists: one that deals entirely with the human condition. These films are no longer made because they do not fit the blockbuster formula. Rather than the larger-than-life heroes of the blockbuster, films about the human condition feature characters who are flawed and suffering, placed in stories that are essentially tragic in nature. It is a far cry from the often over-the-top plots of the modern blockbuster. There is something deeply satisfying about films like Great Expectations, which show normal people to be capable of leading a dignified existence regardless of the tragedy and suffering they are forced to face.
Kofi Annan, the former Secretary-General of the United Nations, has stated that disagreeing with globalism is like disagreeing with “the laws of gravity.” Similarly, the new French President, Emmanuel Macron, another supporter of globalism, wishes to deregulate France’s ailing industry and boost freedom of movement and trade. Donald Trump’s election to the US Presidency and the UK’s decision to leave the European Union, however, have challenged the presumed supremacy of globalism as a political force.
The roots of globalism can be traced back to the 2nd Century BC when the formation of the Silk Road facilitated the trade of silk, wool, silver, and gold between Europe and China. It wasn’t until the 20th century, however, that the idea gathered momentum. Following the Second World War, world power was to be split between America, representing the capitalist west, and the Union of Soviet Socialist Republics, representing the communist east. Following the collapse of the Soviet Union in 1991, America took it upon herself to create an undivided, democratic, and peaceful Europe.
Of course, the aim of an undivided Europe, indeed an undivided world, existed long before the collapse of the Soviet Union. In 1944, Allied delegates met at Bretton Woods, New Hampshire, to establish an economic system based on open markets and free trade. Their idea gathered momentum. Today, the International Monetary Fund, the World Bank, and the World Trade Organization all exist to unite the various national economies of the world into a single, global economy.
In 1950, the French foreign minister, Robert Schuman, proposed pooling the coal and steel production of Western Europe’s nations. Originally, Schuman’s objective had been to unite France with the Federal Republic of Germany. In the end, however, the Treaty of Paris would unite Belgium, France, West Germany, Italy, Luxembourg, and the Netherlands in the European Coal and Steel Community. By 1957, the Treaty of Rome had been used to create the European Economic Community.
Globalism is an ideology which seeks to form a world where nations base their economic and foreign policies on global, rather than national, interests. It can be viewed as a blanket term for various phenomena: the pursuit of classical liberal and free market policies on the world stage, Western dominance over the political, cultural, and economic spheres, the proliferation of new technologies, and global integration.
John Lennon’s Imagine, speaking of ‘no countries’, ‘no religion’, and a ‘brotherhood of man’, acts as an almost perfect anthem for globalism. Your individual views on globalism, however, will depend largely on your personal definition of a nation. If you support globalism it is likely you believe a nation to be little more than a geographical location. If you are a nationalist, however, it is likely you believe a nation to be the accumulation of its history, culture, and traditions.
Supporters of John Lennon’s political ideology seem to suffer from a form of self-loathing. European heritage and culture are not seen as something worth celebrating, but as something to be dismissed. And it appears to be working: decades of anti-nationalist, anti-Western policies have stripped many European nations of their historical and cultural identities. In the UK, there have been calls to remove the statue of Cecil Rhodes – an important, yet controversial figure. In other countries, certain areas have become so rife with ethnic violence that they are considered ‘no-go’ zones.
Perhaps it is the result of “The White Man’s Burden”, Rudyard Kipling’s prophetic 1899 poem about the West’s perceived obligation to improve the lot of non-Westerners. Today, many white, middle-class elites echo Kipling’s sentiments, believing it to be their duty to save the world. These people are told at charity events, at protests, at their universities, and by their media of their obligation to their ‘fellow man.’ When it comes to immigration, they believe it to be their responsibility to save the wretched peoples of the world by importing them, and their problems, to the West.
By contrast, nationalism champions the idea that nations, as defined by a common language, ethnicity, or culture, have the right to form communities based on a shared history and/or a common destiny. The phenomenon can be described as consisting of patriotic feelings, principles, or efforts, an extreme form of patriotism characterised by feelings of national superiority, or the advocacy of political independence. It is primarily driven by two factors: first, feelings of nationhood among members of a nation-state, and, second, the actions of a state in trying to achieve or sustain self-determination. In simplest terms, nationalism constitutes a form of human identity.
One cannot become a citizen of a nation merely by living there. Citizenship arises from the sharing of a common culture, tradition, and history. As the American writer Alan Wolfe observed: “behind every citizen lies a graveyard.” The sociologist Émile Durkheim believed people to be united by their families, their religion, and their culture. In Suicide: A Study in Sociology, Durkheim surmises:
“It is not true, then, that human activity can be released from all restraint. Nothing in the world can enjoy such a privilege. All existence being a part of the universe is relative to the remainder; its nature and method of manifestation accordingly depend not only on itself but on other beings, who consequently restrain and regulate it. Here there are only differences of degree and form between the mineral realm and the thinking person. Man’s characteristic privilege is that the bond he accepts is not physical but moral; that is, social. He is governed not by a material environment brutally imposed on him, but by a conscience superior to his own, the superiority of which he feels.” – Suicide: A Study in Sociology (pg. 277)
Globalism has primarily manifested itself through economic means. In the economic sense, globalism began in the late 19th and early 20th centuries with the invention of the locomotive, the motor-car, the steamship, and the telegraph. Prior to the industrial revolution, a great deal of economic output was concentrated in a handful of countries. China and India combined produced fifty percent of the world’s economic output, whilst Western Europe produced eighteen percent. It was the industrial revolution of the 19th century, and the dramatic growth of industrial productivity it brought, that caused Western Europe’s economic output to double. Today, we experience the consequences of globalism every time we enter a McDonald’s restaurant, call someone on our mobile phones, or use the internet.
Philip Lowe, the Governor of the Reserve Bank of Australia, told a group of businessmen and women at the Sydney Opera House that Australia was “committed to an open international order.” Similarly, the Nobel Prize-winning economist, Amartya Sen, argued that globalisation had “enriched the world scientifically and culturally, and benefited many people economically as well.” It is certainly true that globalisation has facilitated the sharing of technological, cultural, and scientific advances between nations. However, as some economists, like Joseph Stiglitz and Ha-Joon Chang, have pointed out: globalisation can also have the effect of increasing rather than reducing inequality. In 2007, the International Monetary Fund admitted that investment in the foreign capital of developing countries and the introduction of new technologies has had the effect of increasing levels of inequality. Countries with larger populations, lower working and living standards, more advanced technology, or a combination of all three, are in a better position to compete than countries that lack these factors.
The underlying fact is that globalism has economic consequences. Under globalisation, there are few to no restrictions on the movement of goods, capital, services, people, technology, and information. Among the things championed by economic globalisation is the cross-border division of labour, whereby different countries become responsible for different forms of labour.
The United Nations has unrealistically asserted globalism to be the key to ending poverty in the 21st century. The Global Policy Forum, an organisation which acts as an independent policy watchdog of the United Nations, has suggested the imposition of global taxes as a means of achieving this goal. These include taxes on carbon emissions to slow climate change, taxes on currency trading to ‘dampen instability in the foreign exchange markets’, and taxes to support major initiatives like reducing poverty and hunger, increasing access to education, and fighting preventable diseases.
In one sense, the battle between globalism and nationalism can be seen as a battle between ideology and realism. Globalism appears committed to creating a ‘brotherhood of man.’ Nationalism, on the other hand, reminds us that culture and nationality form an integral part of human identity, and informs us that they are sentiments worth protecting. The true value of globalism and nationalism comes not from their opposition, but from how they can be made to work together. Globalism has the economic benefit of allowing countries to develop their economies through global trade. It is not beneficial, however, when it devolves into open-border policies, global taxes, or attacks on a nation’s culture or sovereignty. Nationalism, by the same token, has the benefit of providing people with a national and cultural identity, as well as the benefits and protections of citizenship. Nationalism fails when it becomes so fanatical that it leads to xenophobia or war. The answer, therefore, is not to forsake one for the other, but to reconcile the two.