King Alfred Press

The Problem with Actors and Actresses


Like many people, I was cynically amused to learn that the Duke and Duchess of Sussex were “leaving” the Royal Family. According to an agreement they reached with the Palace on January 18th, they would be free to pursue business opportunities around the world and would “no longer be working members” of the British royal family, though they would lose the right to be styled His or Her Royal Highness.

It’s hardly surprising. Acting, like many art forms, has always attracted the insecure, the sociopathic, and the just plain crazy. And Meghan Markle is an actress. One psychological study found that actors showed significantly higher rates of disordered personality traits than non-actors. The study, which compared 214 professional actors to a cohort of North American non-actors, found a higher prevalence of anti-social, borderline, narcissistic, schizotypal, and obsessive-compulsive personality traits among actors than among the general population.

People become actors because they like being the centre of attention. They crave the spotlight because it makes them feel validated. The Royal Family, by contrast, performs public service by diverting attention away from themselves and onto the British nation and people (a fact greatly undermined by a news media that treats the royals as news stories in and of themselves). Poor Meghan Markle has found herself in a situation where she is not the centre of attention, and she doesn’t like it.

So, what does someone like Meghan Markle do when the spotlight is not on her? Well, the answer in Meghan’s case seems to be: leave the royal family. I will not be at all surprised if Meghan announces some kind of return to acting over the coming year. You cannot turn an actress into a princess any more than you can make a leopard change its spots.

Anti-Catholic Bigotry Masquerades as Common Decency


Last month, the Catholic Archbishop of Brisbane, Mark Coleridge, voiced his opposition to calls for Priests to become mandatory reporters, a move that would destroy the seal of the confessional. Coleridge warned that forcing Priests to break the seal of the confessional would have the effect of turning them into “agents of the state” rather than “servants of God.”

That, of course, is precisely the point. It is beyond doubt that many of the accusations of child abuse levelled against the Church have been well-founded. It is also beyond doubt that the Catholic Church has not always responded to such accusations with the seriousness it ought to have. However, it would be equally true to say that the spectre of child abuse has been used as a pretext for anti-Catholicism.

Of the 409 individual recommendations generated by the Royal Commission into Institutional Responses to Child Sexual Abuse, several are targeted directly at religious institutions (and the Catholic Church specifically). First, it has been recommended that Priests be mandated to report confessions of child abuse. Second, that children’s confessions should occur in a public place where Priest and child can be observed by an adult. Third, that “the Australian Catholic Church should request permission from the Vatican to introduce voluntary celibacy for diocesan clergy.” Fourth, that candidates for religious ministry undergo independent psychological evaluation. And fifth, that “any person in religious ministry who is the subject of a complaint of child sexual abuse which is sustained, or who is convicted of an offence relating to child sexual abuse, should be permanently removed from ministry.”

Such proposals are not only impractical, but dangerous. They would have the effect of not only destroying the seal of the confessional, but of destroying the separation of Church and State. It would give the authorities the power to place the Church under observation and to stack it with clergymen who support their political and social agenda.

Nobody says anything about this blatant disregard for our most basic civil liberties and democratic values. The fact of the matter is that the Catholic Church has always been an easy target. It is neither progressive nor nationalistic, making it a target of condemnation for both the far left and the far right. The far left hates the Catholic Church because it stands in favour of traditionalism. The far right hates the Catholic Church because it sees Catholicism as something akin to fealty to a foreign power.

And like all bigots, anti-Catholics have chosen to single out and destroy a high-profile target. Cardinal George Pell has become a scapegoat for child sex abuse committed within the Catholic Church. The mainstream media has been quick to paint Pell as a power-mad, sexually depraved Cardinal rather than the reformer he actually was.

As Archbishop of Melbourne, Pell was instrumental in instigating investigations into allegations of child abuse and in providing compensation for victims. That, however, made not the slightest difference, nor did the improbability of the accusations. (As Pell’s own defence team pointed out: not only did the security and layout of Melbourne’s Catholic Cathedral render such abuse impossible, but Pell also had no opportunity to commit such crimes.) When he was accused of abusing two boys in the 1990s, Pell’s guilt was assumed for no other reason than that he was a Catholic Archbishop.

Archbishop Mark Coleridge is right to criticise anti-religious measures embedded in the Royal Commission’s report. The reality is that Australia’s modern, secular institutions are focused primarily on destroying the influence of the Catholic Church in Australia. The idea that they care about the safety and well-being of children is patently absurd.

 

JURIES ARE WORTH KEEPING


The jury system is a cornerstone of justice and liberty. It is, however, also controversial. On the one hand, there are those who see the jury system as an integral part of a free and impartial justice system. On the other, there are those who doubt the jury’s ability to deliver fair and honest verdicts.

Let’s start with the obvious fact that juries are far from perfect. They are imperfect because the people who make them up are imperfect. Ignorance is one major problem. Opponents of the jury system argue, with some justification, that it is too dangerous to place the fate of another human being in the hands of people incapable of understanding the complexities of the cases they are judging. Often those tasked with deciding the outcome of cases lack the technical or legal knowledge to adequately interpret the evidence and testimony being presented to them. It has been suggested that in these cases individual jurors will often resort to pre-conceived beliefs or allow themselves to be influenced by jurors with more knowledge – whether real or perceived – than they have.

Ignorance, however, is an easily solved problem. Why not select jury members based on their familiarity with the subject matter under discussion? Someone who works in the finance industry – a banker, financial advisor, or accountant, say – would be better equipped to judge financial crimes than the layperson.

Then there’s the question of who can sit on a jury. In the United Kingdom, an individual must be aged between eighteen and seventy, have been a resident of the UK for at least five years since the age of thirteen, and be mentally capable of serving on a jury. It is more than reasonable to suggest that the qualifications for jury duty ought to be more stringent than they are: the minimum age could be raised from eighteen to perhaps twenty-five (if not older), and jurors under the age of forty could be required to hold certain intellectual qualifications. This would ensure that those tasked with determining guilt or innocence have the wisdom and intelligence to comprehend the grave nature of the responsibility with which they have been burdened.

Those who criticise juries also argue that they are prone to bias and prejudice. In one shocking case, Kasim Davey was jailed for contempt when he boasted: “I wasn’t expecting to be in a jury deciding a paedophile’s fate. I’ve always wanted to fuck up a paedophile and now I’m within the law.” (Seemingly it never occurred to Mr. Davey that the man he was judging may have been innocent). Likewise, it is well known that many African American defendants were condemned by all-white juries in the Jim Crow South.

However, much of this is a red herring. Professor Cheryl Thomas, director of the Jury Project at University College London, spent ten years analysing every jury verdict in England and Wales, taking into account the race and gender of both defendants and jurors. Professor Thomas concluded that:

“There’s no evidence of systematic bias, for instance, against members of ethnic minorities, or that men are treated differently than women, that if you live in a particular part of the country or you have a certain background that you’re more likely to be convicted than others.”

Besides, those who criticise the jury system forget that juries reflect the values and principles of their society. If juries repeatedly deliver unjust verdicts, it is because there is a sickness in that society. The fact that all-white juries tended to convict African American defendants merely because they were black is a reflection of the virulently racist nature of that society, not of the jury system itself. Today, the legal system is careful to disqualify jurors who may harbour prejudices that would inhibit their ability to judge the facts impartially. Courts are quick to disqualify jurors who know the defendant or alleged victim, those with emotional links to the case (e.g. a victim of rape sitting on the jury of a rape trial), and so forth.

Lord Devlin, the second-youngest man to be appointed to the English High Court in the 20th century, once described the jury system as “the lamp which shows where freedom lives.” The principle behind juries is that the individual ought to be judged by his peers based on community standards, not by the politically elite. Without juries, our legal system would be dominated by judges and lawyers. What lies at the centre of the debate over juries is the question of whether the whole of society or just the elite should be involved in the dispensation of justice.

The Presumption of Innocence is Worth Protecting No Matter What the Cost


Jemma Beale was sentenced to ten years’ imprisonment after it was found she had made repeated false rape allegations.

In February 2013, Vassar College student Xiaolu “Peter” Yu was accused of sexual assault by fellow student Mary Claire Walker. The accusation stemmed from an incident twelve months earlier in which Walker had accompanied Yu back to his dorm room after a party and initiated consensual sex. Walker herself broke off the encounter early, having decided that it was too soon after ending her relationship with her boyfriend to embark on a sexual relationship with another man. She even expressed remorse for having “led Yu on” and insisted that he had done nothing wrong.

Nevertheless, at some point Walker decided that she had been sexually assaulted, and Yu was mandated to stand before a college tribunal. At this tribunal, Yu was refused legal representation, had his attempts at cross-examining his accuser repeatedly stymied, and saw potential eyewitness testimony from both his own and Walker’s roommates suppressed by the campus gender-equality compliance officer, supposedly because they had “nothing useful to offer.” In what can only be described as a gross miscarriage of justice, Yu was found guilty and summarily expelled.

Unfortunately, the kind of show trial that condemned Yu is not uncommon in American colleges and universities (and, like many social diseases, is starting to infect Australian campuses as well). Such trials are the result of years of unchallenged feminist influence on higher education. These institutions have swallowed, hook, line, and sinker, the feminist lie that every single woman who claims to have been sexually assaulted must be telling the truth.

The problem begins with those who make public policy. The US Department of Education has been seduced by the ludicrous idea that modern, western societies constitute a “rape culture.” It has bought into the lie that one in five women are sexually assaulted on college campuses, despite the fact that this statistic (which conveniently seems to come up with exactly the same ratio no matter where it’s used) comes from an easily disproven web-based survey.

This survey, which was conducted at two universities in 2006, took only fifteen minutes to complete and drew responses from just 5,466 undergraduate women aged between eighteen and twenty-five. Furthermore, it was poorly formulated: researchers asked women about their experiences and then decided for themselves how many of the respondents had been victims of sexual misconduct.

Regardless, the survey’s lack of credibility did not stop the US Department of Education’s Office for Civil Rights from laying out guidelines for handling reports of sexual misconduct. Among these recommendations was that reports of sexual misconduct should be evaluated on the “preponderance of evidence” standard rather than the more traditional “clear and convincing evidence” standard. This radical lowering of the standard of proof means that an accuser only has to show that a sexual assault more likely occurred than not, rather than having to prove it to any higher standard.

It would be an understatement to say the college and university rape tribunals – and the policies that inform them – violate every legal principle and tradition of western law. American colleges and universities have created an environment in which male students can be stigmatised as sexual deviants with little to no evidence aside from an accusation. These tribunals not only violate standards of proof but the presumption of innocence, as well.

That these tribunals have decided to do away with the presumption of innocence should hardly come as a surprise. After all, the very idea of the presumption of innocence is antithetical to human nature. It is natural for human beings to presume that someone is guilty simply because they have been accused of something. As the Roman jurist Ulpian recognised, the presumption of innocence flies in the face of the seductive belief that a person’s actions always result in fair and fitting consequences. People like to believe that someone who has been accused of a crime must have done something to deserve it.

The presumption of innocence is the greatest legal protection the individual has against the state. It means that the state cannot convict anyone unless they can prove their guilt beyond any reasonable doubt. We should be willing to pay any price to preserve it. And we certainly shouldn’t allow extra-legal tribunals to do away with it just to satisfy their ideological proclivities.

Contemporary Arrogance is the Perfect Fodder for Human Evil


At this present moment, three Australians are sitting in Iranian prisons. Kylie Moore-Gilbert, Jolie King, and Mark Firkin have all been charged (and, in Kylie Moore-Gilbert’s case, convicted) with espionage. Jolie King and Mark Firkin have been accused of flying a drone over a military installation without a permit, whilst the precise charges against Kylie Moore-Gilbert remain unclear.

To say that Jolie King and Mark Firkin were naïve would be an understatement. The couple, who raise money for their global adventures on Patreon, stated on their vlog that their ambition is to “inspire anyone wanting to travel and also to try to break the stigma of travelling to countries which get a bad rap in the media.”

Some countries have a bad reputation for a reason, a fact Jolie King and Mark Firkin seemed unwilling to comprehend. Iran, in particular, has a bad reputation for political repression, human rights violations, and corruption. Iran has been noted for using excessive violence against political dissidents, suppressing the media, carrying out arbitrary arrests, and using inhumane punishments.

No wonder Amnesty International has stated that the human rights situation in Iran has “severely deteriorated.” Iranian prisoners lack access to adequate medical care, trials can hardly be described as fair, and confessions obtained under torture are freely admitted in court. It was even reported in June 2018 that defendants accused of breaching Iran’s national security laws were being forced to choose from a list of just twenty state-approved lawyers.

There is nothing new about Jolie King and Mark Firkin. History is filled with people who deny the existence of evil, and many of them have paid the ultimate price. Jay Austin and Lauren Geoghegan claimed in their blog that “evil is a make-believe concept we’ve invented to deal with the complexities of fellow humans holding values and beliefs and perspectives that are not our own.” This beautiful sentiment didn’t stop them being stabbed to death by Islamic State jihadists in Tajikistan.

A large part of this problem comes from the social disease of moral relativism. We have lived with peace, prosperity, and freedom for so long that we’ve forgotten what it is like not to have them. Our complacency has led us to believe that all moral beliefs are equally valid. And it has led us to believe that there is no such thing as evil.

The problem with moral relativism is that it is not true. Actions have consequences, and some consequences just happen to be bad. Saying that all moral beliefs are equally valid is no different from saying that one cannot make judgements about the behaviour of others because there is no absolute standard of good and evil. It’s a rather convenient argument for people who are doing the wrong thing and know it.

There are two fundamental problems with moral relativism. The first is that it is a self-defeating argument: by saying that there is no absolute morality, you are, in fact, making an absolute claim. The second is that hardly anyone actually believes that morality is relative. If they did, they would regard rape and murder as being just as acceptable as charity and kindness.

Rather, people use moral relativism to justify their own immoral behaviour. It gives people an easy way out by allowing them to behave in whatever manner they please without moral justification. And this, when you think about it, is precisely what people want: the freedom to do whatever they please without having to feel guilty about it.

Socially progressive people like to see themselves as so sophisticated that they can do away with good and evil. Jolie King and Mark Firkin bought into such a worldview. They now find themselves sitting in Iranian prisons for their troubles. Such is the price of modern arrogance.

The Celebration of Ignorance


One of the great joys of my life is watching speeches and interviews given by great intellectuals. It was in pursuing this pleasure that I happened upon an episode of the ABC’s panel discussion show, Q&A. Broadcast from the Festival of Dangerous Ideas, the episode saw the four panellists – the traditional conservative Peter Hitchens; the feminist writer Germaine Greer; the American writer Hanna Rosin; and the gay rights activist Dan Savage – spend an hour discussing topics ranging from western civilisation to modern hook-up culture.

It quickly became apparent that the intellectual stature of the four panellists was not evenly matched. Hanna Rosin and Dan Savage were less rational, less mature, and more ignorant than Peter Hitchens and Germaine Greer. By comparison, Hitchens and Greer gave carefully considered answers to most of the questions asked. Hitchens, in particular, gave responses grounded in careful consideration, rational thought, fact, and wisdom. (This is not to say one is required to agree with him.)

It was the behaviour of the audience, however, that proved the most alarming. Like most Q&A audiences, it was comprised mostly of idealistic left-wing youth. Their primary purpose for being there was to have their ideological presuppositions reinforced. With no apparent motivation to listen to the answers to their questions, these youngsters would clap and cheer like trained seals whenever someone made an ideologically correct statement.

How has our society become so stupid? Why do we no longer see being wise and knowledgeable as virtues in and of themselves? Part of the answer comes from a culture of self-hate and contempt promulgated by left-wing intellectuals. Accordingly, Christianity is regarded as archaic (unless, of course, it promotes left-wing beliefs), inequality is caused by capitalism, and the problems of women come as the result of the “patriarchy.” Even the Killing Fields of the Khmer Rouge are rather conveniently blamed on “trauma” emanating from the Vietnam War (rather than the actions of Pol Pot and his band of murderous, communist brutes).

This continuous, unrelenting assault on Western civilisation has led to a general estrangement from Western culture. The common people have been robbed of their inheritance because scholars and intellectuals have reduced their culture into a caricature to be dismantled at will. As a result, they are no longer exposed to the great works of art, architecture, literature, music, philosophy, poetry, sculpture, theology, and theatre that the Western world has produced.

The modern proclivity for ignorance and stupidity comes out of a very special kind of arrogance. It is the kind of arrogance that makes people believe that all those who came before them must be dumber than they are. It does not acknowledge that our modern “enlightenment” is built on the works of those who came before us. Our forebears would be dumbfounded to find a world where, despite having greater access to information than anyone else in history, people have closed their minds to learning.

What all this boils down to is a rejection of wisdom. If you believe that all those who came before you are dumber than yourself you are unlikely to believe they have anything worthwhile to contribute. As such, you are unlikely to believe in wisdom as a universal good. As Neel Burton over at Psychology Today pointed out: “in an age dominated by science and technology, by specialisation and compartmentalisation, it [wisdom] is too loose, too grand, and too mysterious a concept.”

We have made phenomenal advancements in all areas of human knowledge. Sadly, our successes have also made us arrogant and self-righteous. If we are to take full advantage of our potential, we need to reconnect with our cultural past and find the humility to learn from those who went before us.

Is Our Lifestyle Killing Us?


The biggest health crisis facing the modern world is obesity. According to the World Health Organisation, obesity rates have nearly tripled since 1975. As of 2016, 650 million adults were obese, while 340 million children aged between five and nineteen, and 41 million children under five, were overweight or obese.

And it’s affecting Australia, too. Between 1995 and 2014–15, the proportion of obese Australian adults rose from 18.7% to 27.9%. The Sydney Morning Herald even reported that nearly a third of all adult Australians can now be considered obese. According to the Heart Foundation, approximately 42.7% of adult men and 28.8% of adult women are overweight. More alarmingly, 28.4% of men and 27.4% of women are considered obese.

We are poisoning ourselves and we don’t even know it. Among the health problems caused by obesity are diabetes, heart disease, stroke, high blood pressure, high cholesterol, gall bladder disease, a multitude of cancers, fatty liver, and arthritis.

We are poisoning ourselves in two distinct ways. Firstly, we are eating far too many carbohydrates. Carbohydrate-rich foods like bread and pasta cause blood sugar levels to spike. The insulin released in response drives the body to store the excess as fat and leaves it craving still more carbohydrates.

Whether bread is good or bad for us is up for debate. Lynid Polivnick, the so-called “nude nutritionist”, has defended bread, stating that “it’s much healthier than people make it out to be. It’s often demonised as being a cause of weight gain but in truth, bread does not actually make us gain weight.” And she’s probably right. There is nothing wrong with bread provided it is eaten in moderation. The problem is that many of us don’t eat bread in moderation.

Many health experts do not share Lynid Polivnick’s view. The website Healthy Simple Life claims that bread is mostly devoid of any real nutrients. Bread tends to be ‘fortified’ with vitamins and minerals because its original nutrients have been stripped from it and added back later. These nutritional elements are unlikely to be absorbed by our bodies.

Secondly, we are consuming far too much sugar. This is a relatively new problem. Our ancestors had little access to refined sugars. If they were lucky, they were able to enjoy a small amount of fruit during brief periods of the year. Otherwise, they were relegated to a diet rich in vegetables with a smattering of meat.

By contrast, people in modern, wealthy societies have access to seemingly endless amounts of sugar. Added sugar accounts for seventeen percent of the average American adult’s diet. Sugar is now present in everything from cereal to chocolate bars.

Over-consumption of sugar is a leading cause of obesity and its related illnesses. It has been found to increase the risk of certain types of cancer – namely, oesophageal, pleural, small intestine, and endometrial. And it has been linked to the doubled prevalence of diabetes over the past three decades.

Over-consumption of sugar has also been found to correlate positively with an increased risk of heart disease. A study involving thirty thousand people found that those who derived seventeen to twenty-one percent of their calories from added sugar had a thirty-eight percent greater risk of dying from heart disease than those whose diets comprised only eight percent added sugar.

The modern western man is living in the most prosperous times in history. There is less abject poverty and less starvation today than at any previous period. The downside of this has been an increased proclivity for greed, sloth, and, as a consequence, ever-expanding waistlines. The answer to the obesity crisis is to improve our lifestyles.

ISRAEL FOLAU’S BATTLE WITH RUGBY AUSTRALIA IS A TEST FOR ALL AUSTRALIANS


Where does society end and the rights of the individual begin? That is the true question at the bottom of the Israel Folau controversy. The courts have been given the unenviable task of determining whether an organisation has the right to punish members who don’t share its views, or whether the rights of the individual should be upheld.

Former rugby player Israel Folau and his lawyers are seeking up to A$15 million (including A$5 million for the irreparable damage done to Folau’s reputation) from Rugby Australia. Folau had his contract with Rugby Australia terminated after he was found guilty of a high-level breach (the only kind that can result in termination) of its code of conduct. This high-level breach came from Folau’s decision to post a picture on Instagram stating that hell awaited “drunks, homosexuals, liars, fornicators, thieves, atheists, and idolaters.”

Having failed to reach a settlement with Rugby Australia at a Fair Work hearing, Folau and his lawyers have moved their case on to the Federal Court. Folau himself has merely expressed his desire for Rugby Australia to admit they terminated his contract because of his religious beliefs. In a video, Folau stated: “Hopefully, Rugby Australia will accept that my termination was unlawful and we can reach an agreement about how they can fix that mistake. First and foremost, I am hoping for an apology from Rugby Australia and an acknowledgement that even if they disagree with my views, I should be free to peacefully express my religious beliefs without fear of retribution or punishment.”

According to Rugby Australia, Folau’s contract was terminated on the basis that he had violated its requirement to “treat everyone equally, fairly and with dignity regardless of gender or gender identity, sexual orientation, ethnicity, cultural or religious background, age or disability.”

Of course, what really lies at the centre of the Folau case is not homophobia, but freedom of speech and freedom of religion. It is really a question of whether Israel Folau should be allowed to express his religious views without suffering economic or judicial penalty.

Both the US Supreme Court and the Australian Law Reform Commission have placed special emphasis on freedom of speech. The US Supreme Court has noted that all other rights and freedoms are put in peril when freedom of speech is not protected. Similarly, the Australian Law Reform Commission has stated: “freedom of speech is a fundamental common law right. It has been described as the ‘freedom par excellence: for without it, no other freedom can survive’.”

Likewise, the Australian Magna Carta Institute stated:

“Freedom of speech is an essential aspect of the rule of law and ensures there is accountability in government. People must be free to express their opinions about the content of laws, as well as the decisions of government, or accountability is greatly reduced. Freedom of expression is a broader term which incorporates free speech, the right to assemble, and other important ways of expressing ideas and opinions. The balance the law of Australia strikes between protecting and restricting freedom of expression is very important to understanding the health of the rule of law in Australia.”

It is remarkable to note, however, that freedom of speech is protected by neither the Constitution of Australia nor by Federal Legislation. In fact, there is a wide array of laws and regulations that place legal restrictions on expression. One cannot publish military secrets, incite criminal activity, or defame or libel another person.

Rather, freedom of speech is considered a common-law right inherited from the Westminster system. It is a feature of our political and legal traditions. The Australian High Court has held that there is an implied right to freedom of political expression embedded in the Australian Constitution (it has said nothing, however, about non-political expression). Australia is also a signatory to the International Covenant on Civil and Political Rights, which lists freedom of expression as a fundamental right.

Freedom of religion is a natural extension of freedom of speech, expression, and association. It is derived from the simple fact that the government has no right to dictate what my beliefs should be. The government has no right to force me, a Christian, to accept gay marriage, abortion, or anything else I find incompatible with my beliefs.

Unlike freedom of speech, freedom of religion is a right guaranteed by the Australian Constitution. Section 116 of the Australian Constitution reads:

Commonwealth not to legislate in respect of religion

The Commonwealth shall not make any law for establishing any religion, or for imposing any religious observance, or for prohibiting the free exercise of any religion, and no religious test shall be required as a qualification for any office or public trust under the Commonwealth.

Similarly, freedom of religion is protected by Australian case law. In the case of Church of the New Faith v. Commissioner for Payroll Tax (Vic), Mason ACJ and Brennan J commented: “freedom of religion, the paradigm freedom of conscience, is of the essence of a free society.” Likewise, in the case of Evans v. New South Wales, the Federal Court described religious freedom as an “important freedom generally accepted in society.”

The road to hell is paved with good intentions. A decision that favours Rugby Australia will give large organisations the legal mandate to bully and intimidate those that don’t agree with their views. If Australia’s Federal Court truly believes in freedom, it will uphold Israel Folau’s right to freedom of speech and religion, and rule against Rugby Australia.

On Constitutional Monarchy


I would like to begin this essay by reciting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):

 

     Milton! thou shouldst be living at this hour:
            England hath need of thee: she is a fen
            Of stagnant waters: altar, sword, and pen,
            Fireside, the heroic wealth of hall and bower,
            Have forfeited their ancient English dower
            Of inward happiness. We are selfish men;
            Oh! raise us up, return to us again;
            And give us manners, virtue, freedom, power.
            Thy soul was like a star, and dwelt apart:
            Thou hadst a voice whose sound was like the sea:
            Pure as the naked heavens, majestic, free,
            So didst thou travel on life’s common way,
            In cheerful godliness; and yet thy heart
            The lowliest duties on herself did lay.

 

The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the poet and civil servant immortalised for all time as the author of Paradise Lost.

Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s time, or than they do today. It is these themes of honour, duty, and nobility that provide the spiritual basis for constitutional monarchy.

It is a subject I will return to much later in this essay. To begin in earnest, however, it would be more prudent to examine those aspects of English history that allowed both constitutional monarchy and English liberty to be born.

The English monarchy has existed for over eleven-hundred years. Stretching from King Alfred the Great in the 9th century to Elizabeth II in the 21st, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. The English have gifted these ideals to much of the Western world through the exportation of their culture to their former colonies.

It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist, Peter Hitchens (1951 – ), noted in his book, The Abolition of Britain, that abhorrence for one’s own country was a unique trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”

However, these intellectuals fail to notice, in their arrogance, that “God Save the King” is actually a celebration of constitutional monarchy and not symbolic reverence for an archaic and rather powerless royal family. It is intended to celebrate the nation as embodied in the form of a single person or family and the fact that the common man and woman can live in freedom because there are constitutional restraints placed on the monarch’s power.

If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.

At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding their jurisdiction over all forms of authority: political, social, economic, and so forth.

To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, articulated by the Catholic Bishop, Jacques Bossuet (1627 – 1704), in his book, Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey their monarch without question. As such, no limitations could be put on a monarch’s power.

What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of parliament and involving himself in cases being tried before the court.

When King James died in 1625, he was succeeded by his second-eldest son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between himself and Parliament, from which he had to obtain approval to raise funds.

In 1629, frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money, Charles stopped calling Parliament altogether and elected to raise funds for his rule using outdated taxes and fines. The move outraged much of the population, as well as many nobles. The ensuing period would become known as the Eleven Years’ Tyranny.

By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. They asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without the consent of its members. In addition, the Parliament ordered the king to execute his friend and advisor, Thomas Wentworth (1593 – 1641), the 1st Earl of Strafford, for treason.

The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman, Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a descendant of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (1485 – 1550). In the end, the conflict would decimate the English population and forever alter England’s political character.

The English Civil War began in January 1642 when King Charles marched on Parliament with a force of four hundred men. He withdrew to Oxford after being denied entry. Trouble was brewing. Throughout the summer, people aligned themselves with either the monarchists or the Parliamentarians.

The forces of King Charles and the forces of Parliament would meet at the Battle of Edgehill in October. What followed was several years of bitter and bloody conflict.

Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic or “commonwealth.” The English Civil War had claimed the lives of two-hundred-thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of parliament.

The powers of parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular. The Commonwealth created in the wake of the Civil War had collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.

The exiled Prince Charles returned from France and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the ideals of the Enlightenment, which challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.

By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety percent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.

It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics were barred from becoming members of Parliament, justices of the peace, or officers in the army, or from holding any other public position, unless they were granted a special dispensation by the King.

It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial considering Great Britain was a Protestant country) and ignoring Parliament.

However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic, Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.

The English people became alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certainly guaranteed to succeed him on the throne. So incensed was Parliament at the prospect of a Catholic on the throne that they attempted to pass the Crown onto one of Charles’ Anglican relatives.

Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earnt the moniker “bloody Mary” for burning Protestants at the stake. During the reign of King James, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous “gunpowder plot.”

Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about the end of religious persecution, were met with considerable objection from both Bishops and Europe’s Protestant monarchs.

Following the passage of the Test Act in 1673, James had been forced to relinquish his office of Lord High Admiral. The Act required officers and members of the nobility to take the Holy Communion as spelt out by the Church of England. It was designed to prevent Catholics from taking public office.

Now, as King, James was attempting to repeal the Test Act by placing Catholics in positions of power. His Court featured many Catholics and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.

Nor was that the limit of James’ activities. He issued two Declarations of Indulgence, ordered them to be read from every pulpit for two Sundays, and put those who opposed them on trial for seditious libel. Additionally, he imprisoned seven Bishops for opposing him, made sweeping changes to the Church of England, and built an army comprised mainly of Catholics.

The people permitted James II to rule as long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot. The child, the theory went, was not the King’s son but rather a substitute who had been smuggled into the birthing chamber in a bed-warming pan.

In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.

James dissolved Parliament in July 1688. Around this time, a Bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702) and invited them to raise an army, invade London, and seize the throne. They accepted.

William landed in Devon on Guy Fawkes’ Day accompanied by an army of fifteen-thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.

Things were not looking good for James, however. Large parts of his officer corps were defecting to the enemy and taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates started declaring for William. And his own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels in Yorkshire. James, abandoned by everyone, fled into exile in France. He would die there twelve years later.

On January 22nd, 1689, William called the first ‘convention parliament.’ At this ‘convention’, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. And second, it was declared against public policy for the throne to be occupied by a Catholic. As such, James Francis Edward Stuart was passed over, and William and Mary were invited to take the Crown as co-monarchs.

They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.

Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.

The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, elected monarchs. It established the concept of Parliamentary sovereignty granting that political body the power to make or unmake any law it chose to. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.

The British philosopher, Roger Scruton (1944 – ) described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.

Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).

In The Need for Roots, the French philosopher, Simone Weil (1909 – 1943), wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed Britain to retain its tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs when they abolished their monarchies.


Great Britain’s great legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.

The word “Commonwealth” was adopted at the Sixth Imperial Conference held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by former British Prime Minister, Arthur Balfour (1848 – 1930), was designed to look into future constitutional arrangements within the Commonwealth.

The committee delivered its findings in the Balfour Report. It stated:

“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”

It continued:

“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”

Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.

The Statute of Westminster gave legal recognition to the de-facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand and South Africa were granted legal independence.

Furthermore, the law released the Dominions from the Colonial Laws Validity Act 1865, a law which had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to the act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”

The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.

More than forty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent from Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.

What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:

“(1) Her Majesty’s representative in each State shall be the Governor.

(2) Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.

(3) Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.

(4) While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.

(5) The advice to Her Majesty in relation to the exercise of the powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”

These two laws dispel an important misconception that is often exploited by Australian Republicans: the myth that Australia does not have legal and political independence because its Head of State is the British monarch. The passage of the Statute of Westminster in 1931 and the Australia Act in 1986 effectively ended any real political or legal power the British government had over Australia.

In Australia, the monarch (who is our head of state by law) is represented by a Governor General. This individual – who has been an Australian since 1965 – is required to take an oath of allegiance and an oath of office that is administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure with appointments typically lasting five years.

The monarch issues letters patent to appoint the Governor-General based on the advice of Australian ministers. Prior to the Imperial Conferences of 1926 and 1930, Governors-General were appointed on the advice of both the British and Australian governments, since the Governor-General at that time represented both the monarch and the British government. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown alone.

The Governor General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, such an arrangement would have the effect of politicising and thereby corrupting the Head of State. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to those donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that they can avoid the snares that trap other political actors.


The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003), Deputy Prime Minister, Treasurer, and intellectual leader of the political left, and Rex Connor (1907 – 1977) out of the cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 24th federal parliament was dissolved.

The Loans Affair began when Rex Connor attempted to borrow money, up to US$4b, to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state elements which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek a loan without the council’s approval.

When news of the Loans Affair was leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and claiming that the Loans Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.

Despite this, Connor continued to liaise with the Pakistani financial broker, Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by Herald journalist, Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan with an interest rate of 7.7% and a 2.5% commission for Khemlani. The claim threw serious doubt on Dr. Jim Cairns’ claim that the government had not offered Khemlani a commission on a loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.

Unfortunately, Khemlani had stalled on the loan, most notably when he had been asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loans activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.

The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of the Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.

And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him as proof that he and Connor had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no other choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).

By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam budget on October 16th. Whitlam warned Fraser that the Liberal party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medibank rebates, the budgets of government departments, and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”

On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, Treasurer Bill Hayden (1933 – ) reintroduced the budget bills and warned that further delay would increase unemployment and deepen a recession that had blighted the western world since 1973.

The crisis would come to a head on Remembrance Day 1975. Whitlam had asserted for weeks that the Senate could not force him into an election by claiming that the House of Representatives had an independence and an authority separate from the Senate.

Whitlam had decided that he would end the stalemate by seeking a half-senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991) had been seeking legal advice from the Chief Justice of the High Court on how he could use his Constitutional Powers to end the deadlock. Kerr had come to the conclusion that should Whitlam refuse to call a general election, he would have no other alternative but to dismiss him.

And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.

Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills and Fraser was drafting documents that would dissolve the Parliament.

At 2pm, Deputy Prime Minister, Frank Crean (1916 – 2008) defended the government against a censure motion started by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.

At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he asked that the “House expresses its want of confidence in the Prime Minister and requests, Mr. Speaker, forthwith to advise His Excellency, the Governor-General, to call the member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.

The speaker, Gordon Scholes (1931 – 2018) expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam was not supposed to win. Scholes was unable to arrange an appointment until quarter-to-five in the afternoon.

Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. By ten-to-four, Fraser left for Government House. Ten minutes later, Sir John Kerr had signed the proclamation dissolving both Houses of Parliament and set the date for the upcoming election: December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.

The combination of economic strife, political scandal, and Whitlam’s dismissal signed the death warrant for Whitlam’s government. At the 1975 Federal Election, the Liberal-National coalition won by a landslide, winning ninety-one seats and obtaining a popular vote of 4,102,078. In the final analysis, it seems that the Australian people had agreed with Kerr’s decision and had voted to remove Whitlam’s failed government from power once and for all.


Most of the arguments levelled against constitutional monarchies can be described as petty, childish, and ignorant. The biggest faux pas those who oppose constitutional monarchies make is a failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to disagree with constitutional monarchy. It would be as if I decided to argue for the abolition of the office of Prime Minister just because I didn’t like the person who held that office.

One accusation frequently levelled against the monarchy is that it is an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year, and it is certainly true that the German Presidency costs only £26 million every year. But it is not true that all monarchies are necessarily more expensive than Presidencies. The Spanish monarchy costs only £8 million per year, less than the Presidencies of Germany, Finland, and Portugal.

Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”

History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.

The Referendum was held on November 6th, 1999. Some 11,683,811 people, or 99.14% of the Australian voting public, participated. 45.13% (5,273,024 people) voted yes, while 54.87% (6,410,787 people) voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.

All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.

Margit Tavits of Washington University in St. Louis once observed that “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.

Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but unlike a President, that is generally where their power ends. Rather ironically, the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted that elected Presidents are more willing than monarchs to allow governments to change without democratic input, such as an election. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.

The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. Monarchy is an effective means of bridging the past and the future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and may possibly survive to see the coronation of King George VII (though he will certainly outlive me).

It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things also have a positive effect on the economy. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies had a positive impact on economies and living standards over the long term. The study, which examined data from one-hundred-and-thirty-seven countries, including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government was not going to interfere with their property in constitutional monarchies than in republics. As a consequence, they were more willing to invest in their respective economies.

When Wordsworth wrote his ode to Milton, he was mourning the loss of the chivalry he felt had once pervaded English society. Today, the West is once again in serious danger of losing the two things that connect it to the chivalry of the past: a belief in God and a submission to a higher authority.

Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.

It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.

The Christian apologist and novelist C.S. Lewis (1898 – 1963) once argued that most people no more deserve a share in governing a hen-roost than they do in governing a nation:

“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”

The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384 BC – 322 BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because no one person is perfect enough to be a master.

Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth that inevitably lead to tyranny). Legal and economic equality, however, do not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.

Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.

Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.

The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”

Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.

It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy. They meant that the monarch could not rule without the consent of parliament, established parliament as the supreme source of law, and gave parliament the power to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchies enrich our souls because they awaken in us the need for both freedom and obedience.

Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.

I’m Done with Modern Movies

bvstrio.0

For the life of me, I cannot remember the last time I saw a contemporary movie that was memorable in any way. Despite having access to both television and Netflix, I have found it virtually impossible to find a movie that I actually thought was worth watching.

It would be wrong, however, to lay the entirety of the blame on either mainstream television or Netflix. (Although it is entirely fair to argue that the litany of rubbish offered by television is a symptom of a dying medium). Rather, it is indicative of a problem that has pervaded the entire filmmaking industry. Modern filmmakers appear to be content with making defective movies: movies that feature predictable stories, two-dimensional characters, and an over-reliance on visual effects.

This was not always the case. For years Hollywood was known for producing great, culture-defining films. The classical period of American cinema (which lasted from the 1930s to the 1960s) produced films like Gone with the Wind, Casablanca, and Ben-Hur, among many, many others.

Similarly, the 1960s and 1970s saw a renaissance in film as filmmakers like Martin Scorsese, Stanley Kubrick, Steven Spielberg, Francis Ford Coppola, and many others reinvented and reinvigorated the motion picture. This was the era that produced films like The Godfather, The French Connection, and The Good, the Bad and the Ugly.

Hollywood’s total lack of artistic brilliance has been caused by three problems: the lack of originality, the lack of artistic merit, and the saturation of progressive politics in the industry.

Modern Movies Lack Originality

ben-hur-rpt-in

The most conspicuous problem afflicting Hollywood today is a total lack of originality. Neither their stories nor their characters appear to have any originality or depth whatsoever. Most films today are remakes, reboots, sequels, comic-book adaptations, or superhero movies. There is nothing wrong with these films in and of themselves, but when every single movie made is one of these five things, it starts to get a little tiresome.

The problem doesn’t stop at just narrative, either. Modern film characters are often two-dimensional and, as a result, rather dull. They are mouthpieces for certain ideological beliefs and are therefore often presented in entirely black or white terms. The problem with this, of course, is that people in real life are usually complicated. They make mistakes, hold contradictory views, and often behave in irrational ways. One would never see an obvious racist like Ethan Edwards (John Wayne) in The Searchers or Jett Rink (James Dean) in Giant in a film made today. These characters, though they reflect real life, are just too politically incorrect, too human, to be presented in any real or sympathetic manner.

A lot of this comes from the travesty that was Star Wars and the litany of ‘blockbuster’ movies it left in its wake. Taken on its own merits, Star Wars is an excellent movie. However, it convinced Hollywood’s film producers to devote more time and money to shallow, unsophisticated movies than to movies of genuine depth and meaning.

Big blockbuster movies are all well and good, but I am an adult and I would like to see movies with a certain level of maturity.

Modern Movies Lack Artistic Merit

Lawrence Of Arabia - 1962

The next glaring problem (though it is one that many people without a knowledge of film or film history would fail to notice) is the total lack of artistic merit in modern filmmaking. The films of the past often prided themselves on their creative and technical brilliance. Modern filmmakers, by contrast, seem more than happy to rest on their laurels and make easy, clichéd movies.

With the possible exception of Martin Scorsese’s The Aviator, I cannot remember the last time I saw a movie that made me marvel at its cinematography or that had a score which stirred my spirit. I can, however, remember classic movies that managed to do all those things and more. I can remember marvelling at the cinematography in Lawrence of Arabia and sitting in awe of the chariot race – which used real stuntmen – in Ben-Hur.

Modern filmmakers seem content with spending all their time and money on hey-wow visual effects and completely neglect the most important elements of film: story and character. As a consequence, they cheat their audience by offering sub-par films.

Modern filmmakers rely on visual effects because it is easier than trying to create compelling storylines and memorable characters. They choose to rely on computer-generated-imagery and blue screen because it is easier and safer (cowards) than using real stuntmen and practical effects.

The problem with all this is that the audience knows it’s being cheated. The car chase in Bullitt looked so realistic because, well, it was realistic. It used real cars driven by real people on real streets. A lot of modern movies, by contrast, look fake because, well, they are fake.

Modern Movies are Left-Wing Propaganda

1482171796

The third problem, and the most egregious, is that Hollywood has become a propaganda outlet for progressive politics. It produces films so ideologically driven that one can virtually predict everything that is going to happen before it occurs. And, much like people who have been ideologically possessed, these films tend to be so boring they’re not worth wasting your time on.

The fact that Hollywood has become infected with ideologically possessed, far-left individuals is, to some extent, understandable. Filmmaking is an enterprise that attracts highly creative people who, for the most part, tend to be on the political left. The problem, rather, lies in the fact that all the films Hollywood now produces carry a left-wing bias.

Hollywood has become an echo chamber in which “woke” views are communicated and no other views are allowed in. Those associated with the movies compete at ceremonies like the Academy Awards to see who can be the most virtuous. And they criticise and demean anyone who doesn’t agree with them. They are like Marie Antoinette saying “let them eat cake” as the peasants starve in the streets. They are completely out of touch.

The problem with the films being produced today is that their left-wing bias has made them completely shallow and totally predictable.