Where does society end and the rights of the individual begin? That is the true question that lies at the bottom of the Israel Folau controversy. The courts have been given the unenviable task of determining whether an organisation has the right to punish those members who don’t share its views, or whether the rights of the individual should be upheld.
Former rugby player Israel Folau and his lawyers are seeking up to AU$15 million (including AU$5 million for the irreparable damage done to Folau’s reputation) from Rugby Australia. Folau had his contract with Rugby Australia terminated after he was found guilty of a high-level breach (the only kind that can result in termination) of its code of conduct. The breach arose from Folau’s decision to post a picture on Instagram stating that hell awaited “drunks, homosexuals, liars, fornicators, thieves, atheists, and idolaters.”
Having failed to reach a settlement with Rugby Australia at a Fair Work hearing, Folau and his lawyers have moved their case on to the Federal Court. Folau himself has merely expressed his desire for Rugby Australia to admit they terminated his contract because of his religious beliefs. In a video, Folau stated: “Hopefully, Rugby Australia will accept that my termination was unlawful and we can reach an agreement about how they can fix that mistake. First and foremost, I am hoping for an apology from Rugby Australia and an acknowledgement that even if they disagree with my views, I should be free to peacefully express my religious beliefs without fear of retribution or punishment.”
According to Rugby Australia, Folau’s contract was terminated on the basis that he had violated its requirement to “treat everyone equally, fairly and with dignity regardless of gender or gender identity, sexual orientation, ethnicity, cultural or religious background, age or disability.”
Of course, what really lies at the centre of the Folau case is not homophobia, but freedom of speech and freedom of religion. It is really a question of whether Israel Folau should be allowed to express his religious views without suffering economic or judicial penalty.
Both the US Supreme Court and the Australian Law Reform Commission have placed a special emphasis on freedom of speech. The US Supreme Court has noted that all other rights and freedoms are put in peril when freedom of speech is not protected. Similarly, the Australian Law Reform Commission has stated: “freedom of speech is a fundamental common law right. It has been described as the ‘freedom par excellence: for without it, no other freedom can survive.’”
Likewise, the Australian Magna Carta Institute stated:
“Freedom of speech is an essential aspect of the rule of law and ensures there is accountability in government. People must be free to express their opinions about the content of laws, as well as the decisions of government; otherwise accountability is greatly reduced. Freedom of expression is a broader term which incorporates free speech, the right to assemble, and other important ways of expressing ideas and opinions. The balance the law of Australia strikes between protecting and restricting freedom of expression is very important to understanding the health of the rule of law in Australia.”
It is remarkable to note, however, that freedom of speech is protected by neither the Constitution of Australia nor by Federal Legislation. In fact, there is a wide array of laws and regulations that place legal restrictions on expression. One cannot publish military secrets, incite criminal activity, or defame or libel another person.
Rather, freedom of speech is considered a common-law right adopted from the Westminster system. It is a feature of our political and legal traditions. The Australian High Court has held that there is an implied freedom of political communication embedded in the Australian Constitution (the Court has said nothing, however, about non-political expression). Australia is also a signatory to the International Covenant on Civil and Political Rights, which lists freedom of expression as a fundamental right.
Freedom of religion is a natural extension of freedom of speech, expression, and association. It is derived from the simple fact that the government has no right to dictate what my beliefs should be. The government has no right to force me, a Christian, to accept gay marriage, abortion, or anything else I find incompatible with my beliefs.
Unlike freedom of speech, freedom of religion is a right guaranteed by the Australian Constitution. Section 116 of the Australian Constitution reads:
Commonwealth not to legislate in respect of religion
The Commonwealth shall not make any law for establishing any religion, or for imposing any religious observance, or for prohibiting the free exercise of any religion, and no religious test shall be required as a qualification for any office or public trust under the Commonwealth.
Freedom of religion is likewise protected by Australian case law. In Church of the New Faith v. Commissioner for Payroll Tax (Vic), Mason ACJ and Brennan J commented: “freedom of religion, the paradigm freedom of conscience, is the essence of a free society.” Similarly, in Evans v. New South Wales, the Federal Court described religious freedom as an “important freedom generally accepted in society.”
The road to hell is paved with good intentions. A decision that favours Rugby Australia will give large organisations the legal mandate to bully and intimidate those that don’t agree with their views. If Australia’s Federal Court truly believes in freedom, it will uphold Israel Folau’s right to freedom of speech and religion, and rule against Rugby Australia.
I would like to begin this essay by reciting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):
Milton! thou shouldst be living at this hour:
England hath need of thee: she is a fen
Of stagnant waters: altar, sword, and pen,
Fireside, the heroic wealth of hall and bower,
Have forfeited their ancient English dower
Of inward happiness. We are selfish men;
Oh! raise us up, return to us again;
And give us manners, virtue, freedom, power.
Thy soul was like a star, and dwelt apart:
Thou hadst a voice whose sound was like the sea:
Pure as the naked heavens, majestic, free
So didst thou travel on life’s common way,
In cheerful godliness; and yet thy heart
The lowliest duties on herself did lay.
The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the poet and civil servant immortalised as the author of Paradise Lost.
Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s time, or do even today. It is these themes of honour, duty, and nobility that will provide the spiritual basis for constitutional monarchy.
It is a subject I will return to much later in this essay. To begin in earnest, however, it would be more prudent to examine those aspects of English history that allowed both constitutional monarchy and English liberty to be born.
The English monarchy has existed for over eleven-hundred years. Stretching from King Alfred the Great in the 9th century to Elizabeth II in the 21st, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. The English have gifted these ideals to most of the Western World through the importation of their culture to most of the former colonies.
It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist Peter Hitchens (1951 – ) noted in his book, The Abolition of Britain, that abhorrence for one’s own country is a unique trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”
However, these intellectuals fail to notice, in their arrogance, that “God save the King” is actually a celebration of constitutional monarchy and not symbolic reverence to an archaic and rather powerless royal family. It is intended to celebrate the nation as embodied in the form of a single person or family and the fact that the common man and woman can live in freedom because there are constitutional restraints placed on the monarch’s power.
If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.
At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding jurisdiction over all forms of authority: political, social, economic, and so forth.
To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, formulated by the Catholic Bishop Jacques Bossuet (1627 – 1704) in his book, Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey the monarch without question. As such, no limitations could be placed on a monarch’s power.
What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God, since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of parliament and involving himself in cases being tried before the courts.
When King James died in 1625, he was succeeded by his second-eldest son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between the monarch and parliament, from whom he had to seek approval to raise funds.
In 1629, Charles outraged much of the population, as well as many nobles, when he elected to raise funds for his rule using outdated taxes and fines, and stopped calling parliament altogether. Charles had been frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money. The ensuing period would become known as the Eleven Years’ Tyranny.
By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. It asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without the consent of its members. In addition, Parliament forced the king to consent to the execution of his friend and advisor, Thomas Wentworth (1593 – 1641), the 1st Earl of Strafford, for treason.
The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a descendant of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (c. 1485 – 1540). In the end, the conflict would decimate the English population and forever alter England’s political character.
The English Civil War began in January 1642 when King Charles marched on Parliament with a force of four hundred men. He withdrew to Oxford after being denied entry. Trouble was brewing. Throughout the summer, people aligned themselves with either the monarchists or the Parliamentarians.
The forces of King Charles and the forces of Parliament would meet at the Battle of Edgehill in October. What would follow is several years of bitter and bloody conflict.
Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic or “commonwealth.” The English Civil War had claimed the lives of two-hundred-thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of parliament.
The powers of parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular. The Commonwealth created in the wake of the Civil War had collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.
The exiled Prince Charles returned from France and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the thinkers of the Enlightenment, who challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.
By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety percent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.
It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics could not become members of Parliament, justices of the peace, or officers in the army, or hold any other public office, unless they were granted a special dispensation by the King.
It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial considering Great Britain was a Protestant country) and for ignoring Parliament.
However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.
The English people became alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certain to succeed him on the throne. So incensed was Parliament at the prospect of a Catholic on the throne that it attempted to pass the Crown on to one of Charles’ Anglican relatives.
Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance ever since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earned the moniker “Bloody Mary” for burning Protestants at the stake. During the reign of King James, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous “gunpowder plot.”
Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about the end of religious persecution, were met with considerable objection from both Bishops and Europe’s protestant monarchs.
Following the passing of the Test Act in 1673, James had briefly been forced to relinquish his office as Lord High Admiral. The Act required officers and members of the nobility to take Holy Communion as spelt out by the Church of England. It was designed to prevent Catholics from taking public office.
Now, as King, James was attempting to repeal the Test Act by placing Catholics in positions of power. His Court featured many Catholics and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.
Nor was that the limit of James’ activities. He ordered two Declarations of Indulgence to be read from every pulpit on two successive Sundays, and put those who opposed them on trial for seditious libel. Additionally, he imprisoned seven Bishops for opposing him, made sweeping changes to the Church of England, and built an army composed mainly of Catholics.
The people permitted James II to rule so long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot. The child, the theory went, was not the King’s son but rather a substitute who had been smuggled into the birthing chamber in a bed-warming pan.
In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.
James dissolved Parliament in July 1688. Around this time, a Bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702) and invited them to raise an army, invade London, and seize the throne. They accepted.
William landed in Dorset on Guy Fawkes’ day accompanied by an army of fifteen-thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.
Things were not looking good for James, however. Large parts of his officer corps were defecting to the enemy and taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates started declaring for William, and James’ own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels in Yorkshire. James, abandoned by everyone, fled into exile in France. He would die there twelve years later.
On January 22nd, 1689, William called the first ‘convention parliament.’ At this ‘convention’, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. And second, it was declared contrary to public policy for the throne to be occupied by a Catholic. As such, the throne passed over James Francis Edward Stuart, and William and Mary were invited to take the Crown as co-monarchs.
They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.
Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.
The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, elected monarchs. It established the concept of Parliamentary sovereignty granting that political body the power to make or unmake any law it chose to. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.
The British philosopher Roger Scruton (1944 – ) described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.
Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).
In The Need for Roots, the French philosopher Simone Weil (1909 – 1943) wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed the British to retain their tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs when they abolished their monarchies.
Great Britain’s great legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.
The word “Commonwealth” was adopted at the Sixth Imperial Conference, held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by former British Prime Minister Arthur Balfour (1848 – 1930), was tasked with looking into future constitutional arrangements within the Commonwealth. Its report, known as the Balfour Declaration, described the Empire’s self-governing communities thus:
“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”
“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”
Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.
The Statute of Westminster gave legal recognition to the de-facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand and South Africa were granted legal independence.
Furthermore, the law ended the application of the Colonial Laws Validity Act 1865 to the Dominions. That Act had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to the Act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”
The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.
More than forty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent from Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.
What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:
“Her Majesty’s Representative in each State shall be the Governor.
Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.
Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.
While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.
The advice of Her Majesty in relation to the exercise of powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”
These two laws expose an important misconception that is often exploited by Australian Republicans: the myth that Australia does not have legal and political independence because its Head of State is the British monarch. In truth, the passage of the Statute of Westminster in 1931 and the Australia Act in 1986 ended any real political or legal power the British government had over Australia.
In Australia, the monarch (who is our head of state by law) is represented by a Governor-General. This individual – who has been an Australian since 1965 – is required to take an oath of allegiance and an oath of office administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure, with appointments typically lasting five years.
The monarch issues letters patent appointing the Governor-General on the advice of Australian ministers. Prior to 1924, Governors-General were appointed on the advice of both the British government and the Australian government, because the Governor-General at that time represented both the monarch and the British government. This arrangement changed, however, at the Imperial Conferences of 1926 and 1930. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown only.
The Governor General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, such an arrangement would have the effect of politicising and thereby corrupting the Head of State. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to those donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that they can avoid the snares that trap other political actors.
The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003), Deputy Prime Minister, Treasurer, and intellectual leader of the political left, and Rex Connor (1907 – 1977) out of the cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 24th federal parliament was dissolved.
The Loans Affair began when Rex Connor attempted to borrow money, up to US$4 billion, to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state elements which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek a loan without the council’s approval.
When news of the Loans Affair was leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and claiming that the Loans Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.
Despite this, Connor continued to liaise with the Pakistani financial broker Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by Herald journalist Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan at an interest rate of 7.7%, with a 2.5% commission for Khemlani. The claim threw serious doubt on Dr. Jim Cairns’ claim that the government had not offered Khemlani a commission on a loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.
Unfortunately, Khemlani had stalled on the loan, most notably when he had been asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loan activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.
The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of the Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.
And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him as proof that he and Connor had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no other choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).
By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam budget on October 16th. Whitlam warned Fraser that the Liberal party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medibank rebates, the budgets of government departments and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”
On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, the Treasurer, Bill Hayden (1933 – ), reintroduced the budget bills and warned that further delay would increase unemployment and deepen a recession that had blighted the western world since 1973.
The crisis would come to a head on Remembrance Day 1975. Whitlam had asserted for weeks that the Senate could not force him into an election, claiming that the House of Representatives had an independence and an authority separate from the Senate.
Whitlam had decided that he would end the stalemate by seeking a half-senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991), had been seeking legal advice from the Chief Justice of the High Court on how he could use his constitutional powers to end the deadlock. Kerr had come to the conclusion that should Whitlam refuse to call a general election, he would have no alternative but to dismiss him.
And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.
Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills and Fraser was drafting documents that would dissolve the Parliament.
At 2pm, the Deputy Prime Minister, Frank Crean (1916 – 2008), defended the government against a censure motion moved by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.
At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he asked that the “House expresses its want of confidence in the Prime Minister and requests, Mr. Speaker, forthwith to advise His Excellency, the Governor-General, to call on the member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.
The speaker, Gordon Scholes (1931 – 2018) expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam was not supposed to win. Scholes was unable to arrange an appointment until quarter-to-five in the afternoon.
Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. At ten-to-four, Fraser left for Government House. Ten minutes later, Sir John Kerr had signed the proclamation dissolving both Houses of Parliament and set the date for the upcoming election for December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.
The combination of economic strife, political scandal, and Whitlam’s dismissal signed the death warrant for Whitlam’s government. At the 1975 Federal Election, the Liberal-National coalition won by a landslide, gaining a majority of ninety-one seats and obtaining a popular vote of 4,102,078. In the final analysis, it seems that the Australian people had agreed with Kerr’s decision and had voted to remove Whitlam’s failed government from power once and for all.
Most of the arguments levelled against constitutional monarchies can be described as petty, childish, and ignorant. The biggest faux pas those who oppose constitutional monarchies make is a failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to disagree with constitutional monarchy. It would be as if I decided to argue for the abolition of the office of Prime Minister just because I didn’t like the person who held that office.
One accusation frequently levelled against the monarchy is that they are an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year. And it is certainly true that the German Presidency costs only £26 million every year. However, it is not true that all monarchies are necessarily more expensive than Presidencies. The Spanish monarchy costs only £8 million per year, less than the Presidencies of Germany, Finland, and Portugal.
Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”
History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.
The Referendum was held on November 6th, 1999. Around 99.14% of the Australian voting public (11,683,811 people) participated. Of these, 45.13% (5,273,024) voted yes, while 54.87% (6,410,787) voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.
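As a quick arithmetic check on the referendum figures above (the raw vote counts are taken directly from the text; the variable names are my own), the yes and no totals do sum exactly to the participation figure, and the quoted percentages follow from them:

```python
# Sanity-check of the 1999 republic referendum figures quoted above.
yes_votes = 5_273_024
no_votes = 6_410_787

total = yes_votes + no_votes
assert total == 11_683_811  # matches the quoted participation figure

yes_pct = round(100 * yes_votes / total, 2)
no_pct = round(100 * no_votes / total, 2)
print(yes_pct, no_pct)  # 45.13 54.87
```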
All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.
Margit Tavits of Washington University in St. Louis once observed that: “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.
Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but unlike a President, that is generally where their power ends. It is rather ironic that, as the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted, presidents are more likely than monarchs to allow governments to change without democratic input such as elections. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.
The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. Monarchy is an effective means of bridging the past and the future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and may possibly survive to see the coronation of King George VII (though he will certainly outlive me).
It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things also have a positive effect on the economy. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies had a positive impact on economies and living standards over the long term. The study, which examined data from one-hundred-and-thirty-seven countries including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government was not going to interfere with their property in constitutional monarchies than in republics. As a consequence, they were more willing to invest in their respective economies.
When Wordsworth wrote his ode to Milton, he was mourning the loss of chivalry he felt had pervaded English society. Today, the West is once again in serious danger of losing the two things that connect it to the chivalry of the past: a belief in God and submission to a higher authority.
Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.
It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.
The Christian apologist and novelist, C.S. Lewis (1898 – 1964) once argued that most people no more deserved a share in governing a hen-roost than they do in governing a nation:
“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”
The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384BC – 322BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because there is no one person who is perfect enough to be a master.
Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth that inevitably lead to tyranny). Legal and economic equality, however, does not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.
Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.
Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.
The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”
Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.
It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy. They meant that the monarch could not rule without the consent of parliament, established parliament as the supreme source of law, and allowed parliament to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchies enrich our souls because they awaken in us the need for both freedom and obedience.
Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.
There is an old adage which states that you do not know how big a tree is until you try and cut it down. Today, as cultural forces slowly destroy it, we are beginning to understand that the same thing can be said about personal responsibility.
Society no longer believes that people ought to bear their suffering with dignity and grace. Rather, it now believes that the problems of the individual ought to be made the problems of the community. Individual problems are no longer the consequence of individual decisions, but come as the result of race, gender, class, and so forth.
The result of this move towards collective responsibility has been the invention of victim culture. According to this culture, non-whites are the victims of racism and white privilege, women are the victims of the patriarchy, and homosexuals are the victims of a heteronormative society.
The 20th century is a perfect example of what happens when responsibility is taken from the hands of the individual and placed in the hands of the mob. The twin evils of communism and Nazism – which blamed the problems of the individual on economic and racial factors, respectively – led to the deaths of tens of millions of people.
Furthermore, such ideologies led otherwise decent individuals to commit acts of unspeakable violence. Whilst observing the trial of Adolf Eichmann, a former SS officer who had been one of the architects of the Holocaust, the writer Hannah Arendt was struck by the “banality of evil” that had characterised German war atrocities. Arendt noted that the men who conspired to commit genocide were not raving lunatics foaming at the mouth, but rather dull individuals inspired to commit evil by a sense of duty to a toxic and corrupt ideology.
The Bolsheviks taught the Russian people that their misfortune had been caused by the wealthy, whose riches had been gained through theft and exploitation. Likewise, the Nazis convinced the German people that their problems could be blamed on the Jews. It is not difficult to see how this philosophy led, step by step, to the gulags and the concentration camps.
The same thing is happening today. The only difference is that those who peddle this philosophy have become more sophisticated. Today, people are encouraged to identify with identity groups ranked by so-called social privilege. They are then taught to despise those with more social privilege than themselves.
Under this philosophy, crime is caused not by the actions of the individual, but by social forces like poverty, racism, and upbringing. Advocates claim that women should not be forced to take responsibility for their sexual behaviour, and so should be allowed to essentially murder their unborn children. Sexually transmitted diseases like HIV are blamed on homophobia rather than on immoral and socially irresponsible behaviour. And alcoholism and drug addiction are treated as diseases rather than behaviours the addict is supposed to take responsibility for. The list is endless.
Personal responsibility helps us take control of our lives. It means that the individual can take a certain amount of control over his own life even when the obstacles he is facing seem insurmountable.
No one, least of all me, is going to argue that individuals don’t face hardships that are not their fault. What I am going to argue, however, is that other people will respect you more if you take responsibility for your problems, especially if those problems are not your fault. Charity for AIDS sufferers, the impoverished, or reformed criminals is all perfectly acceptable. But we only make their plight worse by taking their personal responsibility from them.
Responsibility justifies a person’s life and helps them find meaning in their suffering. Central to the Christian faith is the idea that individuals are duty bound to bear their suffering with dignity and grace and to struggle towards being a good person. To force a man to take responsibility for himself is to treat him as one of God’s creations.
You cannot be free if other people have to take responsibility for your decisions. When you take responsibility from the hands of the individual you tarnish his soul and steal his freedom.
Freedom from responsibility is slavery, not freedom. Freedom is the ability to make decisions according to the dictates of one’s own conscience and live with the consequences of those decisions. Freedom means having the choice to engage in the kind of immoral behaviour that leads to an unwanted pregnancy or AIDS. What it does not do is absolve you of responsibility for those actions. Slavery disguised as kindness and compassion is still slavery.
Sometimes a civilisation can become so sophisticated that it believes it can overcome truth. We have become one of those civilisations. As a consequence of our arrogance, we have come to believe that we can circumvent some of the most fundamental truths about reality. We blame inequality on the social structure even though most social animals live in hierarchies. We believe that primitive people are noble even though mankind in its primitive state is more violent than at any other stage. And we believe that we can change the way human beings eat despite the fact that it is making us unhappy.
It is our modern obsession over diet and exercise that I would like to focus on. This obsession has arisen from a society that is too safe, too free, and too prosperous for its own good. This is not to say that safety, freedom, and prosperity are bad things. Indeed, we should get down on our knees and thank God every day that we live in a country that has these things. However, it is also true that too much safety, freedom, and prosperity breeds passivity and complacency. The hardships our ancestors faced – war, poverty, disease – are no longer problems for us. Therefore, we lack the meaning that these hardships bring to our life. As a result, we have come to invent problems. Among these has been a tendency to render the consumption of certain food as something unhealthy, unethical, or both.
Our modern obsession with food is causing significant personal problems. On the one hand, the ease with which food, especially food laden with sugar, can be obtained is causing a rise in cases of obesity. (Note: I am using the word ‘obesity’ as a blanket term for people who are overweight). It is a uniquely modern problem. Our ancestors never battled weight gain because they were only able to find or afford enough food to keep themselves and their families from starving. Now the quantity, cheapness, and, in many cases, poor quality of food means that the fattest amongst us are also often the poorest. But obesity is less a problem that arises out of food and more a problem arising from laziness and gluttony. (Naturally, I am excluding health problems and genetic disorders from this conclusion).
On the other hand, however, our obsession with being skinny or muscle-bound is also causing problems. I have seen plenty of people who are clearly overweight. In rare cases, I have even seen people who are so morbidly obese that it can only be described as breathtaking. However, I have also seen women (and it is primarily women, by the way) who can only be described as unnaturally thin. It is as though our society, having realised that being overweight is unhealthy, has decided that its opposite must be good. It isn’t. Just right is just right.
And it’s not just individuals who are subjecting themselves to this kind of self-imposed torture. And it’s not limited to people in the here and now, either. In 1998, The Independent reported that many doctors in the United Kingdom were concerned that well-meaning parents were unintentionally starving their children to death by feeding them low-fat, low-sugar diets. These children were said to be suffering from the effects of “muesli-belt nutrition.” They had become malnourished because either they or their parents had become obsessed with maintaining a low-fat, low-sugar, low-salt diet. The article reported: “Malnutrition, once associated with slums, is said to have become an increasing problem for middle-class families in the past fifteen years. The victims of so-called ‘muesli-belt nutrition’ are at risk of stunted growth, anaemia, learning difficulties, heart disease and diabetes.”
Our obsession over diet is really a sign of how well-off our society is. Our ancestors had neither the time nor the resources to adhere to the kind of crazy-strict diets that modern people, in their infinite stupidity, decide to subject themselves to. It is high time we stopped obsessing over food and got a grip.
President George Herbert Walker Bush died at his home on November 30th, 2018, following a long battle with vascular Parkinsonism. Below is a brief overview of his life:
- Born June 12th, 1924 to Prescott Sheldon Bush (1895 – 1972) and Dorothy Bush (1901 – 1992).
- Attended Greenwich Country Day School
- Attended Phillips Academy in Andover, Massachusetts from 1938
- Held numerous leadership positions including President of the senior class, secretary of the student council, president of the community fund-raising group, member of the editorial board of the school newspaper, and captain of the varsity baseball and soccer teams
- Served in the US Navy as a naval aviator from 1942 until 1945
- Attained the rank of junior-grade Lieutenant
- Earnt the Distinguished Flying Cross, Air Medal, and Presidential Unit Citation
- Married Barbara Bush (1925 – 2018) in January 1945
- Fathered six children: President George W. Bush (1946 – ), Robin Bush (1949 – 1953), Jeb Bush (1953 – ), Neil Bush (1955 – ), Marvin Bush (1956 – ), and Doro Bush (1959 – ).
- Enrolled at Yale University where he earnt an undergraduate degree in economics on an accelerated program which allowed him to complete his studies in two years.
- Elected President of the Delta Kappa Epsilon fraternity
- Captain of the Yale Baseball Team, with whom he played in two College World Series as a left-handed batter
- Became a member of the secret Skull and Bones Society
- Elected Phi Beta Kappa, America’s oldest academic honour society, upon graduating Yale in 1948.
- Worked as an oil field equipment salesman for Dresser Industries
- Established Bush-Overby Oil Development Company in 1951
- Co-founded Zapata Petroleum Corporation, which drilled in Texas’ Permian Basin, in 1953
- Became President of Zapata Offshore Company
- After Zapata Offshore Company became independent in 1959, Bush served as its President until 1964 and then Chairman until 1966
- Elected Chairman of the Harris County, Texas Republican Party
- Ran against Democrat incumbent Ralph W. Yarborough for the US Senate in 1964, but lost
- Elected to the House of Representatives in 1966
- Appointed to the Ways and Means Committee
- Ran against Democrat Lloyd Bentsen for a seat in the Senate in 1970, but lost
- Served as the US Ambassador to the United Nations from 1971 to 1973.
- Served as Chairman of the Republican National Committee from 1973 to 1974.
- Appointed Chief of the US Liaison Office in the People’s Republic of China from 1974 to 1975.
- Director of the Central Intelligence Agency from 1976 to 1977.
- Chairman of the Executive Committee of the First International Bank in 1977
- Part-time Professor of Administrative Science at Rice University’s Jones School of Business in 1978
- Director of the Council On Foreign Relations between 1977 and 1979.
- Sought the Republican nomination for President in 1980 but lost to Ronald Reagan.
- Served as Vice President from 1981 to 1989.
- Elected President of the United States in 1988.
- President of the United States from 1989 to 1993.
- Defeated by Bill Clinton in the 1992 Presidential election
- Awarded an honorary knighthood by Queen Elizabeth II.
- Chairman of the board of trustees for Eisenhower Fellowships from 1993 to 1999
- Chairman of the National Constitution Centre from 2007 to 2009.
- Became a widower after seventy-three years of marriage.
- Died November 30th, 2018 at the age of 94.
There can be little doubt that technology is going to transform our world in ways that will make it unrecognisable to us fifty years from now. Technology is going to transform our lives, our work, and our relationships in ways that we, in our mortal and limited wisdom, will prove unable to comprehend. What we consider science fiction today, we will consider reality tomorrow.
The most obvious clue has been the internet. This medium has changed, and indeed is still changing, the world in ways we cannot even begin to fathom. Virtually every home in the developed world has the internet. Most of us carry it around with us in the form of smartphones and tablets. It has revolutionised the way we learn, do business, commit crimes, and communicate with one another.
Then there’s television. The shows featured on mainstream television can be described, accurately, as formulaic, petty, cheap, and shallow. Its news and current affairs programs provide little in the way of real or, for that matter, interesting information. Likewise, its fictional programming features staid and one-dimensional characters in cliché plots and scenarios.
By contrast, the shows on paid subscription services like Netflix and Hulu appeal to a wide variety of temperaments and interests. They captivate the imagination by featuring intriguing plots, complex characters, inspired cinematography, beautiful set designs, and state-of-the-art special effects. Just take a look at some of the titles: Archer, Suits, Spartacus, Mindhunter, House of Cards, Rick and Morty, Game of Thrones, and so forth.
One night of watching mainstream television followed by a single night of watching a paid subscription service should be proof positive to anyone that television is slowly, but surely, fading away.
Finally, there is music. Digital music outlets like iTunes and Spotify have revolutionised the way in which we listen to (and, more sinisterly, steal) music. Where once our grandparents were limited to their vinyl record collection, today’s music lover has access to thousands of songs at his or her fingertips.
Allow me to reiterate what I said before: the high-tech world that pervaded the imaginations of storytellers and filmmakers will no longer be a fantasy, it will be a reality.
I, for one, can easily envision a world in which an omnipresent house computer reads our body temperature and regulates the climate in our home without us being consciously aware of it. I can envision a world where a computer-controlled kitchen cooks our food with little intervention from us. I can envision a world of driverless cars, endless self-serve checkouts, and more.
The future will be digital. The challenge for the human race is to be able to embrace this change without losing our individual autonomy.
The term “noble savage”, referring to the so-called “natural man” who has not been corrupted by civilization, first appeared in The Conquest of Granada by the English playwright, John Dryden (1631 – 1700). Since then it has been a popular theme in books, television, and movies with stories like Dances with Wolves, Pocahontas, and Avatar espousing noble-savage philosophies.
The Genevan philosopher, Jean-Jacques Rousseau (1712 – 1778) believed there to be a distinction between human nature and society. Taking his inspiration from John Locke’s (1632 – 1704) philosophy of innate goodness, Rousseau believed human beings were inherently peaceful and that concepts like sin and wickedness bore no consequence to the natural man. Rather, it was society that had corrupted mankind’s natural sense of ‘amour de soi’ (a form of positive self-love which Rousseau saw as a combination of reason and the natural instinct for self-preservation). Rousseau wrote in the 18th century:
“Nothing can be more gentle than he in his primitive state, when placed by nature at an equal distance from the stupidity of brutes and the pernicious good sense of civilized man.”
While Rousseau was not the first philosopher to posit that society may have a corrupting influence (the French philosopher, Montaigne (1533 – 1592) described the lives of Native Americans as being so idyllic that he claimed they did not have words for lying, cheating, avarice, or envy, and that they did not need to work), it has been his influence that has been the most damaging. The first attempt to politicise Rousseauan philosophy, the French Revolution, ended not with paradise on earth, but with the mass executions that characterised the Reign of Terror. The social movements that have followed Rousseauan ideals have worked on the notion that it is society, not the individual, that is to blame for social problems. No aspect of human nature is responsible for evil; rather, evil is the result of a bad home, a bad neighbourhood, prejudice, poverty, and so forth. Human emotions are ultimately benevolent; evil and brutality are the results of social stressors on the individual. It is this philosophy that has been the driving force behind virtually all social programs.
The English philosopher, Thomas Hobbes (1588 – 1679), saw life in a state of nature as one of perpetual civil war. According to Hobbes, life in a state of nature was “nasty, brutish, and short” (this, rather amusingly, has been used to describe the careers of some football managers). Since concepts like morality and justice have no place in a state of nature, the natural man has no concept of them. In Leviathan, written in 1651, Hobbes asked the reader to imagine what their lives would be like if they lived outside the protection of the state. Without law and order there are no checks and balances on an individual’s behaviour. Human beings, therefore, must be kept in check by an authority that has the ability to punish wickedness. Kings and governments have a responsibility to teach their citizens to be just, to not deprive others of their property, including their lives, through theft, fraud, murder, rape, and so forth. Hobbes believed the only way people could protect themselves from the trials and tribulations of life would be to transfer authority to a Government and a King.
The “noble savage” idea is a myth, pure and simple. It is merely a means for shifting responsibility away from the individual towards society. It is time for people to put this ridiculous belief in the one place it belongs: the waste-paper bin.
Everyone versed in culture and politics understands the truth in Percy Bysshe Shelley's (1792 – 1822) argument that creators of culture are the "unacknowledged legislators of the world." Our view of the world is derived from our religious beliefs, the stories we read as children, the movies we watch, the cultural customs we grow accustomed to, and so forth. It is not that culture constructs the physical edifices of civilisation per se, but that culture forms the values and philosophies upon which civilisation is founded.
In the West, the prevailing cultural narrative champions wholesome virtues (kindness, compassion, love, fair play, and so forth) as the only way to achieve prosperity and success. The individual must avoid combat with others and be polite, civil, pleasant, and diplomatic to all. To be seen using aggression, or wanting power, is to invite social isolation. This has certainly been the message of our culture. In Shakespeare's Richard III, the title character is a corrupt, twisted, and Machiavellian schemer who plots his way into power. By contrast, the future Henry VII is presented as fair and humane. By the end of the play, Richard dies hated even by members of his own family, whereas Henry is celebrated as a noble hero.
This worldview bears little resemblance to reality:
“The manner in which we live, and that in which we ought to live, are things so wide asunder, that he who quits the one to betake himself to the other is more likely to destroy than to save himself; since anyone who would act up to a perfect standard of goodness in everything, must be ruined among so many who are not good. It is essential for a prince who wishes to maintain his position, to have learned how to be other than good, and to use or not to use his goodness as necessity requires.” (Niccolo Machiavelli, The Prince, 1532, Chapter 15, page 114)
Bubbling just below the surface are the real, amoral virtues which foster prosperity and success. In Beyond Good and Evil (1886), Friedrich Nietzsche (1844 – 1900) puts forth the following proposition:
“Suppose nothing is given as ‘real’ except our world of desires and passions, and we could not get down, or up, to any other ‘reality’ besides the reality of our drives.” (Beyond Good and Evil, page 59).
Maybe we aren’t as driven by morality and Godliness as we like to think we are. Maybe we are driven by lust for power, material wealth, and sex. (This, of course, brings forth the possibility that the purpose of wholesomeness is to temper our real desires).
Even though we loathe having to admit it, all of us want power. Power gives us greater control and makes us feel more secure. But since it is socially unacceptable to be seen wanting power, we are forced to rely on subtlety. We are forced to appear honest on the one hand while being duplicitous on the other: congenial yet cunning, democratic yet devious.
In chapter twenty-one of The Prince, Machiavelli (1469 – 1527) wrote: “Nothing makes a prince so well thought of as to undertake great enterprises and give striking proofs of his capacity.” Our civilisation was built by ambitious and power-hungry individuals, not by the wholesome virtues presented to us.
This is our weekly theological article.
For most of my life I have had a great affinity for cemeteries and graveyards. A gentle stroll through the neat and peaceful rows of graves, pausing occasionally to read the inscription on the headstone of someone who lived and died long before I was born has been the source of great pleasure for me.
I believe cemeteries and graveyards are important for two reasons. First, they are incredibly artistic. One cannot help but notice the well-manicured lawns and beautiful gardens, the magnificent sculpting of the headstones, and the often-poetic rhetoric of the epitaphs. Second, I believe that cemeteries and graveyards provide people with a physical connection to their cultural heritage and allow them to tap into their ancestral past. As Doctor Celestina Sagazio, a historian working for Melbourne’s Southern Metropolitan Cemeteries Trust, observed, cemeteries and graveyards provide a clue into the daily lives of people throughout history.
Modern culture has little time for the contemplation of death. That would go against ‘positive thinking’ and the perpetual lie of ‘eternal youth.’ This, however, stands in stark contrast with the convictions of most of our forebears. From antiquity through to the early twentieth century, the consideration of death was considered a good motivator for leading a virtuous and meaningful life. Recent studies affirm this belief, finding that the contemplation of one’s own mortality acts as a motivator for assessing one’s values and goals and can greatly improve physical health.
The phrase, ‘Memento Mori’, is said to have originated with the Ancient Romans. Tradition in Ancient Rome dictated that a servant or slave should stand behind a triumphant General during his victory parade. This servant or slave would whisper in the General’s ear: “Respice post te! Hominem te esse memento! Memento Mori!” (“Look behind you! Remember that you are but a man! Remember that you will die!”).
Between the 14th and 17th centuries, the concept of ‘Memento Mori’ took on new motifs. The engraving ‘The Triumph of Death’ (1539) by Georg Pencz (1500 – 1550) depicted a scythe-wielding skeleton commanding an oxen-driven chariot. Similarly, the danse macabre depicted skeletal figures leading everyone from the Pope to the humble ploughman in a final dance of death. During the 17th and 18th centuries, many New England graves were adorned with epitaphs like ‘Memento Mori’ and ‘Hora Fugit’ (‘the hour flees’) and were emblazoned with images of skulls, bones, winged death’s heads, hourglasses, and other symbols of death and the passage of time.
The Roman stoic philosopher, Seneca (4BC – AD65) advised: “Let us prepare our minds as if we’d come to the very end of life. Let us postpone nothing. Let us balance life’s books each day… The one who puts the finishing touches on their life each day is never short of time.” The careful contemplation of mortality and the deliberate awareness of death has a profoundly positive effect on the health and vitality of the soul.
Kofi Annan, the former Secretary-General of the United Nations, has stated that disagreeing with globalism is like disagreeing with “the laws of gravity.” Similarly, new French President, Emmanuel Macron, another supporter of globalism, wishes to deregulate France’s ailing industry and boost freedom of movement and trade. Donald Trump’s election to the US Presidency, and the UK’s decision to leave the European Union, however, have challenged the presumed supremacy of globalism as a political force.
The roots of globalism can be traced back to the 2nd Century BC when the formation of the Silk Road facilitated the trade of silk, wool, silver, and gold between Europe and China. It wasn’t until the 20th century, however, that the idea gathered momentum. Following the Second World War, world power was to be split between America, representing the capitalist west, and the Union of Soviet Socialist Republics, representing the communist east. Following the collapse of the Soviet Union in 1991, America took it upon herself to create an undivided, democratic, and peaceful Europe.
Of course, the aim for an undivided Europe, indeed an undivided world, existed long before the collapse of the Soviet Union. In 1944, Allied delegates met at Bretton Woods, New Hampshire, to establish an economic system based on open markets and free trade. Their idea gathered momentum. Today, the International Monetary Fund, the World Bank, and the World Trade Organization all exist to unite the various national economies of the world into a single, global economy.
In 1950, the French foreign minister, Robert Schuman, proposed pooling the coal and steel production of Western Europe’s countries. Originally, Schuman’s objective had been to unite France with the Federal Republic of Germany. In the end, however, the Treaty of Paris united Belgium, France, West Germany, Italy, Luxembourg, and the Netherlands in the European Coal and Steel Community. In 1957, the Treaty of Rome created the European Economic Community.
Globalism is an ideology which seeks to form a world where nations base their economic and foreign policies on global, rather than national, interests. It can be viewed as a blanket term for various phenomena: the pursuit of classical liberal and free market policies on the world stage, Western dominance over the political, cultural, and economic spheres, the proliferation of new technologies, and global integration.
John Lennon’s Imagine, speaking of ‘no countries’, ‘no religion’, and a ‘brotherhood of man’, acts as an almost perfect anthem for globalism. Your individual views on globalism, however, will depend largely on your personal definition of a nation. If you support globalism it is likely you believe a nation to be little more than a geographical location. If you are a nationalist, however, it is likely you believe a nation to be the accumulation of its history, culture, and traditions.
Supporters of John Lennon’s political ideology seem to suffer from a form of self-loathing. European heritage and culture are not seen as something worth celebrating, but as something to be dismissed. And it appears to be working: decades of anti-nationalist, anti-Western policies have stripped many European nations of their historical and cultural identities. In the UK, there have been calls to remove the statue of Cecil Rhodes – an important, yet controversial figure. In other countries, certain areas have become so rife with ethnic violence they are considered ‘no-go’ zones.
Perhaps it is the result of “The White Man’s Burden”, Rudyard Kipling’s prophetic 1899 poem about the West’s perceived obligation to improve the lot of non-Westerners. Today, many white, middle-class elites echo Kipling’s sentiments, believing it to be their duty to save the world. These people are told at charity events, at protests, at their universities, and by their media of their obligation to their ‘fellow man.’ When it comes to immigration, they believe it to be their responsibility to save the wretched peoples of the world by importing them, and their problems, to the West.
By contrast, nationalism champions the idea that nations, as defined by a common language, ethnicity, or culture, have the right to form communities based on a shared history and/or a common destiny. The phenomenon can be described as patriotic feelings, principles, or efforts; as an extreme form of patriotism characterised by feelings of national superiority; or as the advocacy of political independence. It is primarily driven by two factors: first, feelings of nationhood among members of a nation-state, and second, the actions of a state in trying to achieve or sustain self-determination. In simplest terms, nationalism constitutes a form of human identity.
One cannot become a citizen of a nation merely by living there. Citizenship arises from the sharing of a common culture, tradition, and history. As American writer Alan Wolfe observed: “behind every citizen lies a graveyard.” The sociologist Emile Durkheim believed people to be united by their families, their religion, and their culture. In Suicide: a Study in Sociology, Durkheim surmises:
“It is not true, then, that human activity can be released from all restraint. Nothing in the world can enjoy such a privilege. All existence being a part of the universe is relative to the remainder; its nature and method of manifestation accordingly depend not only on itself but on other beings, who consequently restrain and regulate it. Here there are only differences of degree and form between the mineral realm and the thinking person. Man’s characteristic privilege is that the bond he accepts is not physical but moral; that is, social. He is governed not by a material environment brutally imposed on him, but by a conscience superior to his own, the superiority of which he feels.” – Suicide: a Study in Sociology (pg. 277)
Globalism has primarily manifested itself through economic means. In the economic sense, globalism began in the late 19th and early 20th centuries with the invention of the locomotive, the motor-car, the steamship, and the telegraph. Prior to the industrial revolution, a great deal of economic output was concentrated in certain countries. China and India together produced around fifty percent of the world’s economic output, whilst Western Europe produced eighteen percent. It was the industrial revolution of the 19th century, and the dramatic growth of industrial productivity, which caused Western Europe’s economic output to double. Today, we experience the consequences of globalism every time we enter a McDonald’s restaurant, call someone on our mobile phones, or use the internet.
Philip Lowe, the Governor of the Reserve Bank of Australia, told a group of businessmen and women at the Sydney Opera House that Australia was “committed to an open international order.” Similarly, the Nobel Prize-winning economist Amartya Sen argued that globalisation has “enriched the world scientifically and culturally, and benefited many people economically as well.” It is certainly true that globalisation has facilitated the sharing of technological, cultural, and scientific advances between nations. However, as economists like Joseph Stiglitz and Ha-Joon Chang have pointed out, globalisation can also increase rather than reduce inequality. In 2007, the International Monetary Fund admitted that investment in the foreign capital of developing countries and the introduction of new technologies have had the effect of increasing levels of inequality. Countries with larger populations, lower working and living standards, more advanced technology, or a combination of all three, are in a better position to compete than countries that lack these advantages.
The underlying fact is that globalism has economic consequences. Under globalisation, there are few to no restrictions on the movement of goods, capital, services, people, technology, and information. Among the things championed by economic globalisation is the cross-border division of labour, under which different countries become responsible for different forms of labour.
The United Nations has unrealistically asserted globalism to be the key to ending poverty in the 21st century. The Global Policy Forum, an organisation which acts as an independent policy watchdog of the United Nations, has suggested the imposition of global taxes as a means of achieving this goal. These include taxes on carbon emissions to slow climate change, taxes on currency trading to ‘dampen instability in the foreign exchange markets’, and taxes to support major initiatives like reducing poverty and hunger, increasing access to education, and fighting preventable diseases.
In one sense, the battle between globalism and nationalism can be seen as a battle between ideology and realism. Globalism appears committed to creating a ‘brotherhood of man.’ Nationalism, on the other hand, reminds us that culture and nationality form an integral part of human identity, and that they are sentiments worth protecting. The true value of globalism and nationalism comes not from their opposition, but from how they can be made to work together. Globalism has the economic benefit of allowing countries to develop their economies through global trade. It is not beneficial, however, when it devolves into open-border policies, global taxes, or attacks on a nation’s culture or sovereignty. Nationalism, by the same token, has the benefit of providing people with a national and cultural identity, as well as the benefits and protections of citizenship. Nationalism fails when it becomes so fanatical that it leads to xenophobia or war. The answer, therefore, is not to forsake one for the other, but to reconcile the two.