King Alfred Press


On Constitutional Monarchy


I would like to begin this essay by quoting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):

 

     Milton! thou shouldst be living at this hour:
     England hath need of thee: she is a fen
     Of stagnant waters: altar, sword, and pen,
     Fireside, the heroic wealth of hall and bower,
     Have forfeited their ancient English dower
     Of inward happiness. We are selfish men;
     Oh! raise us up, return to us again;
     And give us manners, virtue, freedom, power.
     Thy soul was like a star, and dwelt apart:
     Thou hadst a voice whose sound was like the sea:
     Pure as the naked heavens, majestic, free,
     So didst thou travel on life’s common way,
     In cheerful godliness; and yet thy heart
     The lowliest duties on herself did lay.

 

The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the writer and civil servant immortalised as the author of Paradise Lost.

Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s time, or even today. It is these themes of honour, duty, and nobility that provide the spiritual basis for constitutional monarchy.

It is a subject that I will return to much later in this essay. But, to begin in earnest, it would perhaps be more prudent to examine those aspects of English history that allowed both constitutional monarchy and English liberty to be born.

The English monarchy has existed for over eleven hundred years. From King Alfred the Great in the 9th century to Elizabeth II in the 21st, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. The English have gifted these ideals to much of the Western world through the export of their culture to their former colonies.

It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist, Peter Hitchens (1951 – ), noted in his book, The Abolition of Britain, that abhorrence for one’s own country was a unique trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”

However, these intellectuals fail to notice, in their arrogance, that “God Save the King” is actually a celebration of constitutional monarchy, not a gesture of reverence towards an archaic and rather powerless royal family. It celebrates the nation as embodied in a single person or family, and the fact that the common man and woman can live in freedom because constitutional restraints are placed on the monarch’s power.

If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.

At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding jurisdiction over all forms of authority: political, social, economic, and so forth.

To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, articulated by the Catholic bishop Jacques Bossuet (1627 – 1704) in his book, Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey the monarch without question. As such, no limitations could be put on a monarch’s power.

What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God, since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of Parliament and involving himself in cases being tried before the courts.

When King James died in 1625, he was succeeded by his second-eldest son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between the monarch and Parliament, from which he had to obtain approval to raise funds.

In 1629, Charles outraged much of the population, as well as many nobles, when he elected to raise funds for his rule using outdated taxes and fines and stopped calling Parliament altogether. Charles had been frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money. The ensuing period would become known as the Eleven Years’ Tyranny.

By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. They asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without the consent of its members. In addition, Parliament ordered the King to execute his friend and advisor, Thomas Wentworth (1593 – 1641), the 1st Earl of Strafford, for treason.

The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a relative of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (c. 1485 – 1540). In the end, the conflict would devastate the English population and forever alter England’s political character.

The English Civil War began in January 1642 when King Charles marched on Parliament with a force of around four hundred armed men. Having failed to arrest his leading opponents, he withdrew from London and would later establish his court at Oxford. Trouble was brewing. Throughout the summer, people aligned themselves with either the Royalists or the Parliamentarians.

The forces of King Charles and the forces of Parliament met at the Battle of Edgehill in October. What followed was several years of bitter and bloody conflict.

Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic or “commonwealth.” The English Civil War had claimed the lives of some two hundred thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of parliament.

The powers of parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular. The Commonwealth created in the wake of the Civil War had collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.

The exiled Prince Charles returned from exile and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the ideas of the Enlightenment, which challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.

By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety percent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.

It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics could not become members of Parliament, justices of the peace, or officers in the army, or hold any other public office, unless they were granted a special dispensation by the King.

It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial, considering Great Britain was a Protestant country) and for ignoring Parliament.

However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.

The English people became alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certain to succeed him on the throne. So incensed was Parliament at the prospect of a Catholic on the throne that it attempted to pass the Crown on to one of Charles’ Anglican relatives.

Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance ever since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earnt the moniker “Bloody Mary” for burning Protestants at the stake. During the reign of King James, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous Gunpowder Plot.

Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and for non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about the end of religious persecution, were met with considerable objection from both bishops and Europe’s Protestant monarchs.

Following the passage of the Test Act in 1673, James had been forced to relinquish his office of Lord High Admiral. The Act required holders of civil and military office to take Holy Communion as spelt out by the Church of England. It was designed to prevent Catholics from taking public office.

Now, as King, James sought to repeal the Test Act and began placing Catholics in positions of power. His Court featured many Catholics, and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.

That was not the limit of James’ activities, either. He issued two Declarations of Indulgence, ordered them to be read from every pulpit on two successive Sundays, and put those who opposed them on trial for seditious libel. Additionally, he imprisoned seven bishops for opposing him, made sweeping changes to the Church of England, and built an army composed mainly of Catholics.

The people permitted James II to rule as long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot. The child, the theory went, was not the King’s son but rather a substitute who had been smuggled into the birthing chamber in a bed-warming pan.

In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.

James dissolved Parliament in July 1688. Around this time, a Bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702) and invited them to raise an army, invade London, and seize the throne. They accepted.

William landed in Devon on Guy Fawkes’ Day accompanied by an army of fifteen thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.

Things were not looking good for James, however. Large parts of his officer corps were defecting to the enemy and taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates started declaring for William, and James’ own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels in Yorkshire. James, abandoned by everyone, fled into exile in France. He would die there twelve years later.

On January 22nd, 1689, William called the first ‘Convention Parliament.’ At this convention, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. Second, it was declared against public policy for the throne to be occupied by a Catholic. As such, the throne passed over James Francis Edward Stuart, and William and Mary were invited to take the Crown as co-monarchs.

They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.

Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.

The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, elected monarchs. It established the concept of Parliamentary sovereignty, granting that political body the power to make or unmake any law it chose. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.

The British philosopher, Roger Scruton (1944 – ), described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.

Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).

In The Need for Roots, the French philosopher Simone Weil (1909 – 1943) wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed Britain to retain its tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs upon abolishing their monarchies.


Great Britain’s great legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance that it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.

The word “Commonwealth” was adopted at the Imperial Conference held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by former British Prime Minister Arthur Balfour (1848 – 1930), was designed to look into future constitutional arrangements within the Commonwealth.

The committee delivered its findings in the Balfour Report. It stated:

“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”

It continued:

“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”

Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.

The Statute of Westminster gave legal recognition to the de-facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand and South Africa were granted legal independence.

Furthermore, the law ended the application of the Colonial Laws Validity Act 1865 to the Dominions. That Act had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to the Act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”

The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.

More than forty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent of Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.

What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:

“Her Majesty’s Representative in each State shall be the Governor.

Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.

Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.

While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.

The advice of Her Majesty in relation to the exercise of powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”

These two laws dispel an important misconception that is often exploited by Australian republicans: the myth that Australia lacks legal and political independence because its Head of State is the British monarch. The passage of the Statute of Westminster in 1931 and the Australia Act in 1986 effectively ended any real political or legal power the British government had over Australia.

In Australia, the monarch (who is our Head of State by law) is represented by a Governor-General. This individual – who has been an Australian since 1965 – is required to take an oath of allegiance and an oath of office administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure, with appointments typically lasting five years.

The monarch issues letters patent to appoint the Governor-General based on the advice of Australian ministers. Prior to 1924, Governors-General were appointed on the advice of both the British government and the Australian government. This was because the Governor-General at that time represented both the monarch and the British government. This arrangement changed, however, at the Imperial Conferences of 1926 and 1930. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown only.

The Governor-General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, such an arrangement would have the effect of politicising and thereby corrupting the Head of State. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to those donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that he or she can avoid the snares that trap other political actors.


The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003), Deputy Prime Minister, Treasurer, and intellectual leader of the political left, and Rex Connor (1907 – 1977) out of the cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 24th federal parliament was dissolved.

The Loans Affair began when Rex Connor attempted to borrow up to US$4 billion to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state elements which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek a loan without the council’s approval.

When news of the Loans Affair was leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and that the Loan Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.

Despite this, Connor continued to liaise with the Pakistani financial broker, Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by Herald journalist Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan with an interest rate of 7.7% and a 2.5% commission for Khemlani. The claim cast serious doubt on Dr. Jim Cairns’ assertion that the government had not offered Khemlani a commission on a loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.

Unfortunately, Khemlani had stalled on the loan, most notably when he had been asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loan-raising activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.

The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of the Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal Party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.

And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him as proof that he and Connor had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no other choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).

By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam government’s budget on October 16th. Whitlam warned Fraser that the Liberal Party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medibank rebates, the budgets of government departments, and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”

On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, Treasurer Bill Hayden (1933 – ) reintroduced the budget bills and warned that further delay would increase unemployment and deepen a recession that had blighted the Western world since 1973.

The crisis would come to a head on Remembrance Day 1975. Whitlam had asserted for weeks that the Senate could not force him into an election, claiming that the House of Representatives had an independence and an authority separate from the Senate.

Whitlam had decided that he would end the stalemate by seeking a half-senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991) had been seeking legal advice from the Chief Justice of the High Court on how he could use his Constitutional Powers to end the deadlock. Kerr had come to the conclusion that should Whitlam refuse to call a general election, he would have no other alternative but to dismiss him.

And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.

Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills, and Fraser was drafting documents that would dissolve the Parliament.

At 2pm, Deputy Prime Minister Frank Crean (1916 – 2008) defended the government against a censure motion moved by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country?”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.

At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he moved that the “House expresses its want of confidence in the Prime Minister and requests, Mr. Speaker, forthwith to advise His Excellency, the Governor-General, to call the member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.

The Speaker, Gordon Scholes (1931 – 2018), expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam was destined to lose: Scholes was unable to arrange an appointment until a quarter to five in the afternoon.

Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. At ten to four, Fraser left for Government House. Ten minutes later, Sir John Kerr signed the proclamation dissolving both Houses of Parliament and set the date of the upcoming election for December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.

The combination of economic strife, political scandal, and Whitlam’s dismissal signed the death warrant for Whitlam’s government. At the 1975 federal election, the Liberal–National coalition won by a landslide, taking ninety-one seats and a popular vote of 4,102,078. In the final analysis, it seems that the Australian people had agreed with Kerr’s decision and had voted to remove Whitlam’s failed government from power once and for all.


Most of the arguments levelled against constitutional monarchies can be described as petty, childish, and ignorant. The biggest faux pas those who oppose constitutional monarchies make is a failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to disagree with constitutional monarchy. It would be as if I decided to argue for the abolition of the office of Prime Minister just because I didn’t like the person who held that office.

One accusation frequently levelled against the monarchy is that they are an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year. And it is certainly true that the German Presidency costs only £26 million every year. However, it is not true that all monarchies are necessarily more expensive than Presidencies. The Spanish monarchy costs only £8 million per year, less than the Presidencies of Germany, Finland, and Portugal.

Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”

History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.

The Referendum was held on November 6th, 1999. Of the ballots cast, 99.14% – 11,683,811 votes – were formal. 45.13% of these, or 5,273,024 people, voted yes, while 54.87%, or 6,410,787, voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.
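The arithmetic behind these figures is easy to verify (a quick check of the totals quoted above, not an official tabulation):

\[
5{,}273{,}024 + 6{,}410{,}787 = 11{,}683{,}811,
\qquad
\frac{5{,}273{,}024}{11{,}683{,}811} \approx 45.13\%,
\qquad
\frac{6{,}410{,}787}{11{,}683{,}811} \approx 54.87\%.
\]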

All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.

Margaret Tavits of Washington University in St. Louis once observed that: “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.

Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but unlike a President, that is generally where his or her power ends. It is rather ironic that, as the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted, elected heads of state are more willing than monarchs to allow governments to change without democratic input such as elections. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.

The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. A monarchy is an effective means of bridging the past and the future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and may possibly survive to see the coronation of King George VII (though he will certainly outlive me).

It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things have a positive effect on the economy as well. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies had a positive impact on economies and living standards over the long term. The study, which examined data from one hundred and thirty-seven countries, including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government was not going to interfere with their property in constitutional monarchies than in republics. As a consequence, they were more willing to invest in their respective economies.

When Wordsworth wrote his ode to Milton, he was mourning the loss of the chivalry he felt had once pervaded English society. Today, the West is once again in serious danger of losing the two things that give it a connection to the chivalry of the past: a belief in God and a submission to a higher authority.

Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.

It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.

The Christian apologist and novelist, C.S. Lewis (1898 – 1964), once argued that most people do not deserve a share in governing a hen-roost, much less a nation:

“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”

The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384BC – 322BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because there is no one person who is perfect enough to be a master.

Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth, which inevitably lead to tyranny). Legal and economic equality, however, do not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.

Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.

Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.

The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”

Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.

It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy: the monarch could not rule without the consent of parliament, parliament became the supreme source of law, and it gained the power to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchies enrich our souls because they awaken in us the need for both freedom and obedience.

Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.

IN DEFENCE OF CHRISTIANITY


In 2017, the online video subscription service, Hulu, embarked on the production of a television adaptation of Margaret Atwood’s (1939 – ) 1985 novel, The Handmaid’s Tale. The story is set in the fictional, totalitarian state of Gilead: a society run by fundamentalist Christians who overthrew the previous secular state and set up a theocracy in its wake. For years, influential thought leaders and other arbiters of popular opinion have espoused the opinion that broader society would greatly benefit from the abolition of Christianity. It is my belief that such an occurrence would have precisely the opposite effect.

No group has criticised Christianity more than the New Atheists. Frequently deriding it as nothing more than “science for stupid people”, prominent New Atheists have ridiculed Christianity and dismissed its positive effects. Atheists and anti-Christians turn Christianity into a straw man by reducing it to its most basic elements (they are helped, unfortunately, by those fundamentalist Christians who still assert that the earth is literally six thousand years old). They then use this straw man to discredit the idea of faith. The philosopher Sam Harris (1967 – ) argued in his book, The End of Faith, that religious belief constituted a mental illness. More alarmingly, the British scientist Richard Dawkins (1941 – ) took things one step further by claiming that religious instruction constituted a form of child abuse.

The basis for much of Christianity’s negative portrayal finds its roots in the philosophies of the political left. A central tenet of the left-wing worldview is an adherence to secularism, which appears set to replace Christianity as the prevailing cultural belief system. (This is not to be confused with atheism, which denies the existence of a creator). On the one hand, secularism promotes both religious liberty and the separation of church and state (both of which are good things). On the other hand, however, proponents of secularism reject the knowledge and wisdom religious institutions can impart to the world. In a secular society, God can be believed to exist, but not in any sort of productive way. God is something to be confined to the private home or the sanctuary of one’s local church. God is something to be worshipped behind closed doors where no one can see you.

Of course, anti-Christian rhetoric has been a facet of popular culture since the 1960s. Today, finding a positively portrayed devout Christian family is about as likely as finding a virgin in the maternity ward. Christians are routinely depicted as stupid, backwards, hateful, and extreme. By contrast, atheists are routinely depicted as witty, intelligent, and tolerant. In short, atheism is deemed good and Christianity bad. And, of course, this attitude has filled some with a kind of arrogant grandiosity. During an interview in 1966, John Lennon (1940 – 1980) opined: “Christianity will go. It will vanish and shrink. I needn’t argue with that; I’m right and I will be proved right. We’re more popular than Jesus now; I don’t know which will go first, rock and roll or Christianity.”

The mainstream media rarely discusses the persecution of Christians. Indeed, prejudice and discrimination against Christianity are treated with a type of permissiveness that prejudice and discrimination against other religions, Islam being a primary example, are not.

Christians are estimated to be the victims of four out of five discriminatory acts around the world, and face persecution in one hundred and thirty-nine countries. Churches have been firebombed in Nigeria. North Koreans caught with Bibles are summarily shot. In Egypt, Coptic Christians have faced mob violence, forced removals, and, in the wake of the Arab Spring, the abduction of their women and girls, who are forced to marry Muslim men.

In China, Christian villagers were instructed to remove pictures of Christ, the Crucifix, and Gospel passages by Communist Party officials who wished to “transform believers in religion into believers in the party.” According to the South China Morning Post, the purpose behind the drive was the alleviation of poverty. The Chinese Communist Party believed that it was religious faith that was responsible for poverty in the region and wanted the villagers to look to their political leaders for help, rather than a saviour. (Wouldn’t it be wonderful if the Chinese Communist Party looked at their own evil and ineffective political ideology as the true cause of poverty in their country rather than blaming it on religion?). As a result, around six-hundred people in China’s Yugan county – where about ten percent of the population is Christian – removed Christian symbology from their living rooms.

Popular culture and thought in the West has attempted, with a great deal of success, to paint Christianity as stupid, backwards, dogmatic, and immoral. It is the presence of religion that is to blame for holding the human race back. It is religion that is to blame for racism, sexism, and all manner of social injustices. It is religion that is the cause of all wars. So on and so forth.


I strongly disagree with this argument. Indeed, it is my belief that the abolishment of Christianity from public life would have the effect of increasing intolerance and immorality. Christianity’s abolishment will have precisely this effect because it will abolish those metaphysical doctrines – divine judgement, universal and absolute morality, and the divinity of the human soul – that have made tolerance and morality possible.

Christianity and Western Civilisation are inextricably linked. In the field of philosophy, virtually all Western thinkers have grappled with the concepts of God, faith, morality, and more. As the writer, Dinesh D’Souza (1961 – ) wrote in his book, What’s So Great About Christianity:

“Christianity is responsible for the way our society is organised and for the way we currently live. So extensive is the Christian contribution to our laws, our economics, our politics, our art, our calendar, our holidays, and our moral and cultural priorities that J.M. Roberts writes in The Triumph of the West: ‘We could none of us today be what we are if a handful of Jews nearly two thousand years ago had not believed that they had known a great teacher, seen him crucified, dead, and buried, and then rise again’.”

The primary contribution of Christianity to Western civilisation has been to act as a stabilising force, providing society with an overarching metaphysical structure as well as rules and guidelines that act as a moral foundation. This shared metaphysical structure and moral foundation, combined with traditions and cultural customs, has the effect of bringing a country, a township, even a school or parish, together.

When Christianity lost its supremacy in society it was replaced by smaller, less transcendent and more ideological, belief systems. Where people had once been unified by a common belief, they have now become more divided along ideological lines. Religious belief has not been replaced by rationalism or logic, as the New Atheists supposed. Rather, people have found outlets for their need to believe in other places: social activism, political ideologies, and so forth.

The most prevalent contribution that Christianity has made to the Western world comes under the guise of human rights. Stories like The Parable of the Good Samaritan have had a remarkable influence on its conception. Human rights stem, in part, from the belief that human beings were created in the image of God and hold a divine place in the cosmos.  Christianity has played a positive role in ending numerous brutal and archaic practices, including slavery, human sacrifice, polygamy, and infanticide. Furthermore, it has condemned incest, abortion, adultery, and divorce. (Remarkably, there are some secularists who wish to bring back some of these antiquated practices).

Christianity placed an intrinsic value on human life that had not been present in pre-Christian society. As the American pastor Tim Keller (1950 – ) wrote in The Reason for God: “It was extremely common in the Greco-Roman world to throw out new female infants to die from exposure, because of the low status of women in society.” Roman culture was well known for its brutality and callousness. Practices of regicide, gladiatorial combat, infanticide, and crucifixion were all common. Seneca (4BC – AD65), Nero’s (AD37 – AD68) chief advisor, once stated that it was Roman practice to “drown children who, at birth, are weakly and abnormal.”

Christian morality has had a notable effect on our views on human sexuality and has helped to provide women with far greater rights and protections than its pagan predecessors did. Christianity helped to end the hypocritical pagan practice of allowing men to have extra-marital affairs and keep mistresses. It formulated rules against the cohabitation of couples prior to marriage, adultery, and divorce. Unlike the Ancient Greeks and Romans, Christians did not force widows to remarry, and even allowed widows to keep their husbands’ estates.

The Christian faith has been instrumental in the enactment and promotion of public works. The instigator of the Protestant Reformation, Martin Luther (1483 – 1546), championed the idea of compulsory education and state-funded schools. Similarly, the Lutheran layman Johann Sturm (1507 – 1589) pioneered graded education. Christianity has been the source of numerous social services, including healthcare, schooling, and charity. Christianity’s belief in charity and compassion has led the faithful to found many orphanages, old-age homes, and groups like the Sisters of Charity and Missionaries of the Poor, the YMCA and YWCA, Teen Challenge, the Red Cross, and numerous hospitals and mental health institutions.

One of the frequent criticisms levelled at the Christian faith, particularly the Catholic Church, has been that it has stymied scientific and technological development. In truth, Western science and technology have been able to flourish because of the influence of Christianity, not in spite of it. This is because the Christian belief that God created everything lends itself to the idea that everything is worth contemplating. It is certainly true that the Catholic Church has been hostile to those discoveries that do not conform to its doctrine. Galileo, for example, was forced to retract his claim of heliocentrism because it challenged the Church’s doctrine that the earth was the centre of the universe. For the most part, however, Christianity has been largely supportive of scientific endeavour. Christian scientists have included Gregor Mendel (1822 – 1884), Nicolaus Copernicus (1473 – 1543), Johannes Kepler (1571 – 1630), Galileo Galilei (1564 – 1642), Arthur Eddington (1882 – 1944), Isaac Newton (1643 – 1727), Blaise Pascal (1623 – 1662), Andre Ampere (1775 – 1836), James Joule (1818 – 1889), Lord Kelvin (1824 – 1907), Robert Boyle (1627 – 1691), George Washington Carver (1860s – 1943), Louis Pasteur (1822 – 1895), Joseph Lister (1827 – 1912), Francis Collins (1950 – ), William Phillips (1914 – 1975), and Sir John Houghton (1931 – ), among others.

The force behind the stratospheric success of Western civilisation has not been its art or music or architecture, but the ideas it has built itself upon. It is notions like the rule of law, property rights, free markets, a preference for reason and logic, and Christian theology that are responsible for making Western society the freest and most prosperous civilisation that has ever existed. It cannot survive with one of its central tenets removed.

IT’S TIME FOR A RETURN TO TRADITION


Modernity is in trouble. From the menace of migrant crime in Europe to the sexual transgressions rife in modern-day Hollywood, the moral argument for modernity is quickly waning. How did things go so wrong? And how do we fix it? Perhaps a return to traditional values and ideals is in order.

The modern world developed over hundreds of years. The post-medieval period has seen the advent of tolerance as a social and political virtue, the rise of the nation-state, the increased role of science and technology in daily life, the development of representative democracy, the creation of property rights, urbanisation, mass literacy, print media, industrialisation, mercantilism, colonisation, the social sciences, modern psychology, emancipation, romanticism, naturalist approaches to art and culture, and the development of existential philosophy.  From the computer to the mobile phone, the motor car to the aeroplane, the marvels of the modern world are all around us.

The modern world has replaced the Aristotelean and faith-based concept of human life that was popular in the Middle Ages with a worldview based on science and reason. Modern intellectualism, therefore, follows the example set forth by Cartesian and Kantian philosophy: mistrusting tradition and finding its roots in science and rationality.

Culturally and intellectually, the 21st century represents the postmodern era. Postmodernism can be difficult to define accurately because the various cultural and social movements that use it as their central philosophy define it for their own purposes. Jean-François Lyotard (1924 – 1998), who introduced the term in his 1979 book, The Postmodern Condition, defined postmodernism as “incredulity towards metanarratives.” Similarly, Encyclopedia Britannica defines it as a philosophical movement in opposition to the philosophical assumptions and values of modern Western philosophy.

Postmodernism came about as a reaction to, indeed a rejection of, modernity. With its roots in the philosophies of Friedrich Nietzsche (1844 – 1900), Martin Heidegger (1889 – 1976), Sigmund Freud (1856 – 1939), and Karl Marx (1818 – 1883), the postmodernist rejects the philosophical theory of Foundationalism – the idea that knowledge is built upon a solid foundation – in favour of large-scale scepticism, subjectivism, and relativism.

The postmodernist likes to see himself as Beowulf fighting Grendel. That is, he likes to see himself as the mythical hero fighting the historical-critical monster. Inspired by doctrines of white privilege and toxic masculinity, and driven by a rhetoric that is anti-capitalist (except when it comes to their iPhones), anti-racist (provided the person isn’t white), anti-imperialist (but only European imperialism), and anti-transphobic (because gender is a “social construct”), the postmodernist-inspired neo-Marxists and social justice warriors have invaded the modern university and college campus.

Modernity and postmodernism have produced a swathe of existential and moral problems that the Western world has, as yet, proved unable (or perhaps even unwilling) to solve. To begin, the modern world has abolished the central role that God, nature, and tradition once played in providing life with purpose. In spite of all its cruelty, the Middle Ages were seen by the German sociologist Max Weber (1864 – 1920) as a highly humanistic period. Everything was considered to have a divine purpose. Even someone as lowly as a medieval serf, for example, could feel that he had a role in God’s greater scheme. There was a sense of, as Martin Buber (1878 – 1965) puts it, “I-thou.” Modernity swapped “I-thou” for “I-it”. The human will replaced God as the ultimate arbiter of meaning.

This problem has been further exacerbated by the alienation of the human spirit from nature. Science, for all of its positive qualities, has had the effect of rendering nature meaningless. No longer is a thunderclap the voice of an angry God, nor does a cave contain a goblin or a mountain harbour a giant. No longer does the natural world command the sense of reverential majesty it once did. Science may be an excellent means of understanding facts, but it is no substitute for wisdom or tradition when it comes to determining human purpose.

The answer to the problems of the modern, and, by extension, post-modern, world is a revitalisation of the traditional beliefs, values, ideas, customs, and practices that made the Western world great in the first place. We must reject the destructive ideas espoused by the postmodernists, and it is high time we started taking some pride in the traditions that have made our civilisation so great.

SMALL GOVERNMENT MATTERS

 

(This is derived from an old essay I wrote for university)

The size of government is an important yet seldom discussed issue. This is a peculiar phenomenon, as the size of government is integral to our freedom. When government power is not limited, those with power are able to encroach upon the freedoms of the people. However, when the powers of government are limited, people are able to live in peace, freedom, and prosperity.

The Age of Enlightenment (c. 1685 – c. 1815) represents a period in history in which the principles of the old world were replaced by new ideals. It was during the Enlightenment that the concepts of modern democracy (democracy originated with the Ancient Greeks, albeit in a rather primitive form), liberty, and inalienable rights began to emerge. One of its key concepts, limited government, came about during the High Enlightenment (c. 1730 – 1780). The English philosopher John Locke (1632 – 1704), perhaps the greatest defender of limited government, believed civil power should be derived from individual autonomy and that the separation of powers was necessary to protect people from tyranny.

Limited government works on the idea that governments should have as little interference in people’s lives as possible. Supporters of small government believe that big government destroys human creativity and innovation. As the Austrian-born philosopher Friedrich Hayek (1899 – 1992) put it: “the more the state plans, the more difficult planning becomes for the individual”. Numerous supporters of democracy and liberty have held limited government to be an important, and necessary, ideal. The American statesman, founding father, and President, James Madison (1751 – 1836), sought institutions that would limit the scope of government and give more rights to the individual. Similarly, the Australian Prime Minister, Malcolm Fraser (1930 – 2015), argued that “the power of the state should be limited and contained”.

In no other area has this been clearer than in the economy. The economist Adam Smith (1723 – 1790) argued that regulations on commerce are not only ill-founded but also counter-productive, as countries depend on capital accumulation. According to James Madison, guarding persons and property would “encourage industry by securing the enjoyment of its fruits.” Nations with small governments create their own fortune by allowing the people to participate freely in the marketplace.

Small government makes the people masters of their own destinies rather than making the government master of them. The people should never forget, as Ronald Reagan put it, that “we the people are the driver, the government is the car.” Only small government can continue to survive into the future, only small government can protect the rights of the individual, and only small government celebrates human achievement. This is why small government matters.


Free Speech Matters

There has been an alarming trend in modern culture: numerous political and social activist groups have been attempting to use the pernicious and false doctrines of political correctness, tolerance, and diversity to silence those they disagree with. Many of these groups have sought the passage of so-called “hate speech” laws designed to silence voices of dissent.

At public colleges and universities, places where free speech and open debate should be actively encouraged, the measures – including protests, disruption, and, in some cases, outright violence – taken to suppress voices of dissent have become tantamount to Government censorship. This censorship prevents students from inviting the speakers they wish to hear and from debating the speech they disagree with. Eva Fourakis, the editor-in-chief of The Williams Record (the student newspaper of Williams College), wrote an editorial, later recanted, commenting that “some speech is too harmful to invite to campus.” The editorial went on to say: “students should not face restrictions in terms of the speakers they bring to campus, provided of course that these speakers do not participate in legally recognised forms of hate speech.”

The University of California, Berkeley, is famous for sparking the free speech movement of the 1960s. Today, however, it has become a haven for radical, anti-free speech neo-Marxists and social justice warriors. Not only have many Republican students had their personal property destroyed, but numerous conservative speakers have had their talks disrupted and, in some cases, halted altogether. In February, Antifa – so-called anti-fascists – set fires and vandalised buildings during a speech by the controversial journalist, Milo Yiannopoulos (1984 – ). In April, threats of violence aimed at members of the Young America’s Foundation forced the political commentator, Ann Coulter (1961 – ), to cancel her speech. A speech by David Horowitz (1939 – ), founder and president of the David Horowitz Freedom Center, was cancelled after organisers discovered that the event would take place during normal class times (for safety, or so they claimed). Finally, the conservative journalist, Ben Shapiro (1984 – ), was forced to spend US$600,000 on security for his speech at UC Berkeley. These events show that those who wish to use disruption, vilification, threats, and outright violence to silence others can be, and often are, successful in doing so.

Like most of the principles of classical liberalism, free speech developed through centuries of political, legal, and philosophical progress. And like many Western ideas, its development can be traced back to the Ancient Greeks. During his trial in Athens in 399BC, Socrates (470BC – 399BC) expressed the belief that the ability to speak was man’s most divine gift. “If you offered to let me off this time on condition I am not any longer to speak my mind”, Socrates stated, “I should say to you, ‘Men of Athens, I shall obey the Gods rather than you.’”

Sixteen hundred years later, in 1215, the Magna Carta became the founding document of English liberty. In 1516, Desiderius Erasmus (1466 – 1536) wrote in the Education of a Christian Prince that “in a free state, tongues too should be free.” In 1633, the astronomer Galileo Galilei was put on trial by the Catholic Church for refusing to retract his claim of a heliocentric solar system. In 1644, the poet John Milton (1608 – 1674), author of Paradise Lost, warned in Areopagitica that “he who destroys a good book kills reason itself.” Following the overthrow of King James II (1633 – 1701) by William III (1650 – 1702) and Mary II (1662 – 1694) in 1688, the English Parliament passed the English Bill of Rights, which guaranteed free elections, regular parliaments, and freedom of speech in Parliament.

In 1789, the French Declaration of the Rights of Man and of the Citizen, an important document of the French Revolution, provided for freedom of speech (needless to say, Robespierre and company were not very good at actually promoting this ideal). The philosopher Voltaire (1694 – 1778) had famously written: “I detest what you write, but I would give my life to make it possible for you to continue to write.” Over in the United States, in 1791, the First Amendment of the US Bill of Rights guaranteed freedom of religion, freedom of speech, freedom of the press, and the right to assemble:

ARTICLE [I] (AMENDMENT 1 – FREEDOM OF SPEECH AND RELIGION)

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

During the 19th century, the British philosopher, John Stuart Mill (1806 – 1873) argued for toleration and individuality in his 1859 essay, On Liberty. “If any opinion is compelled to silence”, Mill warned, “that opinion may, for aught we can certainly know, be true. To deny this is to presume our own infallibility.” Mill believed that all doctrines, no matter how immoral or offensive, ought to be given public exposure. He stated in On Liberty:

“If the arguments of the present chapter are of any validity, there ought to exist the fullest liberty of professing and discussing, as a matter of ethical conviction, any doctrine, however immoral it may be considered.”

Elsewhere in On Liberty, Mill warned that the suppression of one voice was as immoral as the suppression of all voices:

“If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.”

Almost a century later, in 1948, the Universal Declaration of Human Rights, adopted by the United Nations General Assembly, urged member states to promote civil, human, economic, social, and political rights – including freedom of expression and religion.

Within the American justice system, numerous Supreme Court cases have created judicial protections for freedom of speech. In the case of National Socialist Party of America v. Village of Skokie (1977), the Supreme Court upheld the right of neo-Nazis to march through a village with a large Jewish population while wearing Nazi insignia. The Justices found that the promotion of religious hatred was not a sufficient reason to restrict free speech.

In the city of St. Paul during the early 1990s, a white teenager was arrested under the city’s Bias-Motivated Crime Ordinance after he burnt a cross made from a broken chair (cross-burning is commonly used by the Ku Klux Klan to intimidate African Americans) in the front yard of an African American family. The Supreme Court ruled that the ordinance was unconstitutional. Justice Antonin Scalia (1936 – 2016) noted that the purpose of restricting fighting words was to prevent civil unrest, not to ban the content or message of the speaker’s words. Scalia wrote in the case of R.A.V. v. City of St. Paul (1992):

“The ordinance applies only to ‘fighting words’ that insult, or provoke violence, ‘on the basis of race, colour, creed, religion or gender.’ Displays containing abusive invective, no matter how vicious or severe, are permissible unless they are addressed to one of the specified disfavored topics. Those who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or homosexuality—are not covered. The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects.”

In the Matal v. Tam case (2017), the Supreme Court found that a provision within the Lanham Act prohibiting the registration of trademarks that disparaged persons, institutions, beliefs, or national symbols violated the First Amendment. Justice Samuel Alito (1950 – ) opined:

“[The idea that the government may restrict] speech expressing ideas that offend … strikes at the heart of the First Amendment. Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful; but the proudest boast of our free speech jurisprudence is that we protect the freedom to express ‘the thought that we hate’.”

Justice Anthony Kennedy (1936 – ) opined:

“A law found to discriminate based on viewpoint is an “egregious form of content discrimination,” which is “presumptively unconstitutional.” … A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government’s benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society.”

In recent years, numerous calls to ban speech have been justified on the basis that it is “hateful.” Much of this has come from the political left who (in what one may cynically regard as having more to do with silencing voices of dissent than with protecting vulnerable groups) argue that restrictions on hate speech must occur if minorities are to be given equal status with everyone else.

That certain types of speech can be offensive, and that some of that speech may be aimed at certain groups of people, is undeniable. Hate speech has even been criticised for undermining democracy! In an article, Alexander Tsesis, Professor of Law at Loyola University, wrote: “hate speech is a threatening form of communication that is contrary to democratic principles.” Some have even argued that hate speech violates the fourteenth amendment to the US Constitution which guarantees equal protection under the law:

Article XIV (AMENDMENT 14 – RIGHTS GUARANTEED: PRIVILEGES AND IMMUNITIES OF CITIZENSHIP, DUE PROCESS, AND EQUAL PROTECTION)

1: All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

That there is a historical basis for restricting hate speech is undeniable. Slavery, Jim Crow, and the Holocaust, among other atrocities, were all preceded by violent and hateful rhetoric. (Indeed, incitement to genocide is considered a serious war crime and a serious crime against humanity under international law.) Genocide is almost always preceded by hate speech. However, what proponents of hate speech laws fail to realise is that the countries that perpetrated these atrocities did not extend the freedom to speak to the groups they were targeting. Joseph Goebbels (1897 – 1945), the Nazi minister for public enlightenment and propaganda, for example, had such an iron grip on Germany’s media that any voice contradicting the Nazis’ anti-Semitic propaganda had no opportunity to be heard.

But who, exactly, supports hate speech laws? Analysis of survey data taken from the Pew Research Center and YouGov reveals that it is primarily non-white, millennial Democrats. In terms of age, the Pew Research Center found that forty percent of millennials supported Government censorship of hate speech, compared to twenty-seven percent of Gen Xers, twenty-four percent of baby boomers, and only twelve percent of the silent generation.

In terms of race, research by YouGov reveals that sixty-two percent of African Americans support Government censorship of hate speech, followed by fifty percent of Hispanics, and thirty-six percent of White Americans.

In terms of political affiliation, research from YouGov taken in 2015 found that fifty-one percent of Democrats supported restrictions on hate speech, compared to thirty-seven percent of Republicans, and only thirty-five percent of independents.

The primary issue with hate speech is that determining what it does and does not constitute is very difficult. (The cynic may argue, fairly, that hate speech begins whenever a speaker expresses a view, states a fact, or offers an opinion that another person does not want others to hear.) As Christopher Hitchens (1949 – 2011) pointed out, the central problem is that someone has to be given the authority to decide what does and does not qualify.

The second issue with hate speech laws is that they can easily be used by one group to silence another. This kind of censorship is often aimed at particular groups of individuals purely for ideological and/or political purposes, with the justification that such actions increase the freedom and equality of the people the advocates claim to represent.

In Canada, Bill C-16 has sought to outlaw “hate propaganda” aimed at members of the community distinguishable by their gender identity or expression. The Bill originated with a policy paper by the Ontario Human Rights Commission which sought to determine what constituted discrimination against gender identity and expression. This included “refusing to refer to a person by their self-identified name and proper personal pronoun.” Supporters of Bill C-16 see it as an important step towards the creation of legal protections for historically marginalised groups. Detractors, however, have expressed concern that the Bill creates a precedent for Government-mandated speech.

The Canadian clinical psychologist and cultural critic, Professor Jordan Peterson (1962 – ), first came to public attention when he posted a series of YouTube videos warning of the dangers of political correctness and criticising Bill C-16. In his videos, Professor Peterson warned that the law could be used to police speech and compel individuals to use ‘transgender pronouns’ (terms like ‘ze’ and ‘zer’, among others). For his trouble, Peterson has been accused of violence by a fellow panellist on The Agenda with Steve Paikin, received two warning letters from the University of Toronto in 2016, and was denied a social research grant from Canada’s Social Sciences and Humanities Research Council.

A Nazi torch-light rally. 

Europe has been experiencing similar attempts to silence speech. A law passed in the Bundestag this year will force social media companies operating in Germany to delete racist or slanderous comments and posts within twenty-four hours or face a fine of up to €50 million. Additionally, numerous public figures have found themselves charged with hate speech crimes merely for pointing out the relationship between the large influx of non-European migrants and high crime rates, particularly in terms of rape and terrorism. One politician in Sweden was prosecuted for daring to post immigrant crime statistics on Facebook.

In Great Britain, Freedom of Information documents reveal that around twenty thousand adults and two thousand children have been investigated by the police for comments they made online. In politics, the British politician Paul Weston (1965 – ) found himself arrested after he quoted a passage on Islam written by Winston Churchill (1874 – 1965). In Scotland, a man was charged under the Communications Act 2003 with the improper use of electronic communications after he filmed his dog making a Hitler salute.

In Australia, the Herald Sun columnist, Andrew Bolt (1959 – ), was found to have contravened section 18C of the Racial Discrimination Act after he published articles accusing fair-skinned Aborigines of using their racial status for personal advantage. The law firm Holding Redlich, acting for a group of Aboriginal people, demanded that the Herald Sun retract two Andrew Bolt articles, written in April and August of 2009, and restrain Bolt from writing similar articles in the future. Joel Zyngier, who acted for the group pro bono, told Melbourne’s The Age:

“We see it as clarifying the issue of identity—who gets to say who is and who is not Aboriginal. Essentially, the articles by Bolt have challenged people’s identity. He’s basically arguing that the people he identified are white people pretending they’re black so they can access public benefits.”

Justice Mordecai Bromberg (1959 – ) found that the people targeted by Bolt’s articles were reasonably likely to have been “offended, insulted, humiliated, or intimidated.”

We need speech to be as free as possible because it is speech that allows us to exchange and critique information. It is through free speech that we are able to keep our politicians and public officials in check, critique public policy, and disseminate information. As the Canadian cognitive psychologist Steven Pinker (1954 – ) observed: “free speech is the only way to acquire knowledge about the world.” Measures taken to restrict free speech, whether the criminalisation of hate speech or anything else, are a complete contradiction of the principles upon which free Western democracies are founded.

CUBAN MISSILE CRISIS

Next Monday will mark fifty-five years since the Cuban Missile Crisis. For thirteen days, the world held its collective breath as tensions between the United States of America and the Union of Soviet Socialist Republics reached boiling point. Whoever averted the crisis would be glorified in the annals of history; whoever escalated it would be responsible for the annihilation of life on earth.

Our story begins in July 1962, when the Cuban dictator Fidel Castro (1926 – 2016) and the Soviet premier Nikita Khrushchev (1894 – 1971) came to a secret agreement to deter another US-backed invasion attempt (the US had previously backed the disastrous Bay of Pigs operation and was planning another invasion called ‘Operation Mongoose’) by planting nuclear missiles on Cuban soil. On September 4th, routine surveillance flights discovered the general build-up of Soviet arms, including Soviet IL-28 bombers. President John F. Kennedy (1917 – 1963) issued a public warning against the introduction of offensive weapons in Cuba.

Another surveillance flight on October 14th discovered the existence of medium-range and intermediate-range ballistic nuclear missiles in Cuba. President Kennedy met with his advisors to discuss options and direct a course of action. Opinions were divided between sending strong warnings to Cuba and the Soviet Union and using airstrikes to eliminate the threat, followed by an immediate invasion. Kennedy chose a third option. He would use the navy to ‘quarantine’ Cuba – a term used to legally distinguish the action from a blockade (an act of war).

Kennedy then sent a letter to Khrushchev stating that the US would not tolerate offensive weapons in Cuba and demanding the immediate dismantling of the sites and the return of the missiles to the Soviet Union. Finally, Kennedy appeared on national television to explain the crisis and its potential global consequences to the American people. Directly echoing the Monroe Doctrine, he told the American people: “It shall be the policy of this nation to regard any nuclear missile launched from Cuba against any nation in the Western Hemisphere as an attack by the Soviet Union on the United States, requiring a full retaliatory response upon the Soviet Union.” The Joint Chiefs of Staff then declared a military readiness level of DEFCON 3.

On October 23rd, Khrushchev replied to Kennedy’s letter claiming that the quarantining of Cuba was an act of aggression and that he had ordered Soviet ships to proceed to the island. When another US reconnaissance flight reported that the Cuban missile sites were nearing operational readiness, the Joint Chiefs of Staff responded by upgrading military readiness to DEFCON 2. War involving Strategic Air Command was imminent.

On October 26th, Kennedy complained to his advisors that it appeared only military action could remove the missiles from Cuba. Nevertheless, he continued to pursue a diplomatic resolution. That afternoon, ABC News correspondent, John Scali (1918 – 1995), informed the White House that he had been approached by a Soviet agent who had suggested that the Soviets were prepared to remove their missiles from Cuba if the US promised not to proceed with an invasion. The White House scrambled to determine the validity of this offer. Later that evening, Khrushchev sent Kennedy a long, emotional message which raised the spectre of nuclear holocaust and suggested a resolution similar to that of the Soviet agent: “if there is no intention to doom the world to the catastrophe of thermonuclear war, then let us not only relax the forces pulling on the ends of the rope, let us take measures to untie the knot. We are ready for this.”

Hope was short-lived. The next day Khrushchev sent Kennedy another message demanding the US remove its Jupiter missiles from Turkey as a part of any resolution. That same day, a U2 Spy Plane was shot down over Cuba.

Kennedy and his advisors now planned for an immediate invasion of Cuba. Nevertheless, slim hopes for a diplomatic resolution remained. It was decided to respond to Khrushchev’s first message. In his reply, Kennedy suggested possible steps towards the removal of the missiles from Cuba, proposed that the whole business take place under UN supervision, and promised that the US would not invade Cuba. Meanwhile, Attorney General Robert Kennedy (1925 – 1968) met secretly with the Soviet Ambassador to America, Anatoly Dobrynin (1919 – 2010). The Attorney General indicated that the US was prepared to remove its Jupiter missiles from Turkey, but that this could not be part of any public resolution.

On the morning of October 28th, Khrushchev issued a public statement: the Soviet missiles stationed in Cuba would be dismantled and returned to the Soviet Union. The United States continued its quarantine of Cuba until the missiles had been removed, lifting it on November 20th. In April 1963, the US removed its Jupiter missiles from Turkey. The world breathed a sigh of relief.

The Cuban Missile Crisis symbolises both the terrifying spectre of nuclear holocaust and the power of diplomacy in resolving differences. By creating an intolerable situation, the presence of nuclear weapons forced Kennedy and Khrushchev to favour diplomatic, rather than militaristic, resolutions. In the end, it must be acknowledged that nuclear weapons, and the knowledge and technology to produce them, will always exist. The answer, therefore, cannot be to rid the world of nuclear weapons, but to learn to live peacefully in a world that has them.

A CRITIQUE OF GLOBALISM

Kofi Annan, the former Secretary-General of the United Nations, has stated that disagreeing with globalism is like disagreeing with “the laws of gravity.” Similarly, new French President, Emmanuel Macron, another supporter of globalism, wishes to deregulate France’s ailing industry and boost freedom of movement and trade. Donald Trump’s election to the US Presidency, and the UK’s decision to leave the European Union, however, have challenged the presumed supremacy of globalism as a political force.

The roots of globalism can be traced back to the 2nd Century BC when the formation of the Silk Road facilitated the trade of silk, wool, silver, and gold between Europe and China. It wasn’t until the 20th century, however, that the idea gathered momentum. Following the Second World War, world power was to be split between America, representing the capitalist west, and the Union of Soviet Socialist Republics, representing the communist east. Following the collapse of the Soviet Union in 1991, America took it upon herself to create an undivided, democratic, and peaceful Europe.

Of course, the aim of an undivided Europe, indeed an undivided world, existed long before the collapse of the Soviet Union. In 1944, Allied delegates met at Bretton Woods, New Hampshire, to establish an economic system based on open markets and free trade. Their idea gathered momentum. Today, the International Monetary Fund, the World Bank, and the World Trade Organization all exist to unite the various national economies of the world into a single, global economy.

In 1950, the French foreign minister, Robert Schuman, proposed pooling Western Europe’s coal and steel production. Originally, Schuman’s objective had been to unite France with the Federal Republic of Germany. In the end, however, the Treaty of Paris would unite Belgium, France, West Germany, Italy, Luxembourg, and the Netherlands in the European Coal and Steel Community. By 1957, the Treaty of Rome had created the European Economic Community.

Globalism is an ideology which seeks to form a world where nations base their economic and foreign policies on global, rather than national, interests. It can be viewed as a blanket term for various phenomena: the pursuit of classical liberal and free market policies on the world stage, Western dominance over the political, cultural, and economic spheres, the proliferation of new technologies, and global integration.

John Lennon’s Imagine, speaking of ‘no countries’, ‘no religion’, and a ‘brotherhood of man’, acts as an almost perfect anthem for globalism. Your individual views on globalism, however, will depend largely on your personal definition of a nation. If you support globalism it is likely you believe a nation to be little more than a geographical location. If you are a nationalist, however, it is likely you believe a nation to be the accumulation of its history, culture, and traditions.

Supporters of John Lennon’s political ideology seem to suffer from a form of self-loathing. European heritage and culture are not seen as something worth celebrating, but as something to be dismissed. And it appears to be working: decades of anti-nationalist, anti-Western policies have stripped many European nations of their historical and cultural identities. In the UK, there have been calls to remove the statue of Cecil Rhodes – an important, yet controversial, figure. In other countries, certain areas have become so rife with ethnic violence that they are considered ‘no-go’ zones.

Perhaps it is the result of the “white man’s burden”, Rudyard Kipling’s prophetic 1899 poem about the West’s perceived obligation to improve the lot of non-Westerners. Today, many white, middle-class elites echo Kipling’s sentiments, believing it to be their duty to save the world. These people are told at charity events, at protests, at their universities, and by their media of their obligation to their ‘fellow man.’ When it comes to immigration, they believe it to be their responsibility to save the wretched peoples of the world by importing them, and their problems, to the West.

By contrast, nationalism champions the idea that nations, as defined by a common language, ethnicity, or culture, have the right to form communities based on a shared history and/or a common destiny. The phenomenon can be described as consisting of patriotic feelings, principles, or efforts, as an extreme form of patriotism characterised by feelings of national superiority, or as the advocacy of political independence. It is primarily driven by two factors: first, feelings of nationhood among members of a nation-state, and, second, the actions of a state in trying to achieve or sustain self-determination. In simplest terms, nationalism constitutes a form of human identity.

One cannot become a citizen of a nation merely by living there. Citizenship arises from the sharing of a common culture, tradition, and history. As American writer Alan Wolfe observed: “behind every citizen lies a graveyard.” The sociologist Emile Durkheim believed people to be united by their families, their religion, and their culture. In Suicide: a Study in Sociology, Durkheim surmises:

“It is not true, then, that human activity can be released from all restraint. Nothing in the world can enjoy such a privilege. All existence being a part of the universe is relative to the remainder; its nature and method of manifestation accordingly depend not only on itself but on other beings, who consequently restrain and regulate it. Here there are only differences of degree and form between the mineral realm and the thinking person. Man’s characteristic privilege is that the bond he accepts is not physical but moral; that is, social. He is governed not by a material environment brutally imposed on him, but by a conscience superior to his own, the superiority of which he feels.” – Suicide: a Study in Sociology (pg. 277)

Globalism has primarily manifested itself through economic means. In the economic sense, globalism began in the late 19th and early 20th centuries with the invention of the locomotive, the motor-car, the steamship, and the telegraph. Prior to the industrial revolution, a great deal of economic output was concentrated in certain countries. China and India combined produced around fifty percent of the world’s economic output, whilst Western Europe produced around eighteen percent. It was the industrial revolution of the 19th century, and the dramatic growth in industrial productivity, which caused Western Europe’s economic output to double. Today, we experience the consequences of globalism every time we enter a McDonald’s restaurant, call someone on our mobile phones, or use the internet.

Philip Lowe, the Governor of the Reserve Bank of Australia, told a group of businessmen and women at the Sydney Opera House that Australia was “committed to an open international order.” Similarly, the Nobel Prize-winning economist, Amartya Sen, argued that globalisation had “enriched the world scientifically and culturally, and benefited many people economically as well.” It is certainly true that globalisation has facilitated the sharing of technological, cultural, and scientific advances between nations. However, as some economists, like Joseph Stiglitz and Ha-Joon Chang, have pointed out, globalisation can also have the effect of increasing rather than reducing inequality. In 2007, the International Monetary Fund admitted that foreign capital investment in developing countries and the introduction of new technologies have had the effect of increasing levels of inequality. Countries with larger populations, lower working and living standards, more advanced technology, or a combination of all three, are in a better position to compete than countries that lack these advantages.

The underlying fact is that globalism has economic consequences. Under globalisation, there are few to no restrictions on the movement of goods, capital, services, people, technology, and information. Among the things championed by economic globalisation is the cross-border division of labour, whereby different countries become responsible for different forms of labour.

The United Nations has unrealistically asserted globalism to be the key to ending poverty in the 21st century. The Global Policy Forum, an organisation which acts as an independent policy watchdog of the United Nations, has suggested the imposition of global taxes as a means of achieving this goal. These include taxes on carbon emissions to slow climate change, taxes on currency trading to ‘dampen instability in the foreign exchange markets’, and taxes to support major initiatives like reducing poverty and hunger, increasing access to education, and fighting preventable diseases.

In one sense, the battle between globalism and nationalism can be seen as a battle between ideology and realism. Globalism appears committed to creating a ‘brotherhood of man.’ Nationalism, on the other hand, reminds us that culture and nationality form an integral part of human identity, and that they are sentiments worth protecting. The true value of globalism and nationalism comes not from their opposition, but from how they can be made to work together. Globalism has the economic benefit of allowing countries to develop their economies through global trade. It is not beneficial, however, when it devolves into open-border policies, global taxes, or attacks on a nation’s culture or sovereignty. Nationalism, by the same token, has the benefit of providing people with a national and cultural identity, as well as the benefits and protections of citizenship. Nationalism fails when it becomes so fanatical that it leads to xenophobia or war. The answer, therefore, is not to forsake one for the other, but to reconcile the two.

THE RIDDLE OF INDIVIDUAL RESPONSIBILITY

Defendants at the Nuremberg trials.

On November 20th, 1945, twenty-four leaders of the defeated Nazi regime filed into Courtroom 600 of Nuremberg’s Palace of Justice to be tried for some of the most reprehensible crimes ever committed. Over the next ten months, the world would be shocked to learn of the depth and extent of the Nazi regime’s mechanised horrors. By the end of the trial, twelve of the defendants would be sentenced to death, seven would be sentenced to periods of imprisonment, and three would be acquitted.

Contrary to what one may believe, the perpetrators of the Holocaust were not psychopaths, sadists, or otherwise psychologically disturbed individuals. Rather, their actions arose, as the psychologist Gustave Gilbert (1911 – 1977) concluded, from a culture which valued obedience. The observation that mass horror is more likely to be committed by normal men and women influenced by social conformity would later be described by Hannah Arendt (1906 – 1975) as the ‘banality of evil.’

This shouldn’t come as too much of a surprise. After all, human beings are hard-wired to obey orders from people they deem superior to themselves. In 1961, the Yale psychologist Stanley Milgram (1933 – 1984) carried out a famous experiment which explored the conflict between authority and personal conscience. Milgram’s experiment was inspired by an interview with the Commandant of Auschwitz, Rudolf Höss (1900 – 1947). Höss was asked how it was possible to be directly involved in the deaths of over a million people without suffering emotional distress. Chillingly, Höss answered that he was merely following orders.

The process of the experiment was simple. Two participants, one of whom was actually a researcher, would draw lots to decide who would take the role of teacher and who would take the role of student. (The draw, needless to say, was rigged to ensure the actual participant took the role of teacher.) The teacher and student were then separated, and the teacher was taken to a room with an electric shock generator consisting of a row of switches ranging from fifteen to four-hundred-and-fifty volts. Supervising the teacher was an experimenter in a grey lab coat (in reality, an actor). Throughout the experiment, the teacher was to ask the student questions and administer an electric shock every time the student got a question wrong. As the experiment continued, the student would deliberately give wrong answers. As the shocks got more and more severe, the student would scream and beg for mercy. When the teacher expressed concern, the experimenter would insist that the experiment continue. By the end, Milgram had found that all participants would continue to three-hundred volts, whilst two-thirds would continue to the full four-hundred-and-fifty volts when pressed.

The Nazis were able to create such obedience through a well-calculated propaganda campaign. Hitler outlined the principles of this campaign in Mein Kampf:

  1.  Keep the dogma simple. One or two points only.
  2.  Be forthright and powerfully direct – tell or order why.
  3.  Reduce concepts down to black and white stereotypes.
  4.  Constantly stir people’s emotions.
  5.  Use repetition.
  6.  Forget literary beauty, scientific reasoning, balance, or novelty.
  7.  Focus solely on convincing people and creating zealots.
  8.  Find slogans which can be used to drive the movement forward.

Similarly, Hitler’s speeches also followed a very specific and calculated formula:

  1. Hitler would unify the crowd by pointing out some form of commonality.
  2. Hitler would stir up feelings of fear and anger by pointing out some kind of existential threat.
  3. Hitler would invoke himself as the agent of a higher power.
  4. Hitler would present his solution to the problem.
  5. Hitler would proclaim the utilisation of the solution as a victory for both the higher power and the commoners.

In essence, the Nazi propaganda machine facilitated feelings of group identity and then used conformity to gain control over that group. The Nazis gambled that the majority of people would rather be beholden to a group than stand as individuals.

If there is any lesson that can be derived from the Holocaust, it is that the distance between good and evil is shorter than we like to believe. As the clinical psychologist Jordan Peterson is fond of pointing out, if the Holocaust was perpetrated by ordinary people and you are an ordinary person, the only logical conclusion is that you too are capable of horrendous evil. It is not enough to be critical of those in power; eternal vigilance means being critical of our own need to conform and obey. Our freedom depends upon it.

THE PROBLEM WITH MULTICULTURALISM

At a security conference in Germany, the former British Prime Minister, David Cameron, condemned multiculturalism as a failure. He stated: “we need less of the passive tolerance of recent years and much more active, muscular liberalism.” In a similar statement, the French president, Nicolas Sarkozy, also condemned the doctrine of multiculturalism. Sarkozy told the French people: “we have been too concerned about the identity of the person who was arriving and not enough about the identity of the country that was receiving him.” In recent years, the Western nations that have preached multiculturalism and diversity as bastions of peace and tolerance – Great Britain, France, Germany, the United States – have been the primary targets of radical Islamic terrorism.

Progressives like to believe that multiculturalism and diversity create harmonious and peaceful societies. In reality, they create division. Telling newcomers that they do not have to assimilate into their adopted culture fosters tribalism: the Irish form communities with fellow Irish, Muslims form communities with fellow Muslims, the Japanese form communities with fellow Japanese, and so forth. As these cultures, especially those lacking the fundamental roots and beliefs of their adopted countries, compete for supremacy, they inevitably conflict with one another. So, whilst Germanic and French cultures may be able to live harmoniously thanks to their shared Christian heritage, the same cultures would not fare as well if they were expected to co-exist with a culture whose central tenets are profoundly different.

Why am I harping on about the inherent faults in multiculturalism and diversity? It is because I believe we have created the greatest culture mankind has ever seen: a culture that has produced Shakespeare, Mozart, Voltaire, Plato, Aristotle, John Locke, freedom and democracy, the television, the iPhone, the movies, free market capitalism, Van Gogh, Da Vinci, Einstein, Newton, Mary Shelley, the Brontë sisters, and more. And I believe it is a culture worth protecting. And how do we protect it? We start by protecting the very things that have made the West so great in the first place: Christianity, an adherence to truth and a deep esteem for the logos, the supremacy placed on individual rights and liberties, the free marketplace of ideas and commerce, small government, and political freedom.

Moral and cultural relativism is being used to tear down and replace the existing social order. When the Mayor of London, Sadiq Khan, is able to state that “terror attacks are part and parcel of living in a big city”, and young German women are able to hold signs proudly proclaiming “will trade racists for rapists” unopposed, it is clearly time for certain ideas to go away.

ON WAR

The evolutionary biologist E.O. Wilson referred to war as “humanity’s hereditary curse.” It has become infused in our collective and individual psyches. The Iliad tells the story of the Trojan War, Shakespeare’s Henry V is centred around the Battle of Agincourt, and All Quiet on the Western Front tells of the experiences of young German soldiers on the Western Front.

The purpose of war can be split into two fields: philosophical and pragmatic. Most modern wars are fought for ideological, and therefore philosophical, reasons: capitalism versus communism, fascism versus democracy, and so forth. Richard Ned Lebow, a political scientist at the University of London, hypothesised that nations go to war for reasons of ‘national spirit.’ Institutions and nation-states may not have psyches per se, but the individuals who run them do, and it is natural for these individuals to project the contents of their psyches onto the institutions and nation-states they are entrusted with.

Rationalists, on the other hand, have another perspective. War, they argue, is primarily used by nations to increase their wealth and power: allowing them to annex new territories, take control of vital resources, pillage, rape, and so forth. Bolshevism arose in the political instability and food shortages of World War One Russia. The Nazis used the spectre of Germany’s humiliating defeat in the Great War and its treatment in the Treaty of Versailles as a stepping stone to political power. In the Ancient World, Sargon of Akkad (2334-2279BC) used war to form the Akkadian Empire, and then used war to quell invasions and rebellion. Similarly, Philip II of Macedonia (382BC – 336BC) used war to unify the city states of Ancient Greece.

Another explanation may be that we engage in war because we are naturally inclined to. War speaks to our need for group identity and to our deep predilection for conflict, and it should come as no surprise that the two are not mutually exclusive. Our strong predilection towards our own group not only makes us more willing to help other members of that group, it makes us more willing to commit evil on its behalf. Chimpanzees have been known to invade neighbouring troops of chimps and go on killing sprees, the obvious intention being to increase territory and decrease intra-sexual competition. Similarly, our own evolutionary and primitive past is fraught with violence and conflict. It should not escape our attention that history is abundant with examples of invading soldiers slaughtering men and raping women.

Like all the profound aspects of culture, war conceptualises a facet of a deeper truth. It has been central to our history and culture, capturing both the more heroic and the more frightening aspects of our individual and collective psyches. We both influence and are influenced by war.