
On Constitutional Monarchy


I would like to begin this essay by reciting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):

 

     Milton! thou shouldst be living at this hour:

            England hath need of thee: she is a fen

            Of stagnant waters: altar, sword, and pen,

            Fireside, the heroic wealth of hall and bower,

            Have forfeited their ancient English dower

            Of inward happiness. We are selfish men;

            Oh! raise us up, return to us again;

            And give us manners, virtue, freedom, power.

            Thy soul was like a star, and dwelt apart:

            Thou hadst a voice whose sound was like the sea:

            Pure as the naked heavens, majestic, free,

            So didst thou travel on life’s common way,

            In cheerful godliness; and yet thy heart

            The lowliest duties on herself did lay.

 

The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the poet and civil servant immortalised as the author of Paradise Lost.

Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s time, or even today. It is these themes of honour, duty, and nobility that provide the spiritual basis for constitutional monarchy.

It is a subject I will return to much later in this essay. But it would perhaps be more prudent to begin in earnest by examining those aspects of English history that allowed both constitutional monarchy and English liberty to be born.

The English monarchy has existed for over eleven hundred years, stretching from King Alfred the Great in the 9th century to Elizabeth II in the 21st. In that time, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. They have gifted these ideals to most of the Western world through the export of their culture to their former colonies.

It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist Peter Hitchens (1951 – ) noted in his book The Abolition of Britain that abhorrence for one’s own country is a peculiar trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”

However, these intellectuals fail to notice, in their arrogance, that “God Save the King” is actually a celebration of constitutional monarchy, not an act of reverence toward an archaic and rather powerless royal family. It celebrates the nation as embodied in a single person or family, and the fact that the common man and woman can live in freedom because constitutional restraints are placed on the monarch’s power.

If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.

At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding jurisdiction over every form of authority: political, social, economic, and so forth.

To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, formulated by the Catholic bishop Jacques Bossuet (1627 – 1704) in his book Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey their monarch without question. As such, no limitations could be put on a monarch’s power.

What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God, since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of Parliament and involving himself in cases being tried before the courts.

When King James died in 1625, he was succeeded by his second son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between him and the Parliament from which he had to seek approval to raise funds.

In 1629, Charles outraged much of the population, as well as many nobles, when he elected to fund his rule through outdated taxes and fines and stopped calling Parliament altogether. Charles had been frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money. The ensuing period would become known as the Eleven Years’ Tyranny.

By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. It asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without the consent of its members. In addition, Parliament forced the King to consent to the execution of his friend and advisor, Thomas Wentworth (1593 – 1641), the 1st Earl of Strafford, for treason.

The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a relative of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (c. 1485 – 1540). In the end, the conflict would devastate the English population and forever alter England’s political character.

The English Civil War began in January 1642 when King Charles marched on Parliament with a force of some four hundred men. After being denied entry, he withdrew to Oxford. Trouble was brewing. Throughout the summer, people aligned themselves with either the monarchists or the Parliamentarians.

The forces of King Charles and the forces of Parliament met at the Battle of Edgehill in October. What followed was several years of bitter and bloody conflict.

Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic, or “commonwealth.” The English Civil War had claimed the lives of two hundred thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of Parliament.

The powers of Parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular, and the Commonwealth created in the wake of the war collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.

The exiled Prince Charles returned from France and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the ideas of the Enlightenment, which challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.

By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety percent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.

It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics could not become members of Parliament, justices of the peace, or officers in the army, nor hold any other public office, unless they were granted a special dispensation by the King.

It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial considering Great Britain was a Protestant country) and for ignoring Parliament.

However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.

The English people grew alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certain to succeed him. So incensed was Parliament at the prospect of a Catholic on the throne that it attempted to pass the Crown to one of Charles’ Anglican relatives.

Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance ever since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earnt the moniker “Bloody Mary” for burning Protestants at the stake. During the reign of King James, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous Gunpowder Plot.

Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and for non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about an end to religious persecution, were met with considerable objection from both bishops and Europe’s Protestant monarchs.

Following the passage of the Test Act in 1673, James had briefly been forced to relinquish his offices. The Act required officers and members of the nobility to take Holy Communion as spelt out by the Church of England. It was designed to prevent Catholics from taking public office.

Now, as King, James attempted to undermine the Test Act by placing Catholics in positions of power. His Court featured many Catholics, and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.

Nor was that the limit of James’ activities. He issued two Declarations of Indulgence, ordering them to be read from every pulpit on two successive Sundays, and put those who opposed them on trial for seditious libel. Additionally, he imprisoned seven bishops for opposing him, made sweeping changes to the Church of England, and built an army composed mainly of Catholics.

The people permitted James II to rule so long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot: the child, the theory went, was not the King’s son at all, but a substitute who had been smuggled into the birthing chamber in a bed-warming pan.

In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.

James dissolved Parliament in July 1688. Around this time, a bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702), inviting them to raise an army, invade London, and seize the throne. They accepted.

William landed at Torbay in Devon on Guy Fawkes’ Day, accompanied by an army of fifteen thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.

Things were not looking good for James, however. Large parts of his officer corps defected to the enemy, taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates began declaring for William, and James’ own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels in Yorkshire. James, abandoned by everyone, fled into exile in France. He would die there in 1701.

On January 22nd, 1689, William called the first ‘convention parliament.’ At this convention, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. And second, it was declared against public policy for the throne to be occupied by a Catholic. As such, the throne passed over James Francis Edward Stuart, and William and Mary were invited to take the Crown as co-monarchs.

They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.

Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.

The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, elected monarchs. It established the concept of Parliamentary sovereignty, granting that political body the power to make or unmake any law it chose. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.

The British philosopher Roger Scruton (1944 – ) described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.

Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).

In The Need for Roots, the French philosopher Simone Weil (1909 – 1943) wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed the British to retain their tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs upon abolishing their monarchies.


Great Britain’s great legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.

The word “Commonwealth” was adopted at the Sixth Imperial Conference held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by former British Prime Minister Arthur Balfour (1848 – 1930), was designed to look into future constitutional arrangements within the Commonwealth.

The committee delivered the Balfour Report at that same conference. It stated:

“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”

It continued:

“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”

Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.

The Statute of Westminster gave legal recognition to the de-facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand and South Africa were granted legal independence.

Furthermore, the law freed the Dominions from the Colonial Laws Validity Act 1865, a law which had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to that Act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”

The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.

More than forty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent of Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.

What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:

“Her Majesty’s Representative in each State shall be the Governor.

Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.

Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.

While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.

The advice of Her Majesty in relation to the exercise of powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”

These two laws dispel an important misconception that is often exploited by Australian republicans: the myth that Australia lacks legal and political independence because its Head of State is the British monarch. The passage of the Statute of Westminster in 1931 and the Australia Act in 1986 ended any real political or legal power the British government had over Australia.

In Australia, the monarch (who is our head of state by law) is represented by a Governor-General. This individual – who has been an Australian since 1965 – is required to take an oath of allegiance and an oath of office administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure, with appointments typically lasting five years.

The monarch appoints the Governor-General by letters patent on the advice of Australian ministers. Prior to 1924, Governors-General were appointed on the advice of both the British government and the Australian government, because the Governor-General at that time represented both the monarch and the British government. This arrangement changed, however, at the Imperial Conferences of 1926 and 1930. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown alone.

The Governor General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, such an arrangement would have the effect of politicising and thereby corrupting the Head of State. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to those donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that they can avoid the snares that trap other political actors.


The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003) – Deputy Prime Minister, Treasurer, and intellectual leader of the political left – and Rex Connor (1907 – 1977) out of the cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 29th federal parliament was dissolved.

The Loans Affair began when Rex Connor attempted to borrow money – up to US$4 billion – to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state representatives which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek the loan without the Council’s approval.

When news of the Loans Affair was leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and that the Loan Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.

Despite this, Connor continued to liaise with the Pakistani financial broker Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by Herald journalist Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan with an interest rate of 7.7% and a 2.5% commission for Khemlani. The claim threw serious doubt on Dr. Jim Cairns’ claim that the government had not offered Khemlani a commission on the loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.

Unfortunately, Khemlani stalled on the loan, most notably when he was asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loan activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.

The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal Party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.

And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him, proving that the two had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).

By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam budget on October 16th. Whitlam warned Fraser that the Liberal party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medicare rebates, the budgets of government departments and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”

On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, Treasurer Bill Hayden (1933 – ) reintroduced the budget bills and warned that further delay would increase unemployment and deepen the recession that had blighted the Western world since 1973.

The crisis would come to a head on Remembrance Day 1975. For weeks, Whitlam had asserted that the Senate could not force him into an election, claiming that the House of Representatives had an independence and an authority separate from the Senate’s.

Whitlam had decided that he would end the stalemate by seeking a half-senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991) had been seeking legal advice from the Chief Justice of the High Court on how he could use his Constitutional Powers to end the deadlock. Kerr had come to the conclusion that should Whitlam refuse to call a general election, he would have no other alternative but to dismiss him.

And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.

Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills, and Fraser was drafting the documents that would dissolve the Parliament.

At 2pm, Deputy Prime Minister Frank Crean (1916 – 2008) defended the government against a censure motion moved by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country?”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.

At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he moved that the “House expresses its want of confidence in the Prime Minister and requests, Mr. Speaker, forthwith to advise His Excellency, the Governor-General, to call the member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.

The Speaker, Gordon Scholes (1931 – 2018), expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam was destined to lose: Scholes was unable to arrange an appointment until quarter to five in the afternoon.

Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. At ten to four, Fraser left for Government House. Ten minutes later, Sir John Kerr signed the proclamation dissolving both Houses of Parliament and set the date of the coming election for December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.

The combination of economic strife, political scandal, and the dismissal signed the death warrant for Whitlam’s government. At the 1975 federal election, the Liberal-National coalition won in a landslide, taking ninety-one seats and a popular vote of 4,102,078. In the final analysis, it seems the Australian people agreed with Kerr’s decision and voted to remove Whitlam’s failed government from power once and for all.


Most of the arguments levelled against constitutional monarchies can be described as petty, childish, and ignorant. The biggest faux pas those who oppose constitutional monarchies make is a failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to disagree with constitutional monarchy. It would be as if I decided to argue for the abolition of the office of Prime Minister just because I didn’t like the person who held that office.

One accusation frequently levelled against the monarchy is that it is an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year, and that the German Presidency costs only £26 million. However, it is not true that all monarchies are necessarily more expensive than presidencies. The Spanish monarchy costs only £8 million per year, less than the presidencies of Germany, Finland, and Portugal.

Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”

History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.

The referendum was held on November 6th, 1999. Around 99.14% of the Australian voting public – 11,683,811 people – participated. 45.13% (5,273,024 people) voted yes; 54.87% (6,410,787 people) voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.

All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.

Margit Tavits of Washington University in St. Louis once observed that “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.

Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but, unlike a President, that is generally where his power ends. It is rather ironic that, as the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted, Presidents are more willing than monarchs to allow governments to change without democratic input like elections. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.

The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. They are an effective means of bridging past and future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and I may survive to see the coronation of King George VII (though he will certainly outlive me).

It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things also have a positive effect on the economy. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies had a positive impact on economies and living standards over the long term. The study, which examined data from one hundred and thirty-seven countries, including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government would not interfere with their property in constitutional monarchies than in republics. As a consequence, they were more willing to invest in their respective economies.

When Wordsworth wrote his ode to Milton, he was mourning the loss of the chivalry he felt had once pervaded English society. Today, the West is once again in serious danger of losing the two things that give it a connection to the chivalry of the past: a belief in God and a submission to a higher authority.

Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.

It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.

The Christian apologist and novelist C.S. Lewis (1898 – 1964) once argued that most people no more deserve a share in governing a hen-roost than a share in governing a nation:

“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”

The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384 BC – 322 BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because there is no one person who is perfect enough to be a master.

Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth that inevitably lead to tyranny). Legal and economic equality, however, does not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.

Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.

Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.

The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”

Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.

It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy: the monarch could no longer rule without the consent of Parliament, Parliament was established as the supreme source of law, and it was empowered to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchy enriches our souls because it awakens in us the need for both freedom and obedience.

Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.

OUR OBSESSION OVER FOOD IS RIDICULOUS


Sometimes a civilisation can become so sophisticated that it believes it can overcome truth. We have become one of those civilisations. As a consequence of our arrogance, we have come to believe that we can circumvent some of the most fundamental truths about reality. We blame inequality on the social structure even though most social animals live in hierarchies. We believe that primitive people are noble even though mankind in its primitive state is more violent than at any other stage. And we believe that we can change the way human beings eat despite the fact that it is making us unhappy.

It is our modern obsession with diet and exercise that I would like to focus on. This obsession has arisen from a society that is too safe, too free, and too prosperous for its own good. This is not to say that safety, freedom, and prosperity are bad things. Indeed, we should get down on our knees and thank God every day that we live in a country that has these things. However, it is also true that too much safety, freedom, and prosperity breeds passivity and complacency. The hardships our ancestors faced – war, poverty, disease – are no longer problems for us, and so we lack the meaning that such hardships bring to a life. As a result, we have come to invent problems. Among these has been a tendency to brand the consumption of certain foods as unhealthy, unethical, or both.

Our modern obsession with food is causing significant personal problems. On the one hand, the ease with which food – especially food laden with sugar – can be obtained is causing a rise in obesity. (Note: I am using the word ‘obesity’ as a blanket term for people who are overweight). It is a uniquely modern problem. Our ancestors never battled weight gain because they could only find or afford enough food to keep themselves and their families from starving. Now the quantity, cheapness, and, in many cases, poor quality of food means that the fattest amongst us are often also the poorest. But obesity is less a problem arising from food and more a problem arising from laziness and gluttony. (Naturally, I am excluding health problems and genetic disorders from this conclusion).

On the other hand, however, our obsession with being skinny or muscle-bound is also causing problems. I have seen plenty of people who are clearly overweight. In rare cases, I have even seen people so morbidly obese that it can only be described as breathtaking. However, I have also seen women (and it is primarily women, by the way) who can only be described as unnaturally thin. It is as though our society, having realised that being overweight is unhealthy, has decided that its opposite must be good. It isn’t. Just right is just right.

And it’s not just individuals who subject themselves to this kind of self-imposed torture, nor is it limited to people in the here and now. In 1998, The Independent reported that many doctors in the United Kingdom were concerned that well-meaning parents were unintentionally starving their children to death by feeding them low-fat, low-sugar diets. These children were said to be suffering from the effects of “muesli-belt nutrition”: they had become malnourished because either they or their parents had become obsessed with maintaining a low-fat, low-sugar, low-salt diet. The article reported: “Malnutrition, once associated with slums, is said to have become an increasing problem for middle-class families in the past fifteen years. The victims of so-called ‘muesli-belt nutrition’ are at risk of stunted growth, anaemia, learning difficulties, heart disease and diabetes.”

Our obsession over diet is really a sign of how well-off our society is. Our ancestors had neither the time nor the resources to adhere to the kind of crazy-strict diets that modern people, in their infinite stupidity, decide to subject themselves to. It is high time we stopped obsessing over food and got a grip.

TRANSGENDERISM IS NO BASIS FOR PUBLIC POLICY


It has been over fourteen years since David Reimer, the victim of an insane and evil scientific experiment, committed suicide. After his penis was burnt off in a botched circumcision, David’s parents turned to the infamous sexologist and social constructionist Dr. John Money for help. Following Dr. Money’s advice, David’s parents agreed to allow a sex-change operation to be performed on their young son and raised him as a girl.

Despite Dr. Money’s boasts that his experiment had been a success, David Reimer did not settle comfortably into his female identity. David tore up his dresses at three, asked if he could have his head shaved like his father, and engaged in all manner of boyish behaviour. He was bullied at school and, upon hitting puberty, decided that he was a homosexual (in reality, of course, he was heterosexual).

Finally, when he was fourteen, David’s parents revealed the truth about his gender identity. David reverted to his masculine identity, broke off contact with Dr. Money (whom he described as an abusive brainwasher), and received a non-functioning penis through phalloplasty. Unable to handle the immense psychological damage that had been inflicted upon him, David Reimer blew his brains out with a shotgun at the age of thirty-eight.

For all of human history, boy has meant boy and girl has meant girl. Traditionally, sex was used to refer to the biological markers of gender. If you were born with a penis and XY chromosomes, you were a man. If you were born with a vagina and XX chromosomes, you were a woman. One’s gender expression was thought to complement one’s biological sex: a biological man would have masculine personality traits and a biological woman would have feminine personality traits. These complementary characteristics, among them body shape, dress, mannerisms, and personality, were thought to be produced by a mixture of natural and environmental forces.

Recently, however, gender theorists have begun to question the relationship between biological sex and gender identity. They argue that gender, which they see as distinctive from sex, is a social construct. Since gender refers to the expression of masculinity and femininity, gender is something that a person acquires. (Needless to say, this movement is driven by a pernicious post-modern, Neo-Marxist worldview). Under this philosophy, gender expression is the manner in which a person expresses their gender identity. Gender identity is expressed through dress, behaviour, speech, and nothing else besides.

Neuroplasticity provides the gender theorist with perhaps his greatest argument. If underlying brain processes are theoretically strengthened through repetitive use, it follows that gender identity comes from a narrowing down of potential gender categories through the repetitive use of certain brain processes. However, it also reveals a fatal flaw in the gender theorist’s (and social constructionist’s) philosophy. If the human brain is so malleable that an individual’s gender identity is constructed, then why can’t the brain of a transgender person be adapted out of its transgenderism?

The primary problem with gender theory is that it is just plain wrong. The idea that gender is distinct from sex has absolutely no basis in science whatsoever. As Jordan Peterson, the Canadian psychologist, has stated: “the idea that gender identity is independent of biological sex is insane. It’s wrong. The scientific data is clear beyond dispute. It’s as bad as claiming that the world is flat.” Men and women differ at both the cellular and the temperamental level. Unlike men, for example, women menstruate, can have babies, and show a slew of personality characteristics that mark them as different from men. David C. Page, the Director of the Whitehead Institute at the Massachusetts Institute of Technology, has even claimed that genetic differences exist at the cellular level, asserting that “throughout human bodies, the cells of males and females are biochemically different.” These differences even affect how men and women contract and fight diseases.

The philosopher Alain de Benoist has also strongly criticised gender theory, cataloguing its scientific errors and philosophical absurdities in his work Non à la théorie du genre (No to Gender Theory).

First, De Benoist pointed out that gender theorists have used the fact that some gender characteristics are socially constructed to argue that all such characteristics are socially constructed.

Second, De Benoist argued that the “hormonal impregnation of the foetus” (as he puts it) causes the brain to become genderised, because it has a “direct effect on the organisation of neural circuits, creating a masculine brain and a feminine brain, which can be distinguished by a variety of anatomical, physiological, and biochemical markers.”

Third, De Benoist argued that biological sex has a profound effect on the way people think, act, and feel. In order to support their theory, gender theorists are forced to deny the natural differences between men and women. De Benoist wrote:

“From the first days of life, boys look primarily at mechanized objects or objects in movement while girls most often search for visual contact with human faces. Only a few hours after birth, a girl responds to the cries of other infants while a boy shows no interest. The tendency to show empathy is stronger in girls than in boys long before any external influence (or “social expectations”) have been able to assert themselves. At all ages and stages of development, girls are more sensitive to their emotional states and to those of others than boys … From a young age, boys resort to physical strategies where girls turn to verbal ones … From the age of two, boys are more aggressive and take more risks than girls.”

Furthermore, gender theory cheapens what it means to be a man or a woman. And, by extension, it denigrates the contributions each gender makes to civil society. Gender values give people ideals to strive for and help them determine the rules that govern human interactions. The idea that men and women ought to be treated exactly the same is ludicrous beyond belief. No parent would like to see their son treat a woman the way he treats his male friends. Men have been taught to be gentlemen and women have been taught to be ladies for a reason.

All of this is not to say, however, that those pushing for transgender rights do not have a case. They are right when they claim that the transgender people of the world face discrimination, prejudice, and violence. Some countries treat transgenderism as a crime, and it is certainly true that transgender people are more likely to be victims of violence, including murder. A reasonable transgender rights argument would be that transgender people cannot help their affliction and that society ought to treat them with kindness, tolerance, and compassion.

Unfortunately, that is not the argument that gender activists like to make. Rather than focusing on promoting tolerance, gender activists have instead sought to do away with gender distinctions altogether (which is, more likely than not, their actual aim). Using a very tiny minority of the population as their moral basis, the gender activists are attempting to force society to sacrifice its traditional classifications of male and female.

Transgenderism is clearly a mental health disorder. In the past, it was referred to as ‘gender identity disorder’, considered a mental illness, and treated as such. To assert that transgenderism is a mental health disorder is not to deny an individual’s integral worth as a human being. It is merely to acknowledge the existence of an objective reality in which gender is both binary and distinct. Unfortunately, this is not the attitude of those who influence public opinion. Consequently, programs for LGBTQ youth have seen an increase in youth who identify as transgender. The transgender journalist Libby Down Under has blamed instances of rapid-onset gender dysphoria on the normalisation of transgenderism in the culture. With a slew of celebrities coming out as transgender (former Olympian Bruce Jenner being a primary example), and with transgender characters being featured on numerous television shows, many teens and tweens have suddenly decided that they are transgender despite having no prior history of gender confusion.

Transgender youth increasingly feel that it is their right to express themselves however they please. And they feel that it is their right to silence all who dare to criticise or disagree with that expression. Cross-living, hormone therapy, and sex reassignment surgery are seen as part of this self-expression. Alarmingly, the mainstream response of psychotherapists to these children and adolescents is the “immediate affirmation of [their] self-diagnosis, which often leads to support for social and even medical transition.”

It is a classic case of political posturing overshadowing the pursuit of truth. Most youth suffering from gender dysphoria grow out of it. Dr. James Cantor of the University of Toronto has cited three large-scale studies, along with other smaller studies, showing that transgender children eventually grow out of their gender dysphoria. The Diagnostic and Statistical Manual of Mental Disorders (5th edition) states that desistance rates for gender dysphoria are seventy to ninety percent in “natal males” and fifty to eighty-eight percent in “natal females.” Similarly, the American Psychological Association’s Handbook of Sexuality and Psychology concludes that the vast majority of gender dysphoria-afflicted children learn to accept their gender by the time they have reached adolescence or adulthood.

It is not a secret that transgenderism lends itself to other mental health problems. Forty-one percent of transgender people have either self-harmed or experienced suicidal ideation (this percentage, of course, does not reveal at what stage of transition suicidal ideation or attempts occur). The postmodern, neo-Marxist answer to this problem is that transgender people are an oppressed minority and that they are driven to mental illness as a result of transphobia, social exclusion, bullying, and discrimination.

It is typical of the left to presume that society is to blame for an individual’s suffering. And to a certain extent, they are right. Transgender people are the victims of discrimination, prejudice, and violence. But it is more than likely that these abuses exacerbate their problems rather than cause them. One in eight transgender people, for example, rely on sex work or drug work to survive. Is that the fault of society or of the individual? The National Center for Transgender Equality claims that it is common for transgender people to have their privacy violated, to experience harassment and physical and sexual violence, and to face discrimination when it comes to employment. They claim that a quarter of all transgender people have lost their jobs and three-quarters have faced workplace discrimination because of their transgender status.

In Australia, there has been a move to allow transgender children access to hormone-blocking drugs and sex-change surgeries. Australian gender activists – surprise, surprise – support the idea as a way to reduce the rate of suicide among transgender people. The Medical Journal of Australia has approved the use of hormone therapy on thirteen-year-olds despite the fact that the scientific community remains, as of 2018, undecided on whether puberty-blocking drugs are either safe or reversible.

In the United States, a great deal of debate has occurred over transgender rights. In particular, there have been debates over which bathrooms transgender people should be allowed to use, how they should be recognised on official documents, and whether they should be allowed to serve in the military. In 2016, former President Barack Obama ordered state schools to allow transgender students to use whatever bathroom they desire. Similar ordinances have been passed in hundreds of cities and counties across the United States. Seventeen states and the District of Columbia are subject to ‘non-discrimination’ laws that cover gender identity and gender expression, extending to restrooms, locker rooms, and change rooms.

In March of 2016, North Carolina passed a law which required people in government buildings to use the bathroom appropriate to their biological gender. The US Federal Government decried the decision as bigotry and accused the government of North Carolina of violating the Civil Rights Act. The Federal Government threatened to withhold over US$4 billion in education funding. The government of North Carolina responded by filing suit against the government of the United States. The US government responded by filing suit against North Carolina. North Carolina received support from Mississippi, Tennessee, and Texas whilst Washington received support from most of the northern states.

Pro-transgender bathroom policies are not limited to government, however. Many businesses in the United States have similar bathroom policies. Many large corporations, among them Target, allow transgender people to use the bathroom of their choice. And they are perfectly prepared to enforce these policies, as well. A Macy’s employee in Texas was fired after he refused to allow a man dressed as a woman to use the female change rooms. Similarly, Planet Fitness revoked the membership of a woman who complained that a transgender man was in the female change rooms.

The most alarming trend of the gender theory movement is the attempt to indoctrinate children through changes to the education system. In 2013, France unleashed the ABCD de l’égalité (the ABCs of Equality) on six hundred elementary schools. In their own words, the program was designed to teach students that gender was a social construct:

“Gender is a sociological concept that is based on the fact that relations between men and women are socially and culturally constructed. The theory of gender holds that there is a socially constructed sex based on differentiated social roles and stereotypes in addition to anatomical, biological sex, which is innate.”

The creators of the program are smart enough to include the disclaimer: “biological differences should not be denied, of course, but those differences should not be fate.”

Fortunately, it would seem that many people are not taken in by this race to fantasyland. They do not believe that the program merely exists to combat gender stereotypes and teach respect, and they have protested. The French Minister of Education dismissed the protestors by saying that they “have allowed themselves to be fooled by a completely false rumour… at school we are teaching little boys to become little girls. That is absolutely false, and it needs to stop.” In America, The Boston Globe dismissed the protests against the program as being motivated by fear. Judith Butler even went so far as to say that France’s financial instability was the true cause of the protests.

And such a profound misuse of the education system isn’t limited to France, either. In Scotland, teachers are given guidance by LGBT Youth Scotland, children are expected to demonstrate “understanding of diversity in sexuality and gender identity”, and children are allowed to identify as either a girl or boy, or neither. The government of the United Kingdom has mandated that transgender issues be taught as part of the sex and relationships curriculum in primary and secondary school. Justine Greening, the education secretary, said: “it is unacceptable that relationships and sex education guidance has not been updated for almost twenty years especially given the online risks, such as sexting and cyberbullying, our children and young people face.”

It is in Australia, however, that there is the most shocking case of gender theory indoctrination. A great deal of controversy has been generated over the Safe Schools program. The program, which was established by the Victorian government in 2010, is supposedly designed to provide a safe, supportive, and inclusive environment for LGBTI students. It states that schools have the responsibility to challenge “all forms of homophobia, biphobia, transphobia, intersexism to prevent discrimination and bullying.”

The Safe Schools program promotes itself as an anti-bullying resource supporting “sexual diversity, intersex and gender diversity in schools.” It requires Victorian schools to eliminate discrimination based on gender identity, intersex status, and sexual orientation, and to foster an inclusive school environment.

The program addresses the issues of sleeping and bathroom arrangements and dress code. In terms of dress code, the program states:

“An inflexible dress code policy that requires a person to wear a uniform (or assume characteristics) of the sex that they do not identify with is likely to be in breach of anti-discrimination legislation including under the Equal Opportunity Act (1984) SA”

Likewise, the program states on the issue of bathrooms and change rooms that “transgender and diverse students should have the choice of accessing a toilet/changeroom that matches their gender identity.” In addition, the program states:

“Schools may also have unisex/gender neutral facilities. While this is a helpful strategy for creating an inclusive school environment for gender diverse students broadly, it is not appropriate to insist that any student, including a transgender student, use this toilet if they are not comfortable doing so.”

The idea that a transgender boy or girl should be allowed to sleep, shower, and defecate in the same place as a group of boys or girls ought to ring alarm bells for everyone. It increases the risk of sexual activity, sexual assault, pregnancy, and the transmission of sexually transmitted diseases. There is a reason why schools segregate changerooms, toilets, and dormitories.

The tragedy of David Reimer reveals just how dangerous it is to ignore the truth in favour of a false and malevolent social philosophy. It is one thing to seek tolerance and compassion for those in the community who may be struggling with their identity. It is something else entirely to use the plight of transgender people as a means of coercing society into changing the way it categorises gender. And it is completely insane to allow a false philosophy like gender theory to be used as the basis of public policy. If we don’t want more tragedies like David Reimer’s, we should put gender theory out in the trash where it belongs.

Language Matters


“What’s in a word?”, asks Michael J. Knowles (1990 – ), host of the Michael Knowles Show, in a Prager University YouTube video entitled “Control the Words, Control the Culture.”

Knowles asks the viewer to consider the difference between an illegal immigrant and an undocumented immigrant, or the difference between a Christmas tree and a holiday tree. The answer, he tells us, lies in semantics. It is not the objects in themselves that are different, but the words used to define and describe them.

The manner in which we define and describe different things has a powerful effect on the way we view them. Our thoughts are processed and articulated through words. And it is through this articulation that our worldview is formed.

Language, therefore, is a vital cornerstone of civilisation. When it is used properly, it leads people towards truth and reason. But when it is abused, it leads people towards lies and irrationality.

The Judeo-Christian tradition is based upon written and verbal articulation. God’s first act of creation is the verbal commandment “let there be light.” Moses is commanded to write down the Ten Commandments. And Jesus Christ, the Messiah, is described as “the word of God made flesh.”

The left has come to realise that they can use language to manipulate the way people think. Their domination of academia, culture, and the media has ensured that their definitions and descriptors are the ones accepted within the larger culture.

The left controls language by using euphemisms to distort and obscure facts. These euphemisms make it easier for lies to be accepted by the larger populace.

Through their perversion of language, the left has already been able to engineer significant social change. Would society have accepted gay marriage had marriage not been redefined from its original meaning: the union of husband (man) and wife (woman)? And would society have been so ready to accept abortion if those being killed were referred to as unborn babies and not as foetuses?

And the left continues to use language as a means to engineer social change. They refer to policies that favour groups based upon arbitrary factors such as race, gender, or sexuality as “social justice.” But to be just means to have “the quality of being fair and reasonable.” In reality, there is nothing just about the policies that comprise “social justice.”

Likewise, policies that unfairly favour non-white, non-male, and non-heterosexual individuals in academia and the workforce are referred to, alternatively, as positive discrimination or affirmative action. In reality, such practices are simply discrimination.

Intellectual conformity is enforced in the name of “diversity”, opposing points of views are censored in the name of “tolerance”, and voices of dissent are silenced because they are dismissed as “hate speech.”

When you control the words, you control the culture. And when you control the culture, you control the future of a civilisation.

Conservatives Don’t Care About Culture, Maybe It’s Time They Started To


Culture is more important than politics. However, in the hierarchy of priorities, many conservatives rank it somewhere between checking their privilege and meeting diversity and inclusion quotas. They simply do not see it as being of any importance.

Conservatives mistakenly believe that culture is less important than politics and economics. In their minds, culture is akin to leisure, something relegated to times of relaxation. However, as the late Andrew Breitbart (1969 – 2012) was fond of pointing out: politics is downstream of culture. It is culture – art, film, theatre, literature, sports, video games, news media, and comic books, among other things – that informs public opinion long before policy is announced to the public or even made.

The left has realised this. They have made it a key aspect of their long-term strategy to dominate the culture and exclude conservatives. They have spent decades infiltrating the halls of culture, politics, and academia with little to no opposition from conservatives who, much to their detriment, have failed to realise the importance of these institutions.

To understand the importance of culture, it is necessary to understand what culture is. Culture communicates ideas through art, literature, film, and so forth. It is through culture that ideas and beliefs are popularised or dismissed. And it is from culture that our worldview is formed.

The difference between left-wing culture and right-wing culture is that left-wing culture expresses false ideas, whilst the ideas expressed by right-wing culture tend to be truthful.

Just take a look at conservative art compared with left-wing art. Left-wing art champions communism: a political ideology that has killed and enslaved tens of millions of people. Conservative art champions Christian values, honour, patriotism, love, and freedom. The Brady Bunch featured a two-parent family (admittedly blended, but that doesn’t really matter) and espoused the virtues of duty, honour, and responsibility, whereas a show like Gilmore Girls glorified single motherhood and self-centredness.

If conservatives wish to promote good and truthful ideas, they must be prepared to invest more in the culture. They must be prepared to create businesses, establish grants, and more in order to finance and distribute conservative art. In doing so, they can prevent left-wing censorship and can ensure that good, truthful ideas continue to be promoted.

The War On Christmas


In 2015, the then-Presidential candidate, Donald Trump (1946 – ) called for a boycott of Starbucks after the famous coffee shop chain failed to include the words “Merry Christmas” on their annual Christmas cups. “Did you read about Starbucks?”, Trump asked a rally in Springfield, Illinois. “No more ‘Merry Christmas’ on Starbucks. Maybe we should boycott Starbucks.”

Two years later, Donald Trump, now President of the United States, doubled down on his pro-Christmas message. Speaking at a Christian Public Policy conference, the President stated:

“We’re getting near that beautiful Christmas season that people don’t talk about anymore. They don’t use the word ‘Christmas’ because it’s not politically correct.”

Trump continued:

“You go to department stores and they’ll say, ‘Happy New Year’, or they’ll say other things and it’ll be red, they’ll have it painted. But they don’t say it. Well, guess what? We’re saying ‘Merry Christmas’ again.”

The sentiment that there is a War on Christmas designed to push the religious holiday out of public consciousness carries a great deal of validity. Since 2000, the Becket Fund has listed the biggest Christmas scrooges in American public life, giving the worst offenders an ‘Ebenezer Award.’

In 2000, city manager of Eugene, Oregon, Jim Johnson was given the Ebenezer Award after he issued a five-page memo banning Christmas trees from any “public space” in the city.

In 2011, the Ebenezer Award was given to the United States Post Office after it enforced a policy preventing people from singing Christmas carols on government property. This decision stands in direct contradiction to the commandment of its founder, Benjamin Franklin (1706 – 1790), to “always live jollily; for a good conscience is a continual Christmas.”

In 2014, the City of Sioux Falls was given the Ebenezer Award after they threatened to repaint and censor snowploughs that featured artwork celebrating the religious nature of Christmas.

In 2015, the Ebenezer Award was given to the Department of Veteran Affairs after they banned their employees at their Salem, Virginia facility from saying ‘Merry Christmas.’

The problem is not unique to the United States, either. During an interview with 2GB Radio, Peter Dutton (1970 – ), Australia’s minister for immigration and border protection, became incensed after a caller informed him that there had not been any Christmas carols in a performance at his grandchild’s school. The caller informed Dutton that the school in question, Kedron State High School, had replaced the lyric “we wish you a Merry Christmas” with “we wish you a happy holiday.” Dutton replied: “You make my blood boil with these stories. It is political correctness gone mad and I think people have just had enough of it.”


I believe that the drive to remove the more traditional and religious aspects from holidays like Christmas and Easter is indicative of a larger attempt to abolish the influence of Christianity on society and culture.

The problem with this, needless to say, is that it is akin to chopping down a tree and still wishing to enjoy its fruits. It is not possible to enjoy the fruits of Western culture and civilisation when its ideological origins and overarching philosophical-cum-theological structures have been removed. Christianity and Western civilisation are inextricably linked. The poet, T.S. Eliot (1888 – 1965), wrote in Notes Towards the Definition of Culture (1948) that “to our Christian heritage we owe many things besides religious faith. Through it we trace the evolution of our arts, through it we have a conception of Roman Law which has done so much to shape the Western world, through it we have our conception of private and public morality.”

The War on Christmas is an attack on the very fabric of Western civilisation. Christmas symbolises the central axiom upon which our culture was built: that the Universe was constructed to have a natural and moral order. The War on Christmas is not merely an attack on Judeo-Christian belief, nor is it merely an attack on Western culture; it is an attack upon truth itself. And the truth cannot prosper while those who believe in it are unwilling to defend it.

BUSTING THE MYTH OF THE DARK AGES


Is there any other time in history more maligned than the Middle Ages? Our modern conception of the so-called “dark ages” is that it was a time characterised by superstition, barbarity, oppression, and ignorance, with a few outbreaks of plague thrown in just to make things interesting.

This view has been helped along by numerous so-called educational resources. The BBC’s Bitesize website, for example, takes a leaf from certain 19th-century British historians, the type who saw Catholics as ignorant and childish, and caricatures Medieval peasants as “extremely superstitious” individuals who were “encouraged to rely on prayers to the saints and superstition” for guidance through life. It even accuses the Catholic Church of stifling human thought and impeding technological development.

This is not, however, the view of many serious historians and academics. As Professor Ronald Numbers of the University of Wisconsin–Madison explains:

“Notions such as: ‘the rise of Christianity killed off ancient science’, ‘the medieval Christian Church suppressed the growth of the natural sciences’, ‘the medieval Christians thought that the world was flat’, and ‘the Church prohibited autopsies and dissections during the Middle Ages’ [are] examples of widely popular myths that still pass as historical truth, even though they are not supported by historical research.”

In reality, the Middle Ages saw advances in law, politics, the sciences, theology, philosophy, and more. It saw the birth of the chartered town, which ushered in the tradition of local self-governance. The existence of a strong papacy laid the foundations of limited political power as it prevented monarchs, who justified their power through their so-called “unique” relationship with God and the Church, from monopolising power. This symbolic limitation on monarchical power was manifested in the Magna Carta (1215) and the birth of the English Parliament.

The people of the Middle Ages produced magnificent Gothic cathedrals and churches. Many medieval monks became patrons of the arts, and many were even artists themselves. In literature, the Middle Ages saw Dante’s Divine Comedy and Geoffrey Chaucer’s Canterbury Tales. In music, the Middle Ages laid the foundation of Western classical music and saw the development of musical notation, Western harmony, and many of the Christmas carols we know and love today.

Likewise, the Carolingian Renaissance of the 8th and 9th centuries saw advancements in the study of literature, architecture, jurisprudence, and theology. Medieval scholars and scientists, many of whom were monks and friars, studied natural philosophy, mathematics, engineering, geography, optics, and medicine.

In the spirit of intellectual and spiritual enlightenment, many universities were founded, including Oxford University, Cambridge University, and the University of Cologne. These universities educated their students in law, medicine, theology, and the arts. In addition, the period saw the foundation of many schools, and many early Christian monasteries were committed to the education of the common people.

The Middle Ages saw advances in science, literature, philosophy, theology, the arts, music, politics, law, and more. Its legacy is all around us: in the limitations placed on the powers of governments, in the music we listen to, and in the tradition of education many of us have benefited from. In an era of political correctness, perhaps we should be wondering whether it is we who are living in the “dark ages.”

THE PROBLEM WITH MULTICULTURALISM


At a security conference in Germany, the former British Prime Minister, David Cameron, condemned multiculturalism as a failure. He stated: “we need less of the passive tolerance of recent years and much more active, muscular liberalism.” In a similar vein, the French president, Nicolas Sarkozy, condemned the doctrine of multiculturalism, telling the French people: “we have been too concerned about the identity of the person who was arriving and not enough about the identity of the country that was receiving him.” In recent years, the Western nations that have preached multiculturalism and diversity as the foundations of peace and tolerance – Great Britain, France, Germany, the United States – have been the primary targets of radical Islamic terrorism.

Progressives like to believe that multiculturalism and diversity create harmonious and peaceful societies when, in reality, they create division. Telling newcomers that they do not have to assimilate into their adopted culture fosters tribalism: the Irish form communities with fellow Irish, Muslims with fellow Muslims, the Japanese with fellow Japanese, and so forth. As these cultures, especially those lacking the fundamental roots and beliefs of their adopted countries, compete for supremacy, they inevitably come into conflict with one another. So, whilst Germanic and French cultures may be able to live together harmoniously thanks to their shared Christian heritage, the same cultures would not fare as well if they were expected to co-exist with a culture whose central tenets are profoundly different.


Why am I harping on about the inherent faults in multiculturalism and diversity? It is because I believe we have created the greatest culture mankind has ever seen: a culture that has produced Shakespeare, Mozart, Voltaire, Plato, Aristotle, John Locke, freedom and democracy, television, the iPhone, the movies, free-market capitalism, Van Gogh, Da Vinci, Einstein, Newton, Mary Shelley, the Brontë sisters, and more. And I believe it is a culture worth protecting. And how do we protect it? We start by protecting the very things that have made the West so great in the first place: Christianity, an adherence to truth and a deep esteem for the logos, the supremacy placed on individual rights and liberties, the free marketplace of ideas and commerce, small government, and political freedom.

Moral and cultural relativism is being used to tear down and replace the existing social order. When the Mayor of London, Sadiq Khan, is able to state that “terror attacks are part and parcel of living in a big city”, and young German women are able to hold signs proudly proclaiming “will trade racists for rapists” unopposed, it is clearly time for certain ideas to go away.

THE DEATH OF GOD


This week for our theological article, we will be examining Friedrich Nietzsche’s (1844 – 1900) infamous statement, “God is dead.”

Friedrich Wilhelm Nietzsche (pronounced ‘knee-cha’) was born in Röcken, near Leipzig, on October 15th, 1844. His father, Karl Ludwig Nietzsche (1813 – 1849), was a Lutheran pastor and former teacher, and his mother was Franziska Oehler (1826 – 1897). The Nietzsche family quickly grew to include a daughter, Elisabeth (1846 – 1935), and another son, Ludwig Joseph (1848 – 1850). Unfortunately, the family would be beset by tragedy. In 1849, when Nietzsche was five years old, Karl Nietzsche suffered a devastating brain haemorrhage and died. Then, as if to rub salt in the family’s wounds, the infant Ludwig Joseph died unexpectedly shortly after.

Nietzsche was educated at the prestigious Schulpforta school near Naumburg. There he received an education in theology, classical languages, and the humanities. After graduating, the young Nietzsche attended the University of Bonn before moving to the University of Leipzig. During his time there, Nietzsche became acquainted with the philosophy of Arthur Schopenhauer (1788 – 1860), whose work, the World as Will and Representation (1818), would have a tremendous influence on him. Then, aged only twenty-four, Nietzsche was awarded the position of professor of Greek language and literature at the University of Basel in Switzerland. He had never written a doctoral dissertation.

Nietzsche left academia briefly to serve as a medical orderly in the Franco-Prussian War (1870 – 1871), but was discharged due to poor health. He returned to Basel, where he became acquainted with the cultural historian, Jacob Burckhardt (1818 – 1897), and the composer, Richard Wagner (1813 – 1883). Wagner’s influence on Nietzsche can most readily be seen in the Birth of Tragedy.

During the late 1870s, Nietzsche became increasingly beset with debilitating health problems: digestive problems, poor eyesight, and migraines. He was forced to spend months off work, and eventually agreed to retire with a modest pension. Nietzsche was only thirty-four years old.

From there, Nietzsche devoted the rest of his life to the study and writing of philosophy. Between 1870 and 1889, Nietzsche wrote nineteen books, including: The Birth of Tragedy (1872), Philosophy in the Tragic Age of the Greeks (1873), Human, All Too Human (1878), the Gay Science (1882), Thus Spake Zarathustra (1883), Beyond Good and Evil (1886), On the Genealogy of Morals (1887), Twilight of the Idols (1888), Ecce Homo (1888), and the Will to Power (published posthumously in 1901 from his unpublished manuscripts by his sister, Elisabeth).

In 1889, in Turin, Italy, Nietzsche suffered a mental breakdown after seeing a horse being flogged in the Piazza Carlo Alberto. In the following days, Nietzsche sent a series of ‘madness letters’ to Cosima Wagner (1837 – 1930) and Jacob Burckhardt in which he signed his name ‘Dionysos’, claimed to be ‘the crucified one’, and asserted that he was the creator of the world. It was quickly agreed that Nietzsche should be brought back to Basel, where he was briefly institutionalised before being moved to a clinic in Jena.

In 1890, Nietzsche’s mother, Franziska, brought him home to Naumburg where she looked after him until her death in 1897. From there, Nietzsche was cared for by his sister, Elisabeth, in Weimar. He died on August 25th, 1900 at the age of fifty-five.


The statement “God is dead” is Nietzsche’s most memorable and provocative. (Of course, he wasn’t the first to coin the phrase. That was Heinrich Heine (1797 – 1856). Nietzsche merely philosophised it.) It first appeared in the Gay Science, in a fable entitled the Parable of the Madman. In the parable, the madman asks, ‘where is God?’, only to be informed that God has been killed by man:

“God is dead. God remains dead. And we have killed him. How shall we, murderer of all murderers, console ourselves? That which was holiest and mightiest of all that the world has yet possessed has bled to death under our knives. Who will wipe the blood off us? With what water could we purify ourselves?”

Of course, Nietzsche wasn’t talking about the literal death of God (he was, after all, an atheist). Instead, he was referring to the death of the concept or idea of God. The statement was meant as a reference to the decline of traditional and metaphysical doctrines that had dominated European thought and culture for centuries.

Nietzsche observed, correctly, that Western morality was predicated on the presumption of the truth of Judeo-Christian values. Christianity had become infused in European culture and thought. Philosophers and scientists like Copernicus (1473 – 1543), René Descartes (1596 – 1650), Isaac Newton (1643 – 1727), Saint Thomas Aquinas (1225 – 1274), George Berkeley (1685 – 1753), Saint Augustine (354 – 430 AD), Gottfried Wilhelm Leibniz (1646 – 1716), and more were all deeply influenced by their belief in God. Culturally, Handel’s (1685 – 1759) Messiah, Da Vinci’s (1452 – 1519) the Last Supper, and Michelangelo’s (1475 – 1564) Statue of David are all infused with religious themes.

The decline of Christianity’s supremacy in society began with the Enlightenment. Science replaced scripture. During this time, the belief in a universe governed by God was replaced by governance through the laws of physics, the divine right to rule was replaced with rule by consent, and morality no longer had to emanate from a loving and omniscient God.

The legacy of the Enlightenment, Nietzsche rightly observed, was that Christianity lost its central place in Western culture. (Of course, it can also be argued that Christianity’s central doctrines and tenets have been so absorbed by society that people no longer recognise their influence.) Science, replete with its elaborate depictions of physical reality, ultimately replaced religious truth.


Nietzsche’s assertion is often seen as a triumphal or victorious statement. However, analysis reveals that Nietzsche did not necessarily see the death of God as a good thing. He recognised that as society moved closer to secularisation, the order and meaning religion gave to society would fall by the wayside. People would no longer base their lives on their religious beliefs, but on other factors. Their lives would not be grounded in anything. As Nietzsche wrote in the Twilight of the Idols:

“When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet. This morality is by no means self-evident… Christianity is a system, a whole view of things thought out together. By breaking one main concept out of it, the faith in God, one breaks the whole.”

Nietzsche believed the solution to the problem would be to create our own, individual values. Christian morality (derided by Nietzsche as ‘slave morality’) would be replaced by ‘master morality.’ Human beings would strive to become Übermenschen, or overmen.

The problem with Nietzsche’s suggestion is that it is virtually impossible to keep society ordered when everyone’s values are different. Furthermore, as Carl Jung (1875 – 1961) points out, it is impossible for us to create our own values. Most of us can’t keep our new year’s resolutions, let alone create a value system that will bring order to society.

Nietzsche, along with the Russian novelist, Fyodor Dostoevsky (1821 – 1881), predicted that the 20th Century would be characterised either by apocalyptic nihilism or by equally apocalyptic ideological totalitarianism. In the end, the world experienced both. The wake of the Great War (1914 – 1918) saw Europe plagued by communism, fascism, Nazism, and quasi-religious nationalism. In Russia, communism, through which a person’s value was derived from his labour, arose under the Bolsheviks. In Italy, fascism, through which a person’s value was derived from his nationality, arose under Benito Mussolini (1883 – 1945). In Germany, Nazism, through which a person’s value was derived from his race, arose under Adolf Hitler (1889 – 1945). All of these systems attempted to give people’s lives meaning by replacing God with the state.

In the end, the 20th Century would be the deadliest and most destructive in human history. The legacy of two world wars, nuclear weapons, communism, and fascism has been millions of painful and unnecessary deaths. This is what we get when we remove God from society: needless pain and suffering.

ON WAR


The evolutionary biologist E.O. Wilson referred to war as “humanity’s hereditary curse.” It has become infused in our collective and individual psyches. The Iliad tells the story of the Trojan War, Shakespeare’s Henry V centres on the Battle of Agincourt, and All Quiet on the Western Front tells of the experiences of young German soldiers on the Western Front.

The purposes of war can be split into two fields: the philosophical and the pragmatic. Most modern wars are fought for ideological, and therefore philosophical, reasons: capitalism versus communism, fascism versus democracy, and so forth. Richard Ned Lebow, a political scientist at the University of London, has hypothesised that nations go to war for reasons of ‘national spirit.’ Institutions and nation-states may not have psyches per se, but the individuals who run them do, and it is natural for these individuals to project the contents of their psyches onto the institutions and nation-states they are entrusted with.

Rationalists, on the other hand, take another perspective. War, they argue, is primarily used by nations to increase their wealth and power: allowing them to annex new territories, take control of vital resources, pillage, rape, and so forth. Bolshevism arose amid the political instability and food shortages of World War One Russia. The Nazis used the spectre of Germany’s humiliating defeat in the Great War and its treatment under the Treaty of Versailles as a stepping stone to political power. In the Ancient World, Sargon of Akkad (2334 – 2279 BC) used war to forge the Akkadian Empire, and then used war to quell invasions and rebellion. Similarly, Philip II of Macedonia (382 – 336 BC) used war to unify the city-states of Ancient Greece.

Another explanation may be that we engage in war because we are naturally inclined to. War speaks to our need for group identity and to our deep predilection for conflict, and it should come as no surprise that the two are not mutually exclusive. Our strong attachment to our own group not only makes us more willing to help other members of that group, it makes us more willing to commit evil on its behalf. Chimpanzees have been known to invade neighbouring groups and go on killing sprees, the obvious intention being to increase territory and decrease intra-sexual competition. Similarly, our own evolutionary and primitive past is fraught with violence and conflict. It should not escape our attention that history abounds with examples of invading soldiers slaughtering men and raping women.

Like all the profound aspects of culture, war conceptualises a facet of a deeper truth. It has been central to our history and culture, capturing both the more heroic and the more frightening aspects of our individual and collective psyches. We both influence and are influenced by war.