I would like to begin this essay by reciting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):
Milton! thou shouldst be living at this hour:
England hath need of thee: she is a fen
Of stagnant waters: altar, sword, and pen,
Fireside, the heroic wealth of hall and bower,
Have forfeited their ancient English dower
Of inward happiness. We are selfish men;
Oh! raise us up, return to us again;
And give us manners, virtue, freedom, power.
Thy soul was like a star, and dwelt apart:
Thou hadst a voice whose sound was like the sea:
Pure as the naked heavens, majestic, free,
So didst thou travel on life’s common way,
In cheerful godliness; and yet thy heart
The lowliest duties on herself did lay.
The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the poet and civil servant immortalised as the author of Paradise Lost.
Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s time, or do even today. It is these themes of honour, duty, and nobility that provide the spiritual basis for constitutional monarchy.
It is a subject I will return to much later in this essay. To begin in earnest, however, it would be more prudent to examine those aspects of English history that allowed both constitutional monarchy and English liberty to be born.
The English monarchy has existed for over eleven-hundred years. Stretching from King Alfred the Great in the 9th century to Elizabeth II in the 21st, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. They have gifted these ideals to much of the Western World through the export of their culture to their former colonies.
It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist Peter Hitchens (1951 – ) noted in his book, The Abolition of Britain, that abhorrence for one’s own country was a unique trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”
However, these intellectuals fail to notice, in their arrogance, that “God save the King” is actually a celebration of constitutional monarchy and not symbolic reverence to an archaic and rather powerless royal family. It is intended to celebrate the nation as embodied in the form of a single person or family and the fact that the common man and woman can live in freedom because there are constitutional restraints placed on the monarch’s power.
If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.
At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding jurisdiction over all forms of authority: political, social, economic, and so forth.
To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, formulated by the Catholic Bishop Jacques Bossuet (1627 – 1704) in his book, Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey the monarch without question. As such, no limitations could be placed on a monarch’s power.
What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God, since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of Parliament and involving himself in cases being tried before the courts.
When King James died in 1625, he was succeeded by his second-eldest son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between the monarch and Parliament, from whom he had to seek approval to raise funds.
In 1629, Charles outraged much of the population, as well as many nobles, when he elected to fund his rule through outdated taxes and fines and stopped calling Parliament altogether. Charles had been frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money. The ensuing period would become known as the Eleven Years’ Tyranny.
By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. Its members asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without their consent. In addition, Parliament had the King’s friend and advisor, Thomas Wentworth (1593 – 1641), 1st Earl of Strafford, executed for treason.
The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a descendant of the family of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (1485 – 1550). In the end, the conflict would decimate the English population and forever alter England’s political character.
The English Civil War began in January 1642 when King Charles marched on Parliament with a force of around four hundred armed men in a failed attempt to arrest five of its members. Rebuffed, he eventually withdrew to Oxford. Trouble was brewing. Throughout the summer, people aligned themselves with either the monarchists or the Parliamentarians.
The forces of King Charles and the forces of Parliament would meet at the Battle of Edgehill in October. What followed was several years of bitter and bloody conflict.
Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic or “commonwealth.” The English Civil War had claimed the lives of two-hundred-thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of Parliament.
The powers of parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular. The Commonwealth created in the wake of the Civil War had collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.
The exiled Prince Charles returned from France and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the ideas of the Enlightenment, which challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.
By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety percent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.
It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics could not become members of Parliament, justices of the peace, or officers in the army, or hold any other public office, unless they were granted a special dispensation by the King.
It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial considering Great Britain was a Protestant country) and for ignoring Parliament.
However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic, Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.
The English people grew alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certain to succeed him on the throne. So incensed was Parliament at the prospect of a Catholic on the throne that it attempted to pass the Crown on to one of Charles’ Anglican relatives.
Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance ever since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earned the moniker “Bloody Mary” for burning Protestants at the stake. During the reign of King James, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous “gunpowder plot.”
Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about the end of religious persecution, were met with considerable objection from both Bishops and Europe’s Protestant monarchs.
Following the passing of the Test Act in 1673, James had briefly been forced to relinquish his offices. The Act required holders of public office and members of the nobility to receive Holy Communion according to the rites of the Church of England. It was designed to prevent Catholics from taking public office.
Now, as King, James attempted to circumvent and ultimately repeal the Test Act by placing Catholics in positions of power. His Court featured many Catholics, and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.
Nor was that the limit of James’ activities. He issued two Declarations of Indulgence, ordered them read from every pulpit on two successive Sundays, and had the seven Bishops who petitioned against the order tried for seditious libel. Additionally, he made sweeping changes to the Church of England and built an army composed mainly of Catholics.
The people permitted James II to rule as long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot. The child, the theory went, was not the King’s son but rather a substitute who had been smuggled into the birthing chamber in a bed-warming pan.
In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.
James dissolved Parliament in July 1688. Around this time, a Bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702), inviting them to raise an army, invade England, and seize the throne. They accepted.
William landed in Devon on Guy Fawkes’ Day accompanied by an army of fifteen-thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.
Things were not looking good for James, however. Large parts of his officer corps were defecting to the enemy and taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates started declaring for William. And James’ own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels. James, abandoned by everyone, fled into exile in France. He would die there twelve years later.
On January 22nd, 1689, William called the first ‘Convention Parliament.’ At this convention, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. And second, it was declared against public policy for the throne to be occupied by a Catholic. As such, the throne passed over James Francis Edward Stuart, and William and Mary were invited to take the Crown as co-monarchs.
They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.
Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.
The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, in effect elected monarchs. It established the concept of Parliamentary sovereignty, granting that political body the power to make or unmake any law it chose. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.
The British philosopher Roger Scruton (1944 – ) described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.
Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).
In The Need for Roots, the French philosopher Simone Weil (1909 – 1943) wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed the British to retain their tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs when they abolished their monarchies.
Great Britain’s great legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.
The word “Commonwealth” was adopted at the Sixth Imperial Conference, held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by the former British Prime Minister Arthur Balfour (1848 – 1930), was tasked with looking into future constitutional arrangements within the Commonwealth. Its report declared:
“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”
“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”
Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.
The Statute of Westminster gave legal recognition to the de-facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand and South Africa were granted legal independence.
Furthermore, the law ended the application of the Colonial Laws Validity Act 1865 to the Dominions. That Act had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to the Act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”
The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.
More than fifty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent from Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.
What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:
“Her Majesty’s Representative in each State shall be the Governor.
Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.
Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.
While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.
The advice of Her Majesty in relation to the exercise of powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”
These two laws expose an important misconception that is often exploited by Australian republicans: the myth that Australia does not have legal and political independence because its Head of State is the British monarch. In reality, the passage of the Statute of Westminster in 1931 and the Australia Act in 1986 effectively ended any real political or legal power the British government had over Australia.
In Australia, the monarch (who is our Head of State by law) is represented by a Governor-General. This individual – a position held by Australians since 1965 – is required to take an oath of allegiance and an oath of office administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure, with appointments typically lasting five years.
The monarch issues letters patent to appoint the Governor-General based on the advice of Australian ministers. Prior to 1926, Governors-General were appointed on the advice of both the British government and the Australian government. This is because the Governor-General at that time represented both the monarch and the British government. This arrangement changed, however, at the Imperial Conferences of 1926 and 1930. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown only.
The Governor-General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, such an arrangement would have the effect of politicising, and thereby corrupting, the Head of State. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to those donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that they can avoid the snares that trap other political actors.
The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003) – Deputy Prime Minister, Treasurer, and intellectual leader of the political left – and Rex Connor (1907 – 1977) out of the Cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 24th federal parliament was dissolved.
The Loans Affair began when Rex Connor attempted to borrow money, up to US$4 billion, to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state elements which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek a loan without the Council’s approval.
When news of the Loans Affair was leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and claiming that the Loans Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.
Despite this, Connor continued to liaise with the Pakistani financial broker, Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by the Herald journalist Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan with an interest rate of 7.7% and a 2.5% commission for Khemlani. The claim threw serious doubt on Dr. Jim Cairns’ assertion that the government had not offered Khemlani a commission on a loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.
Unfortunately, Khemlani had stalled on the loan, most notably when he was asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loan activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.
The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of the Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.
And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him as proof that he and Connor had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no other choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).
By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam budget on October 16th. Whitlam warned Fraser that the Liberal party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medicare rebates, the budgets of government departments and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”
On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, the Treasurer, Bill Hayden (1933 – ), reintroduced the budget bills and warned that further delay would increase unemployment and deepen a recession that had blighted the western world since 1973.
The crisis would come to a head on Remembrance Day 1975. Whitlam had asserted for weeks that the Senate could not force him into an election, claiming that the House of Representatives had an independence and an authority separate from the Senate.
Whitlam had decided that he would end the stalemate by seeking a half-Senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991), had been seeking legal advice from the Chief Justice of the High Court on how he could use his constitutional powers to end the deadlock. Kerr had concluded that should Whitlam refuse to call a general election, he would have no alternative but to dismiss him.
And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.
Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills, and Fraser was drafting documents that would dissolve the Parliament.
At 2pm, Deputy Prime Minister Frank Crean (1916 – 2008) defended the government against a censure motion moved by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.
At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he moved that the “House expresses its want of confidence in the Prime Minister and requests Mr. Speaker forthwith to advise His Excellency the Governor-General to call the Member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.
The speaker, Gordon Scholes (1931 – 2018) expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam was not supposed to win. Scholes was unable to arrange an appointment until quarter-to-five in the afternoon.
Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. By ten-to-four, Fraser left for Government House. Ten minutes later, Sir John Kerr had signed the proclamation dissolving both Houses of Parliament and set the date of the upcoming election for December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.
The combination of economic strife, political scandal, and Whitlam’s dismissal signed the death warrant for Whitlam’s government. At the 1975 Federal Election, the Liberal-National coalition won by a landslide, winning ninety-one seats and a popular vote of 4,102,078. In the final analysis, it seems the Australian people agreed with Kerr’s decision and voted to remove Whitlam’s failed government from power once and for all.
Most of the arguments levelled against constitutional monarchies can be described as petty, childish, and ignorant. The biggest faux pas made by those who oppose constitutional monarchies is a failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to oppose constitutional monarchy. It would be as if I argued for the abolition of the office of Prime Minister simply because I disliked the person holding it.
One accusation frequently levelled against the monarchy is that they are an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year. And it is certainly true that the German Presidency costs only £26 million every year. However, it is not true that all monarchies are necessarily more expensive than Presidencies. The Spanish monarchy costs only £8 million per year, less than the Presidencies of Germany, Finland, and Portugal.
Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”
History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.
The Referendum was held on November 6th, 1999. Around 99.14%, or 11,683,811 people, of the Australian voting public participated. 45.13%, or 5,273,024 voted yes. However, 54.87%, or 6,410,787 voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.
All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.
Margit Tavits of Washington University in St. Louis once observed that: “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.
Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but unlike a President, that is generally where their power ends. It is rather ironic, then, that the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted that presidents are more willing than monarchs to allow governments to change without democratic input, such as elections. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.
The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. It is an effective means of bridging the past and the future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and possibly survive to see the coronation of King George VII (though he will certainly outlive me).
It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things also have a positive effect on the economy. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies have a positive impact on economies and living standards over the long term. The study, which examined data from one-hundred-and-thirty-seven countries, including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government would not interfere with their property in constitutional monarchies than in republics. As a consequence, they are more willing to invest in their respective economies.
When Wordsworth wrote his ode to Milton, he was mourning the loss of the chivalry he felt had once pervaded English society. Today, the West is once again in serious danger of losing the two things that give it a connection to the chivalry of the past: a belief in God and a submission to a higher authority.
Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.
It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.
The Christian apologist and novelist, C.S. Lewis (1898 – 1963) once argued that most people do not deserve a share in governing a hen-roost, much less a nation:
“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”
The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384BC – 322BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because there is no one person who is perfect enough to be a master.
Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth that inevitably lead to tyranny). Legal and economic equality, however, does not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.
Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.
Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.
The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”
Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.
It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy. It meant that the monarch could not rule without the consent of parliament, established parliament as the supreme source of law, and allowed parliament to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchies enrich our souls because they awaken in us the need for both freedom and obedience.
Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.
The rise to power of Nationals leader and Deputy Prime Minister, Barnaby Joyce has come to a dramatic halt as news of his marital infidelity dominates the headlines.
The political fallout has been immense, but predictable. On Thursday, the Senate passed a motion calling for Joyce to relinquish his post as Deputy Prime Minister. Greens leader, Richard Di Natale called on Joyce to resign and even demanded that the Nationals fire him if he refuses.
The Prime Minister, who commented that Joyce had made a “shocking error of judgement”, responded to the scandal by changing the ministerial code of conduct to prevent Federal Ministers from having sexual relations with members of their staff.
Joyce’s shocking lack of moral fibre has jeopardised any real political power conservatives in Australia have, and has threatened the delicate balance of power between the right-wing and left-wing factions of the coalition Government.
Following the usurpation of the conservative Prime Minister, Tony Abbott, by Malcolm Turnbull – a prominent voice of the left-wing faction of the Liberal Party – many on the right hoped that a Joyce-led Nationals would be able to counteract the centre-left leaning Liberal Party with their brand of traditionalism.
Naturally, Barnaby Joyce’s marital infidelity and dishonesty puts the trustworthiness of politicians in question.
A large part of the fury over Joyce’s affair is not the sexual infidelity, but the fact that he dipped into the public purse to finance the charade. As the political scientist and commentator, Jennifer Oriel stated in her article, “Barnaby Joyce’s Greatest Sin is Being Conservative”, the combination of corruption and marital infidelity violates the most basic codes of common decency.
Barnaby Joyce’s behaviour is precisely the reason Australians are cynical about politicians.
The idea that people ought to be cynical about politicians is hardly news to anyone with any real knowledge of history, politics, or human nature.
The reason countries like Australia place so many checks and balances – separation of powers, the Constitution, an independent judiciary – on those in power is that power tends to have a corrupting effect on the human soul.
As Lord Acton famously put it: “Power tends to corrupt, and absolute power corrupts absolutely.”
The greatest measure against tyranny is the establishment of a political and legal system that places restrictions on power. We should be thankful that Barnaby Joyce’s biggest transgression was marital infidelity, and not something far worse.
Everyone versed in culture and politics understands the truth in Percy Bysshe Shelley’s (1792 – 1822) argument that creators of culture are the “unacknowledged legislators of the world.” Our view of the world is derived from our religious beliefs, the stories we read as children, the movies we watched, the cultural customs we become accustomed to, and so forth. It is not that culture constructs the physical edifices of civilisation per se, but that culture forms the values and philosophies upon which civilisation is founded.
In the West, the prevailing cultural narrative champions wholesome virtues: kindness, compassion, love, fair play, and so forth, as being the only way to achieve prosperity and success. The individual must avoid combat with others, and be polite, civil, pleasant, and diplomatic to all. To be seen using aggression or wanting power leads to social isolation. This has certainly been the message of our culture. In Shakespeare’s Richard III, the title character is a corrupt, twisted, and Machiavellian prince who schemes his way into power. By contrast, the future Henry VII is seen to be fair and humane. By the end of the play, Richard dies hated even by members of his own family, whereas Henry is celebrated as a noble hero.
This worldview bears little resemblance to reality:
“The manner in which we live, and that in which we ought to live, are things so wide asunder, that he who quits the one to betake himself with the other is more likely to destroy than to save himself; since anyone who would act up to a perfect standard of goodness in everything, must be ruined among so many who are not good. It is essential for a prince who wishes to maintain his position, to have learned how to be other than good, and to use or not to use his goodness as necessity requires.” (Niccolo Machiavelli, The Prince, 1532, Chapter 15, page 114)
Bubbling just below the surface are the real, amoral virtues which foster prosperity and success. In Beyond Good and Evil (1886), Friedrich Nietzsche (1844 – 1900) puts forth the following proposition:
“Suppose nothing is given as ‘real’ except our world of desires and passions, and we could not get down, or up, to any other ‘reality’ besides the reality of our drives.” (Beyond Good and Evil, page 59).
Maybe we aren’t as driven by morality and Godliness as we like to think we are. Maybe we are driven by lust for power, material wealth, and sex. (This, of course, brings forth the possibility that the purpose of wholesomeness is to temper our real desires).
Even though we loathe having to admit it, all of us want power. Power gives us greater control and makes us feel more secure. But since it is socially unacceptable to be seen wanting power we are forced to rely on subtlety. We are forced to become honest on the one hand, and duplicitous on the other, congenial yet cunning, democratic yet devious.
In chapter twenty-one of The Prince, Machiavelli (1469 – 1527) wrote: “Nothing makes a prince so well thought of as to undertake great enterprises and give striking proofs of his capacity.” Our civilisation was built by ambitious and power-hungry individuals, not by the wholesome virtues presented to us.
Shorter men who attempt to assert or defend themselves are frequently met with the harrowing accusation that they are suffering from ‘Napoleon complex’, otherwise known as ‘short man syndrome.’ While there is some evidence – based both on research and common experience – that this may be the case, the root causes of the issue reveal a problem that is more complex and entrenched than the general public would like to believe.
The term ‘Napoleon complex’ was first coined by Alfred Adler (1870 – 1937) in 1912. Remarkably, however, Napoleon Bonaparte (1769 – 1821), the man for whom the complex is named, was not actually short. Napoleon’s personal physician, Francesco Antommarchi (1780 – 1838), recorded the deposed Emperor’s height as being five pieds, two pouces, or five-feet, six-and-a-half inches. This was a half-inch taller than the average Englishman of the time, and a full two inches taller than the average Frenchman. The myth of Napoleon’s short stature comes from two places. First is the fact that Napoleon frequently surrounded himself with men taller than himself. Height requirements specified that the Grenadiers in the Elite Imperial Guard be 5’10 or over, whilst members of the Mounted Chasseurs had to be 5’7. To any casual observer, Napoleon would have looked noticeably smaller by comparison. And second, there is the anti-Napoleonic propaganda that frequently depicted the Emperor as small.
Like many physical characteristics, height can have a profound effect on a person’s self-perception. The shorter man’s poor self-perception begins in childhood, when smaller children are often the targets of taunts and ridicule. As adults, shorter men are more likely to be overly aggressive, domineering, and have an increased proclivity for resorting to extreme measures in order to prove themselves. Unfortunately, research shows that shorter men may, in extreme cases, resort to violence as a means of disguising their insecurities. A study published in the journal Injury Prevention found that men who struggled with their height and masculinity were three times more likely to commit violent assaults using a weapon. This study, which involved six-hundred American men aged between eighteen and fifty, asked participants to answer two sets of questions. The first asked about their self-image, drug use, and violent behaviour. The second set of questions asked the participants about their beliefs on gender roles, how they felt women and their friends perceived them, how they perceived their own masculinity, and how much they’d like to be a “macho man.”
Taller men are far more likely to succeed in positions of authority and power than shorter men. An early study of height and occupation reveals bishops to be taller than parish priests, sales managers to be taller than salesmen, and university presidents to be taller than the presidents of more modest higher-education facilities. In US Presidential elections, it is typically the taller of the two Presidential candidates that ends up winning: John F. Kennedy (1917 – 1963) was six-feet tall compared to Richard Nixon (1913 – 1994) who was five-foot-eleven, Ronald Reagan (1911 – 2004) was six-foot-one compared to Jimmy Carter (1924 – ) who was five-foot-ten, and Barack Obama (1961 – ) was six-foot-one compared to John McCain (1936 – ), who was five-foot-nine.
And, as if that isn’t bad enough, merely finding employment can be a struggle for many shorter men. A 2001 study by Nicolo Persico, Andrew Postlewaite, and Dan Silverman of the University of Pennsylvania found that shorter teenagers had a harder time finding employment than their taller counterparts. Persico, Postlewaite, and Silverman chalked this up to the attitudes and worldview of the shorter teenager. “Those who were relatively short when young”, they explained, “were less likely to participate in social activities associated with the accumulation of productive skills and attributes, and report lower self-esteem.”
Things don’t get much better once they are employed, either. Shorter men are less likely to be afforded promotions and pay-rises than their taller peers. A study by Leland Deck of the University of Pittsburgh found that men who are 6’2 or taller earn 12.4% more than men who are below six feet.
Then there is the challenge of forming intimate relationships. Men are considered attractive when they are tall, broad-shouldered, and well-toned. An analysis of personal ads found that most women prefer dating men who are six-foot-tall and over, especially when it comes to casual sex. A study published in the March 2016 edition of Personality and Individual Differences journal found that while women did not particularly care about hair, weight, or penis size, they did care about a man’s height. It is believed that the primary reason for this preference is that height is a sign of high testosterone – and men with higher testosterone tend to be better protectors and lovers.
There is plenty of evidence to suggest that height is a source of great insecurity for many men. The shorter man’s sense of insecurity and resentment is almost certainly borne out of poor experiences associated with their stature. Smaller children are more likely to be the victims of taunts and ridicule. As adults, shorter men find it more difficult to form intimate relationships, find employment, and achieve positions of authority and status. Perhaps people ought to remember that Napoleon Complex is more complicated and entrenched than they like to believe.
Is there any other time in history more maligned than the Middle Ages? Our modern conception of the so-called “dark ages” is that it was a time characterised by superstition, barbarity, oppression, and ignorance, with a few outbreaks of plague just to make things interesting.
This view has been helped by numerous so-called educational resources. The BBC’s Bitesize website, for example, takes a leaf from certain 19th-century British historians, the type who saw Catholics as ignorant and childish, and caricatures Medieval peasants as “extremely superstitious” individuals who were “encouraged to rely on prayers to the saints and superstition” for guidance through life. It even accuses the Catholic Church of stagnating human thought and impeding technological development.
This does not represent the view, however, of many serious historians and academics. As the historian of science Professor Ronald Numbers explains:
“Notions such as: ‘the rise of Christianity killed off ancient science’, ‘the medieval Christian Church suppressed the growth of the natural sciences’, ‘the medieval Christians thought that the world was flat’, and ‘the Church prohibited autopsies and dissections during the Middle Ages’ [are] examples of widely popular myths that still pass as historical truth, even though they are not supported by historical research.”
In reality, the Middle Ages saw advances in law, politics, the sciences, theology, philosophy, and more. It saw the birth of the chartered town which ushered in the tradition of local self-governance. The existence of a strong papacy laid the foundations of limited political power as it prevented monarchs, who justified their power through their so-called “unique” relationship with God and the Church, from monopolising power. This symbolic limitation on monarchical power was manifested in the Magna Carta (1215) and the birth of the English Parliament.
The people of the Middle Ages produced magnificent Gothic cathedrals and churches. Many medieval monks became patrons of the arts and many were even artists themselves. In literature, the Middle Ages saw Dante’s the Divine Comedy and Geoffrey Chaucer’s Canterbury Tales. In music, the Middle Ages laid the foundation of Western classical music and saw the development of musical notation, western harmony, and many of the Christmas carols we know and love today.
Likewise, the Carolingian Renaissance of the 8th and 9th centuries saw advancements in the study of literature, architecture, jurisprudence, and theology. Medieval scholars and scientists, many of whom were monks and friars, studied natural philosophy, mathematics, engineering, geography, optics, and medicine.
In the spirit of intellectual and spiritual enlightenment, many universities were founded, including Oxford University, Cambridge University, and the University of Cologne. These universities educated their students in law, medicine, theology, and the arts. In addition, the period also saw the foundation of many schools, and many early Christian monasteries were committed to the education of the common people.
The Middle Ages saw advances in science, literature, philosophy, theology, the arts, music, politics, law, and more. Its legacy is all around us: whether it is in the limitations placed on the powers of Governments, the music we listen to, or in the tradition of education many of us have benefited from. In an era of political correctness perhaps we should be wondering whether we’re living in the “dark ages.”
This week for our theological article, we will be examining Friedrich Nietzsche’s (1844 – 1900) infamous statement, “God is dead.”
Friedrich Wilhelm Nietzsche (pronounced ‘knee-cha’) was born in Röcken, near Leipzig, on October 15th, 1844. His father, Karl Ludwig Nietzsche (1813 – 1849), was a Lutheran pastor and former teacher, and his mother was Franziska Oehler (1826 – 1897). The Nietzsche family quickly grew to include a daughter, Elisabeth (1846 – 1935), and another son, Ludwig Joseph (1848 – 1850). Unfortunately, the family would be beset by tragedy. In 1849, when Nietzsche was four years old, Karl Nietzsche suffered a devastating brain haemorrhage and died. Then, as if to rub salt in their wounds, the infant Ludwig Joseph died unexpectedly shortly after.
Nietzsche was educated at the prestigious Schulpforta school near Naumburg. There he received an education in theology, classical languages, and the humanities. After graduating, young Nietzsche attended the University of Bonn before moving to the University of Leipzig. During his time there, Nietzsche became acquainted with the philosophy of Arthur Schopenhauer (1788 – 1860), whose work, the World as Will and Representation (1818), would have a tremendous influence on him. Then, aged only twenty-four, Nietzsche was awarded the position of professor of Greek Language and Literature at the University of Basel in Switzerland. He had never written a doctoral dissertation.
Nietzsche left academia briefly to serve as a medical orderly in the Franco-Prussian War (1870 – 1871). He was discharged due to poor health. Nietzsche returned to Basel, where he became acquainted with the cultural historian, Jacob Burckhardt (1818 – 1897), and the composer, Richard Wagner (1813 – 1883). Wagner’s influence on Nietzsche can most readily be seen in the Birth of Tragedy.
During the late 1870s, Nietzsche became increasingly beset with debilitating health problems: digestive problems, poor eyesight, and migraines. He was forced to spend months off work, and eventually agreed to retire with a modest pension. Nietzsche was only thirty-four years old.
From there, Nietzsche devoted the rest of his life to the study and writing of philosophy. Between 1870 and 1889, Nietzsche wrote nineteen books, including: The Birth of Tragedy (1872), Philosophy in the Tragic Age of the Greeks (1873), Human, All Too Human (1878), the Gay Science (1882), Thus Spake Zarathustra (1883), Beyond Good and Evil (1886), On the Genealogy of Morals (1887), Twilight of the Idols (1888), Ecce Homo (1888), and the Will to Power (published posthumously in 1901, compiled from his unpublished manuscripts by his sister, Elisabeth).
In 1889, in Turin, Italy, Nietzsche suffered a mental breakdown after seeing a horse being flogged in the Piazza Carlo Alberto. In the following days, Nietzsche sent a series of ‘madness letters’ to Cosima Wagner (1837 – 1930) and Jacob Burckhardt in which he signed his name ‘Dionysos’, claimed to be ‘the crucified one’, and asserted that he was the creator of the world. It was quickly agreed that Nietzsche should be brought back to Basel, where he was briefly placed in a clinic before being committed to another in Jena.
In 1890, Nietzsche’s mother, Franziska, brought him home to Naumburg where she looked after him until her death in 1897. From there, Nietzsche was cared for by his sister, Elisabeth, in Weimar. He died on August 25th, 1900 at the age of fifty-five.
The statement, “God is dead” is Nietzsche’s most memorable and provocative statement. (Of course, he wasn’t the first to coin the phrase. That was Heinrich Heine (1797 – 1856). Nietzsche merely philosophised it). It first appeared in the Gay Science in a fable entitled the Parable of the Madman. In the parable, the madman asks, ‘where is God?’, only to be informed that God had been killed by man:
“God is dead. God remains dead. And we have killed him. How shall we, murderer of all murderers, console ourselves? That which was holiest and mightiest of all that the world has yet possessed has bled to death under our knives. Who will wipe the blood off us? With what water could we purify ourselves?”
Of course, Nietzsche wasn’t talking about the literal death of God (he was, after all, an atheist). Instead, he was referring to the death of the concept or idea of God. The statement was meant as a reference to the decline of traditional and metaphysical doctrines that had dominated European thought and culture for centuries.
Nietzsche observed, correctly, that western morality was predicated on the presumption of the truth of Judeo-Christian values. Christianity had become infused in European culture and thought. Philosophers and scientists like Copernicus (1473 – 1543), René Descartes (1596 – 1650), Isaac Newton (1643 – 1727), Saint Thomas Aquinas (1225 – 1274), George Berkeley (1685 – 1753), Saint Augustine (354-430AD), Gottfried Wilhelm Leibniz (1646-1716), and more were all deeply influenced by their belief in God. Culturally, Handel’s (1685 – 1759) Messiah, Da Vinci’s (1452 – 1519) the Last Supper, and Michelangelo’s (1475 – 1564) Statue of David are all infused with religious themes.
The decline of Christianity’s supremacy in society began with the Enlightenment. Science replaced scripture. During this time, the belief in a universe governed by God was replaced by governance through the laws of physics, the divine right to rule was replaced with rule by consent, and morality no longer had to emanate from a loving and omniscient God.
The legacy of the Enlightenment, Nietzsche rightly observed, was that Christianity lost its central place in Western culture. (Of course, it can also be argued that Christianity’s central doctrines and tenets have been so absorbed by society people no longer recognise their influence). Science, replete with its elaborate depictions of physical reality, ultimately replaced religious truth.
Nietzsche’s assertion is often seen as a triumphal or victorious statement. However, analysis reveals that Nietzsche did not necessarily see the death of God as a good thing. He recognised that as society moved closer to secularisation, the order and meaning religion gave to society would fall by the wayside. People would no longer base their lives on their religious beliefs, but on other factors. Their lives would not be grounded in anything. As Nietzsche wrote in the Twilight of the Idols:
“When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet. This morality is by no means self-evident… Christianity is a system, a whole view of things thought out together. By breaking one main concept out of it, the faith in God, one breaks the whole.”
Nietzsche believed the solution to the problem would be to create our own, individual values. Christian morality (derided by Nietzsche as ‘slave morality’) would be replaced by ‘master morality.’ Human beings would strive to become Übermenschen, or overmen.
The problem with Nietzsche’s suggestion is that it is virtually impossible to keep society ordered when everyone’s values are different. Furthermore, as Carl Jung (1875 – 1961) points out, it is impossible for us to create our own values. Most of us can’t keep our new year’s resolutions, let alone create a value system that will bring order to society.
Nietzsche, along with the Russian novelist, Fyodor Dostoevsky (1821 – 1881), predicted that the 20th Century would be characterised either by apocalyptic nihilism or equally apocalyptic ideological totalitarianism. In the end, the world experienced both. The wake of the Great War (1914 – 1918) saw Europe plagued by communism, fascism, Nazism, and quasi-religious nationalism. In Russia, communism, through which a person’s value was derived from his labour, arose under the Bolsheviks. In Italy, fascism, through which a person’s value was derived from his nationality, arose under Benito Mussolini (1883 – 1945). In Germany, Nazism, through which a person’s value was derived from his race, arose under Adolf Hitler (1889 – 1945). All of these systems attempted to give people’s lives meaning by replacing God with the state.
In the end, the 20th Century would be the deadliest and most destructive in human history. The legacy of two world wars, nuclear weapons, communism, and fascism has been millions of painful and unnecessary deaths. This is what we get when we remove God from society: needless pain and suffering.
The biologist E.O. Wilson referred to war as “humanity’s hereditary curse.” It has become infused in our collective and individual psyches. The Iliad tells the story of the Trojan War, Shakespeare’s Henry V centres on the Battle of Agincourt, and All Quiet on the Western Front recounts the experiences of young German soldiers in the Great War.
The purpose of war can be split into two fields: philosophical and pragmatic. Most modern wars are fought for ideological, and therefore philosophical, reasons: capitalism versus communism, fascism versus democracy, and so forth. Richard Ned Lebow, a political scientist at the University of London, hypothesised that nations go to war for reasons of ‘national spirit.’ Institutions and nation-states may not have psyches per se, but the individuals who run them do, and it is natural for these individuals to project the contents of their psyches onto the institutions and nation-states they are entrusted with.
Rationalists, on the other hand, have another perspective. War, they argue, is primarily used by nations to increase their wealth and power: allowing them to annex new territories, take control of vital resources, pillage, rape, and so forth. Bolshevism arose amid the political instability and food shortages of World War One Russia. The Nazis used the spectre of Germany’s humiliating defeat in the Great War and its treatment under the Treaty of Versailles as a stepping stone to political power. In the Ancient World, Sargon of Akkad (r. 2334 – 2279 BC) used war to forge the Akkadian Empire, and then to quell invasions and rebellions. Similarly, Philip II of Macedon (382 – 336 BC) used war to unify the city-states of Ancient Greece.
Another explanation may be that we engage in war because we are naturally inclined to. War speaks to our need for group identity, and to our deep predilection for conflict. And it should come as no surprise that the two are not mutually exclusive. Our strong attachment to our own group not only makes us more willing to help other members of that group, it makes us more willing to commit evil on its behalf. Chimpanzees have been known to invade neighbouring troops and go on killing sprees, with the obvious intention of increasing territory and decreasing intra-sexual competition. Similarly, our own evolutionary and primitive past is fraught with violence and conflict. It should not escape our attention that history abounds with examples of invading soldiers slaughtering men and raping women.
Like all the profound aspects of culture, war conceptualises a facet of a deeper truth. It has been central to our history and culture, capturing both the more heroic and the more frightening aspects of our individual and collective psyches. We both influence and are influenced by war.
There is an alarming trend in media today. Type ‘men are useless’, ‘men are worthless’, or ‘society doesn’t need men’ into Google and various articles, mostly by left-wing and pro-feminist news organisations, will come up. These articles have the same basic message: men are, at best, a nuisance in the age of ‘girl power’.
Feminist philosophy is centred around the idea – a conspiracy theory in reality – that men have deliberately conspired to keep women down and take power for themselves. In reality, the differences in male and female achievements have been the result of the differing expectations thrust upon men and women and the different choices they make. As Camille Paglia wrote in her article It’s a Man’s World: “history must be seen clearly and fairly: obstructive traditions arose not from men’s hatred or enslavement of women but from the natural division of labour that had developed over thousands of years during the agrarian period and that once immensely benefited and protected women, permitting them to stay at hearth to care for helpless infants and children.” Civilisations were constructed not to keep women down, but for their benefit. The result of this natural division of labour is that men have dominated many tiers of achievement.
It could, therefore, be argued that much of feminism’s vitriol towards men is derived not from injustice, but from envy over male achievements. Second and third wave feminists have spent a great deal of time vilifying men and turning their shortcomings into symbols of pure evil. They have written a slew of anti-male books designed to erase men’s contribution to civilisation and devalue their achievements. Among the more infamous have been The End of Men by Hanna Rosin, Are Men Necessary? by Maureen Dowd, and The Female Brain, in which author Louann Brizendine tells men they’ll be envious of the female brain. (Just imagine the reaction if an author told women they’d be envious of male brains!)
What these writers fail to understand is that men are the builders and protectors of civilisations. It has always been men, and not women, who have built the larger edifices of civilisation, who have constructed the institutions upon which civilisations are founded, who have been the pioneers in virtually every aspect of human endeavour, and who have taken up arms to protect civilisations (and, as a natural extension, their women) from outside threats.
In philosophy, it is men who have given us Plato’s Republic, Aristotle’s Nicomachean Ethics, Thomas Hobbes’ Leviathan, John Locke’s Second Treatise of Government, and Arthur Schopenhauer’s The World as Will and Idea. In literature, men have given us Homer’s Iliad, Shakespeare, Charles Dickens’ Great Expectations, Fyodor Dostoevsky’s Crime and Punishment, and Leo Tolstoy’s War and Peace. Johannes Gutenberg gave us the printing press, Alexander Graham Bell gave us the telephone, Thomas Alva Edison gave us the lightbulb, and Karl Benz gave us the car. The modern world is an epic of male achievement.
Needless to say, society views men and women differently. Drawing from mountains of data on gender stereotypes, the psychologist Alice Eagly found the existence of a ‘women are wonderful’ sentiment held by both men and women. Women are considered women purely by virtue of their existence. By contrast, manhood has to be earnt. Civilisation and culture set the parameters by which men ‘earn’ their masculinity.
Much of the ‘earnt manhood’ philosophy comes from the different roles men and women have occupied in civilisations. Men have always been expected to build and protect civilisation. Women, on the other hand, have always been valued as creators of life. This is derived from a symbiotic relationship between men and women which existed for civilisation’s benefit. Civilisation was organised so male strengths could offset female weaknesses, and vice-versa.
In reality, men are both better and worse than women, and the way society views its men depends on which men it chooses to focus on. If a society chooses to focus on men who are leaders, entrepreneurs, social reformers, and innovators, it will conclude that men are ‘better than women.’ But if it chooses to focus on men who are homeless, incarcerated, mentally ill, or suffering from intellectual disabilities, it will conclude that ‘women are better than men.’
It is motivation, not ability, that explains the vast differences in achievements between men and women. Men and women are motivated by different incentives to attempt different tasks. Research by Jacquelynne Eccles suggests that the shortage of women in maths and science is not the result of women’s inability to perform well in these fields per se, but a reflection of their different motivational choices. In simpler terms, there are fewer women in the maths and sciences because women are less inclined to study those fields. Similarly, fewer men do housework or change dirty diapers because they are not inclined to do so.
And, of course, the way one chooses to spend one’s time will reap different rewards. This may explain the oft-repeated gender pay-gap myth, in which feminists argue that women are deliberately and systemically paid less than their male colleagues. In fact, economic study after economic study has found that the difference in earnings between men and women is the result of the different lifestyle choices men and women make. Men, on average, are willing to work longer hours and take fewer holidays. (To be fair, women do take significant time off work to raise children.) This explains not only why men earn more money over the course of their working lifetimes, but also why men gain more promotions and climb the ladder of success faster than women.
Society encourages men to attempt high-risk ventures for the benefit of society and gives them big rewards when they manage to pull them off. (Women are not encouraged to take big risks and therefore do not reap big rewards.) It is men who are sent off to die in war, it is men who are given the dirty and dangerous jobs, and it is men who comprise the vast majority of workplace deaths. Women have never been expected to sacrifice themselves in this way and society has never seen fit to reward them in the way it has rewarded men.
It is a well-known fact among economists that men are, on average, more willing to take risks than women. One explanation for this may be the historic differences between the reproductive success of men and women. DNA analysis suggests that today’s population is descended from twice as many women as men. It would be reasonable to assume that this disparity has produced some significant personality differences.
For women, the best strategy was to play it safe, be nice, and go along with the crowd. Sooner or later, a decent man would come along with whom she could have children. It is no wonder, then, that women are not known for exploring uncharted territories or conquering far off lands. As Roy F. Baumeister, social psychologist at the University of Queensland, puts it: “we’re descended from women who played it safe.”
For men, however, the outlook was radically different. The competition between males for available females was a lot tougher. A man can choose to sit at home and play it safe if he wants to, but he probably won’t reproduce. Men, therefore, had to distinguish themselves by becoming risk-takers and innovators. Men who took big risks and managed to pull them off reproduced, men who stayed at home didn’t.
The American psychologist B.F. Skinner once wrote: “Men build society and society builds men.” The differences in male and female achievement are the result of the different expectations civilisation thrusts upon men and women and the different choices they make. Men are expected to ‘earn’ their manhood and are motivated by different things than women. Feminists can ridicule masculinity and male achievements as much as they like, but female achievement is only possible in civilisations that have been modernised and protected by men. And when things go wrong, as they inevitably will, it will be men, and not women, who save the day.
One should also note that it has been the social and technological advances achieved by men that have freed women from lives as homemakers and child-bearers.